Oracle Fusion Middleware – “Concepts of Oracle Coherence”

Hi all, if you are dealing with large applications that perform loads of read and write operations, then each call to the database carries a significant performance cost. For that reason, cache-driven applications are given more and more importance these days. In this tutorial I will be talking about Oracle Coherence.

Note: this article assumes that you have already downloaded Coherence and know how to run a cache server.

These days it is very common to pre-populate the cache before the application starts using the data, and that is exactly what this example does.

Oracle Coherence revolves around the concept of named caches. The first thing you need to do in your code when working with Coherence is to obtain a reference to a named cache you want to work with. In order to do this, you
need to use the CacheFactory class, which exposes the getCache method as one of its public members.
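For example, a minimal sketch (the cache name dist-employee-cache is the one configured later in this tutorial):

import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;

// joins (or starts) the cluster on first use and returns the named cache
NamedCache cache = CacheFactory.getCache("dist-employee-cache");
cache.put(1, "hello");               // stored in the cluster, not just this JVM
System.out.println(cache.get(1));    // prints hello
CacheFactory.shutdown();             // leave the cluster when done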

Below is the project structure.
(screenshot: coh-proj-struct, the project layout)

First of all, let us write the configurations.
Step 1: applicationContext.xml – this file lets Spring instantiate our beans. In it I have declared the DataSource bean.

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">

	<bean id="dataSource"
		class="org.springframework.jdbc.datasource.DriverManagerDataSource">
		<property name="driverClassName" value="oracle.jdbc.driver.OracleDriver" />
		<property name="url" value="jdbc:oracle:thin:@localhost:1521:xe" />
		<property name="username" value="scott" />
		<property name="password" value="tiger" />
	</bean>
</beans>

Let us start configuring our custom cache by creating coherence-cache-config.xml.

<cache-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xmlns="http://xmlns.oracle.com/coherence/coherence-cache-config"
	xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-cache-config http://xmlns.oracle.com/coherence/coherence-cache-config/1.0/coherence-cache-config.xsd">	<!-- The defaults element defines factory-wide default settings. -->

	<defaults>
		<serializer system-property="tangosol.coherence.serializer" />
		<socket-provider system-property="tangosol.coherence.socketprovider" />
	</defaults>

	<caching-scheme-mapping>
		<cache-mapping>
			<cache-name>dist-*</cache-name>
			<scheme-name>example-distributed</scheme-name>
			<init-params>
				<init-param>
					<param-name>back-size-limit</param-name>
					<param-value>8MB</param-value>
				</init-param>
			</init-params>
		</cache-mapping>

		<cache-mapping>
			<cache-name>*</cache-name>
			<scheme-name>example-distributed</scheme-name>
		</cache-mapping>
	</caching-scheme-mapping>

	<caching-schemes>
		<!-- Distributed caching scheme. -->
		<distributed-scheme>
			<scheme-name>example-distributed</scheme-name>
			<service-name>DistributedCache</service-name>
			<serializer>
				<instance>
					<class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
					<init-params>
						<init-param>
							<param-type>String</param-type>
							<param-value>custom-pof-config.xml</param-value>
						</init-param>
					</init-params>
				</instance>
			</serializer>
			<backing-map-scheme>
				<read-write-backing-map-scheme>
					<internal-cache-scheme>
						<local-scheme>
						</local-scheme>
					</internal-cache-scheme>
				</read-write-backing-map-scheme>
			</backing-map-scheme>
			<autostart>true</autostart>
		</distributed-scheme>
	</caching-schemes>
</cache-config>

For every POF bean that we write, we need to add a user-type entry in custom-pof-config.xml, shown below.

<?xml version="1.0"?>
<pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
	xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config http://xmlns.oracle.com/coherence/coherence-pof-config/1.0/coherence-pof-config.xsd">
	<user-type-list>
		<!-- include all "standard" Coherence POF user types -->
		<include>coherence-pof-config.xml</include>
		<user-type>
			<type-id>1000</type-id>
			<class-name>com.spark.coherence.pof.beans.EmployeeBean</class-name>
		</user-type>
	</user-type-list>
</pof-config>

We also need an operational override file, tangosol-coherence-override.xml, that registers our cache configuration file, as below.

<?xml version='1.0'?>
<coherence
	xmlns="http://xmlns.oracle.com/coherence/coherence-operational-config"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-operational-config http://xmlns.oracle.com/coherence/coherence-operational-config/1.2/coherence-operational-config.xsd">
	<cluster-config>
		<!-- <member-identity> <cluster-name>my-coherance-cluster</cluster-name> 
			</member-identity> -->

		<multicast-listener>
			<address>224.12.1.0</address>
			<port>12100</port>
			<time-to-live>60</time-to-live>
		</multicast-listener>
	</cluster-config>

	<configurable-cache-factory-config>
		<init-params>
			<init-param>
				<param-type>java.lang.String</param-type>
				<param-value system-property="tangosol.coherence.cacheconfig">
					coherence-cache-config.xml
				</param-value>
			</init-param>
		</init-params>
	</configurable-cache-factory-config>
</coherence>
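Coherence picks up these files via JVM system properties. A minimal sketch (the property names are standard; the calls just have to happen before the first CacheFactory use):

// equivalent to passing -Dtangosol.coherence.override=... -Dtangosol.coherence.cacheconfig=...
System.setProperty("tangosol.coherence.override", "tangosol-coherence-override.xml");
System.setProperty("tangosol.coherence.cacheconfig", "coherence-cache-config.xml");

The same values are usually passed as -D arguments when starting the cache server.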

Once done with the configuration files, let us write the JDBC connection factory class.

/**
 * 
 */
package com.spark.coherence.jdbc.connection.factory;

import java.sql.Connection;
import java.sql.SQLException;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import javax.sql.DataSource;

import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

/**
 * @author Sony
 *
 */
public class CoherenceJdbcConnectionFactory {

	private static Future<ApplicationContext> future;
	private static ExecutorService executorService = Executors.newFixedThreadPool(1);
	
	static{
		future = executorService.submit(new Callable<ApplicationContext>() {
			@Override
			public ApplicationContext call() throws Exception {
				return new ClassPathXmlApplicationContext("applicationContext.xml");
			}
		});
	}
	
	/**
	 * @return java.sql.Connection
	 * @throws InterruptedException
	 * @throws ExecutionException
	 * @throws BeansException
	 * @throws SQLException
	 */
	public static Connection getJdbcConnection() throws InterruptedException, ExecutionException, BeansException, SQLException{
		ApplicationContext applicationContext = future.get();
		return ((DataSource)applicationContext.getBean("dataSource")).getConnection();
	}
	
	/**
	 * @return javax.sql.DataSource
	 * @throws InterruptedException
	 * @throws ExecutionException
	 * @throws BeansException
	 * @throws SQLException
	 */
	public static DataSource getJdbcDataSource() throws InterruptedException, ExecutionException, BeansException, SQLException{
		ApplicationContext applicationContext = future.get();
		return ((DataSource)applicationContext.getBean("dataSource"));
	}
	
	private static ExecutorService getCurrentExecutorService(){
		return executorService;
	}
	
	public static void shutDownCurrentExecutorService(){
		getCurrentExecutorService().shutdown();
	}
}
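Usage is straightforward (a sketch; the checked exceptions must be declared or caught by the caller):

DataSource dataSource = CoherenceJdbcConnectionFactory.getJdbcDataSource();
// ... hand the DataSource to a DAO and run queries ...
CoherenceJdbcConnectionFactory.shutDownCurrentExecutorService();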

Now let us create the POJO classes. Two classes are explained here: Entity.java and EmployeeBean.java. Entity.java carries the key for the EmployeeBean objects.

Entity.java

/**
 * 
 */
package com.spark.coherence.pof.beans;

import java.io.IOException;

import com.tangosol.io.pof.PofReader;
import com.tangosol.io.pof.PofWriter;
import com.tangosol.io.pof.PortableObject;

/**
 * @author Sony
 * 
 */
public class Entity<T> implements PortableObject{

	public T key;

	/**
	 * @return the key
	 */
	public T getKey() {
		return key;
	}

	/**
	 * @param key
	 *            the key to set
	 */
	public void setKey(T key) {
		this.key = key;
	}

	@SuppressWarnings("unchecked")
	@Override
	public void readExternal(PofReader pofReader) throws IOException {
		this.key = (T) pofReader.readObject(0);
	}

	@Override
	public void writeExternal(PofWriter pofWriter) throws IOException {
		pofWriter.writeObject(0, key);
	}

}

EmployeeBean.java
/**
 * 
 */
package com.spark.coherence.pof.beans;

import java.io.IOException;
import java.util.Date;

import com.tangosol.io.pof.PofReader;
import com.tangosol.io.pof.PofWriter;

/**
 * @author Sony
 * 
 */
public class EmployeeBean extends Entity<Integer> {

	private int employeeId;
	private String employeeName;
	private String job;
	private int managerId;
	private Date hireDate;
	private int salary;
	private int commission;
	private int deptNo;

	public EmployeeBean() {
		// public no-arg constructor required for POF deserialization
	}

	public EmployeeBean(int employeeId) {
		super.setKey(employeeId);
		this.employeeId = employeeId;
	}

	/**
	 * @return the employeeId
	 */
	public int getEmployeeId() {
		return employeeId;
	}

	/**
	 * @param employeeId
	 *            the employeeId to set
	 */
	public void setEmployeeId(int employeeId) {
		this.employeeId = employeeId;
	}

	/**
	 * @return the employeeName
	 */
	public String getEmployeeName() {
		return employeeName;
	}

	/**
	 * @param employeeName
	 *            the employeeName to set
	 */
	public void setEmployeeName(String employeeName) {
		this.employeeName = employeeName;
	}

	public String getJob() {
		return job;
	}

	public void setJob(String job) {
		this.job = job;
	}

	public int getManagerId() {
		return managerId;
	}

	public void setManagerId(int managerId) {
		this.managerId = managerId;
	}

	public Date getHireDate() {
		return hireDate;
	}

	public void setHireDate(Date hireDate) {
		this.hireDate = hireDate;
	}

	public int getSalary() {
		return salary;
	}

	public void setSalary(int salary) {
		this.salary = salary;
	}

	public int getCommission() {
		return commission;
	}

	public void setCommission(int commission) {
		this.commission = commission;
	}

	public int getDeptNo() {
		return deptNo;
	}

	public void setDeptNo(int deptNo) {
		this.deptNo = deptNo;
	}

	@Override
	public void readExternal(PofReader pofReader) throws IOException {
		this.employeeId = pofReader.readInt(0);
		this.employeeName = pofReader.readString(1);
		this.job = pofReader.readString(2);
		this.managerId = pofReader.readInt(3);
		this.hireDate = pofReader.readDate(4);
		this.salary = pofReader.readInt(5);
		this.commission = pofReader.readInt(6);
		this.deptNo = pofReader.readInt(7);
		// restore the inherited key; Entity's readExternal is fully overridden here
		setKey(this.employeeId);
	}

	@Override
	public void writeExternal(PofWriter pofWriter) throws IOException {
		pofWriter.writeInt(0, employeeId);
		pofWriter.writeString(1, employeeName);
		pofWriter.writeString(2, job);
		pofWriter.writeInt(3, managerId);
		pofWriter.writeDate(4, hireDate);
		pofWriter.writeInt(5, salary);
		pofWriter.writeInt(6, commission);
		pofWriter.writeInt(7, deptNo);
	}

}
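Because readExternal and writeExternal must agree on the same POF indices, it is worth sanity-checking the round trip outside the cluster. A minimal sketch, assuming custom-pof-config.xml is on the classpath:

import com.tangosol.io.pof.ConfigurablePofContext;
import com.tangosol.util.Binary;
import com.tangosol.util.ExternalizableHelper;

ConfigurablePofContext pofContext = new ConfigurablePofContext("custom-pof-config.xml");
EmployeeBean in = new EmployeeBean(7782);
in.setEmployeeName("CLARK");

Binary binary = ExternalizableHelper.toBinary(in, pofContext);    // POF-serialize
EmployeeBean out = (EmployeeBean) ExternalizableHelper.fromBinary(binary, pofContext);
System.out.println(out.getEmployeeName());                        // prints CLARK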

Let’s write the logic that fetches employees from the Oracle database and puts them into a Map.

/**
 * 
 */
package com.spark.coherence.jdbc.dao.impl;

import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.Map;

import javax.sql.DataSource;

import org.springframework.dao.DataAccessException;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.ResultSetExtractor;

import com.spark.coherence.pof.beans.EmployeeBean;

/**
 * @author Sony
 *
 */
public class EmployeeDao {

	private JdbcTemplate jdbcTemplate;

	public EmployeeDao(DataSource dataSource) {
		this.jdbcTemplate = new JdbcTemplate(dataSource);
	}

	public Map<Integer, EmployeeBean> getAllEmployees() {
		String sql = "select * from emp";

		Map<Integer, EmployeeBean> map = this.jdbcTemplate.query(sql,
				new ResultSetExtractor<Map<Integer, EmployeeBean>>() {

					Map<Integer, EmployeeBean> map = new HashMap<>();

					public java.util.Map<Integer, EmployeeBean> extractData(
							ResultSet resultSet) throws SQLException,
							DataAccessException {
						
						while (resultSet.next()) {
							EmployeeBean employeeBean = new EmployeeBean(
									resultSet.getInt("empno"));
							employeeBean.setEmployeeName(resultSet
									.getString("ename"));
							employeeBean.setJob(resultSet.getString("job"));
							employeeBean.setCommission(resultSet.getInt("comm"));
							employeeBean.setDeptNo(resultSet.getInt("deptno"));
							employeeBean.setHireDate(resultSet
									.getDate("hiredate"));
							employeeBean.setManagerId(resultSet.getInt("mgr"));
							employeeBean.setSalary(resultSet.getInt("sal"));

							map.put(employeeBean.getKey(), employeeBean);
						}

						return map;

					};
				});

		return map;
	}
}

Let us create a CacheRepository class from which we get the NamedCache instance as shown below.

/**
 * 
 */
package com.spark.coherence.cache.repository;

import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;

/**
 * @author Sony
 *
 */
public class CacheRepository {

	public static NamedCache getEmployeeCache(){
		return CacheFactory.getCache("dist-employee-cache");
	}
}

Now it is time to call the service layer and put the data into the cache.

/**
 * 
 */
package com.spark.coherence.cache.service.impl;

import java.sql.SQLException;
import java.util.Map;
import java.util.concurrent.ExecutionException;

import org.springframework.beans.BeansException;

import com.spark.coherence.cache.repository.CacheRepository;
import com.spark.coherence.jdbc.connection.factory.CoherenceJdbcConnectionFactory;
import com.spark.coherence.jdbc.dao.impl.EmployeeDao;
import com.spark.coherence.pof.beans.EmployeeBean;
import com.tangosol.net.NamedCache;


/**
 * @author Sony
 *
 */
public class CacheService {

	public void preLoadCache(){
		try {
			Map<Integer, EmployeeBean> map = new EmployeeDao(CoherenceJdbcConnectionFactory.getJdbcDataSource()).getAllEmployees();
			NamedCache namedCache = CacheRepository.getEmployeeCache();
			namedCache.putAll(map);
		} catch (BeansException | InterruptedException | ExecutionException
				| SQLException e) {
			e.printStackTrace();
		}
	}
}
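In the standalone test below we call preLoadCache() from main, but in a web application you could trigger it once at startup from a ServletContextListener. A minimal sketch (CachePreLoadListener is a hypothetical name; register it in web.xml or annotate it with @WebListener):

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

import com.spark.coherence.cache.service.impl.CacheService;
import com.spark.coherence.jdbc.connection.factory.CoherenceJdbcConnectionFactory;

public class CachePreLoadListener implements ServletContextListener {

	@Override
	public void contextInitialized(ServletContextEvent sce) {
		// warm the cache once when the application is deployed
		new CacheService().preLoadCache();
	}

	@Override
	public void contextDestroyed(ServletContextEvent sce) {
		// release the background thread that loaded the Spring context
		CoherenceJdbcConnectionFactory.shutDownCurrentExecutorService();
	}
}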

That is it for the coding part; it is now time to test the application by running the main method.

/**
 * 
 */
package com.spark.main;

import java.util.ArrayList;
import java.util.Collection;

import com.spark.coherence.cache.repository.CacheRepository;
import com.spark.coherence.cache.service.impl.CacheService;
import com.spark.coherence.jdbc.connection.factory.CoherenceJdbcConnectionFactory;
import com.spark.coherence.pof.beans.EmployeeBean;
import com.tangosol.net.NamedCache;


/**
 * @author Sony
 *
 */
public class CoherenceTest {

	/**
	 * @param args
	 */
	@SuppressWarnings({ "unused", "unchecked" })
	public static void main(String[] args) {
		NamedCache namedCache = CacheRepository.getEmployeeCache();
		System.out.println(namedCache.getCacheService().getInfo().getServiceName());
		
		new CacheService().preLoadCache();
		EmployeeBean employeeBean = (EmployeeBean)namedCache.get(7782);
		System.out.println(employeeBean.getEmployeeName());

		Collection<Integer> ids = new ArrayList<>();
		ids.add(7654);
		ids.add(7698);
		
		Collection<EmployeeBean> employeeBeans = namedCache.getAll(ids).values();
		for (EmployeeBean employeeBean2 : employeeBeans) {
			System.out.println(employeeBean2.getEmployeeName());
		}
		
		CoherenceJdbcConnectionFactory.shutDownCurrentExecutorService();
	}

}

Please observe the main class carefully: it demonstrates the two public methods get(Object) and getAll(Collection). In my next post I will demonstrate how to use cache listeners to know when cache entries are inserted, updated, or deleted.
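As a small preview of that, a listener can be attached to the namedCache reference from the main class above (a sketch using the MapListener interface):

import com.tangosol.util.MapEvent;
import com.tangosol.util.MapListener;

namedCache.addMapListener(new MapListener() {
	public void entryInserted(MapEvent event) {
		System.out.println("inserted: " + event.getKey());
	}
	public void entryUpdated(MapEvent event) {
		System.out.println("updated: " + event.getKey());
	}
	public void entryDeleted(MapEvent event) {
		System.out.println("deleted: " + event.getKey());
	}
});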

Happy Coherence and Happy Coding 🙂

Handling Oracle Ref Cursors with Complex Objects in Spring

This write-up is very simple if you have already followed my previous two topics. Here we shall create a ref cursor that returns a group of complex objects. First, let's look at the database part below.

create or replace 
package pkg_state as
  procedure add_state(in_state in state_typ);
  procedure get_state(in_state_id in number,out_state out state_typ);
  TYPE CUR_DATA IS REF CURSOR;
  procedure get_all_states(out_state OUT CUR_DATA);
end pkg_state;
==================================================================================
create or replace 
package body pkg_state as
 
  procedure add_state(in_state in state_typ) is
  begin
    insert into states(id, state) values(in_state.id,in_state);
    commit;
  end add_state;
 
  procedure get_state(in_state_id in number,out_state out state_typ) is
  begin
    select state into out_state from states where id = in_state_id;
  end get_state;
  
  procedure get_all_states(out_state out CUR_DATA) is
  begin
    OPEN out_state FOR
    SELECT id,state
    FROM states
    ORDER BY id asc;
  end get_all_states;
  
end pkg_state;

Now we shall start coding our data access layer implementation, as shown below.

/**
 * 
 */
package com.spark.spring.data.dao;

import java.sql.SQLException;
import java.util.List;
import java.util.Map;

import javax.sql.DataSource;

import oracle.jdbc.driver.OracleTypes;

import org.springframework.jdbc.core.SqlOutParameter;
import org.springframework.jdbc.core.simple.SimpleJdbcCall;

import com.spark.spring.data.dao.rowmappers.GetAllStatesRowMapper;
import com.spark.spring.data.domain.State;

/**
 * @author PavanKumar Mantha
 * 
 */
public class StateDaoImpl {
	SimpleJdbcCall getAllStatesJdbcCall;

	public StateDaoImpl(DataSource dataSource) {

		this.getAllStatesJdbcCall = new SimpleJdbcCall(dataSource)
				.withCatalogName("pkg_state")
				.withProcedureName("get_all_states")
				.declareParameters(
						new SqlOutParameter("out_state", OracleTypes.CURSOR,
								new GetAllStatesRowMapper()));
	}
	
	public List<State> getAllStates(){
		getAllStatesJdbcCall.execute();
		Map<Integer, Object> states = GetAllStatesRowMapper.getObjects();
		List<State> list = null;
		try {
			list = GetAllStatesRowMapper.getAllStates(states);
		} catch (SQLException e) {
			e.printStackTrace();
		}
		return list;
	}
}

In the above code we have used a row mapper class that fetches complex objects from the ref cursor and converts them to Java domain objects. For the domain objects, please refer to my previous write-up. Note that GetAllStatesRowMapper collects rows into a static map, so results accumulate across calls; in production code you would clear or scope that map per invocation.

/**
 * 
 */
package com.spark.spring.data.dao.rowmappers;

import java.math.BigDecimal;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Set;

import oracle.sql.STRUCT;

import org.springframework.jdbc.core.RowCallbackHandler;

import com.spark.spring.data.domain.Capital;
import com.spark.spring.data.domain.State;

/**
 * @author PavanKumar Mantha
 *
 */
public class GetAllStatesRowMapper implements RowCallbackHandler {

	static Map<Integer, Object> states = new HashMap<Integer, Object>();

	public void processRow(ResultSet rs) throws SQLException {
		states.put(((BigDecimal)rs.getObject(1)).intValue(), rs.getObject(2));
	}

	public static Map<Integer, Object> getObjects(){
		return states;
	}
	
	/**
	 * @param states
	 * @return
	 * @throws SQLException
	 */
	public static List<State> getAllStates(Map<Integer, Object> states) throws SQLException{
		
		List<State> listStates = new ArrayList<State>();
		Object[] stateObjects;
		Object[] capitals;
		
		Set<Integer> keys = states.keySet();
		Iterator<Integer> it = keys.iterator();
		while(it.hasNext()){
			
			State state = new State();
			Capital capital = new Capital();
			
			stateObjects = ((STRUCT)states.get(it.next())).getAttributes();
			state.setId(((BigDecimal)stateObjects[0]).intValue());
			state.setStateName((String)stateObjects[1]);
			
			capitals = ((STRUCT)stateObjects[2]).getAttributes();
			capital.setCapitalId(((BigDecimal)capitals[0]).intValue());
			capital.setCapitalName((String)capitals[1]);
			
			state.setCapital(capital);
			listStates.add(state);
		}
		return listStates;
	}
}

Once the implementations of the DAO and row mapper classes are done, it is time to test with the main class below.

/**
 * 
 */
package com.spark.spring.data.test;

import java.util.List;

import com.spark.spring.data.dao.StateDaoImpl;
import com.spark.spring.data.domain.State;
import com.spark.spring.data.util.SpringDataUtil;

/**
 * @author PavanKumar Mantha
 * 
 */
public class SpringDataTest {

	/**
	 * @param args
	 */
	public static void main(String[] args) {
		try {			
			StateDaoImpl stateDaoImpl = (StateDaoImpl)SpringDataUtil.getObject("stateDaoImpl");	
			
			 List<State> states = stateDaoImpl.getAllStates();
			 for (State s: states) {
				System.out.println(s.getStateName() + " : " + s.getCapital().getCapitalName());
			}
			
		} catch (Exception e) {
			e.printStackTrace();
		}
	}

}

For the configuration files application-context.xml and jdbc.properties, please refer to my previous post and reuse the same files.

That's it: with the simple code above we can handle any number of complex objects returned from a ref cursor.

Happy Oracle Collections and Spring 🙂

Handling Oracle Collections using Spring 3 [Part- 2]

As a continuation of the first part, here we discuss how to handle complex object types between Java and Oracle. As an example, let's take a State object that internally holds a Capital object; this complete object should be persisted in a single column of a table, without joins or multiple tables. Let's create the database objects first, as below.

create or replace
type capital_typ as object (id number, capital_name varchar2(30))
=====================================================
create or replace
type state_typ as object(id number,state_name varchar2(30),capital capital_typ);
=====================================================
create table states(id number,state state_typ);

Now it is time to write the package and procedures that make use of the objects created above.

create or replace
package pkg_state as
  procedure add_state(in_state in state_typ);
  procedure get_state(in_state_id in number,out_state out state_typ);
end pkg_state;
==============================================================================
create or replace
package body pkg_state as
 
  procedure add_state(in_state in state_typ) is
  begin
    insert into states(id, state) values(in_state.id,in_state);
    commit;
  end add_state;
 
  procedure get_state(in_state_id in number,out_state out state_typ) is
  begin
    select state into out_state from states where id = in_state_id;
  end get_state;
end pkg_state;

Once the database is ready, let us start coding our domain objects as we did in Part 1.
State.java

/**
 * 
 */
package com.spark.spring.data.domain;

import java.sql.SQLData;
import java.sql.SQLException;
import java.sql.SQLInput;
import java.sql.SQLOutput;

/**
 * @author p7164517
 * 
 */
public class State implements SQLData{

	private int id;
	private String stateName;
	Capital capital;

	/**
	 * @return the capital
	 */
	public Capital getCapital() {
		return capital;
	}

	/**
	 * @param capital the capital to set
	 */
	public void setCapital(Capital capital) {
		this.capital = capital;
	}

	/**
	 * @return the id
	 */
	public int getId() {
		return id;
	}

	/**
	 * @param id
	 *            the id to set
	 */
	public void setId(int id) {
		this.id = id;
	}

	/**
	 * @return the stateName
	 */
	public String getStateName() {
		return stateName;
	}

	/**
	 * @param stateName
	 *            the stateName to set
	 */
	public void setStateName(String stateName) {
		this.stateName = stateName;
	}

	/*
	 * (non-Javadoc)
	 * 
	 * @see java.lang.Object#hashCode()
	 */
	@Override
	public int hashCode() {
		final int prime = 31;
		int result = 1;
		result = prime * result + id;
		return result;
	}

	/*
	 * (non-Javadoc)
	 * 
	 * @see java.lang.Object#equals(java.lang.Object)
	 */
	@Override
	public boolean equals(Object obj) {
		if (this == obj) {
			return true;
		}
		if (obj == null) {
			return false;
		}
		if (getClass() != obj.getClass()) {
			return false;
		}
		State other = (State) obj;
		if (id != other.id) {
			return false;
		}
		return true;
	}

	public String getSQLTypeName() throws SQLException {
		return "STATE_TYP";
	}

	public void readSQL(SQLInput sqlInput, String typeName) throws SQLException {
		// the read order must match the attribute order of STATE_TYP:
		// id, state_name, capital
		setId(sqlInput.readInt());
		setStateName(sqlInput.readString());
		setCapital((Capital) sqlInput.readObject());
	}

	public void writeSQL(SQLOutput sqlOutput) throws SQLException {
		sqlOutput.writeInt(getId());
		sqlOutput.writeString(getStateName());
		sqlOutput.writeObject(getCapital());
	}

}

Capital.java

/**
 * 
 */
package com.spark.spring.data.domain;

import java.sql.SQLData;
import java.sql.SQLException;
import java.sql.SQLInput;
import java.sql.SQLOutput;

/**
 * @author p7164517
 * 
 */
public class Capital implements SQLData {

	private int capitalId;
	private String capitalName;

	/**
	 * @return the capitalId
	 */
	public int getCapitalId() {
		return capitalId;
	}

	/**
	 * @param capitalId
	 *            the capitalId to set
	 */
	public void setCapitalId(int capitalId) {
		this.capitalId = capitalId;
	}

	/**
	 * @return the capitalName
	 */
	public String getCapitalName() {
		return capitalName;
	}

	/**
	 * @param capitalName
	 *            the capitalName to set
	 */
	public void setCapitalName(String capitalName) {
		this.capitalName = capitalName;
	}

	/*
	 * (non-Javadoc)
	 * 
	 * @see java.lang.Object#hashCode()
	 */
	@Override
	public int hashCode() {
		final int prime = 31;
		int result = 1;
		result = prime * result + capitalId;
		return result;
	}

	/*
	 * (non-Javadoc)
	 * 
	 * @see java.lang.Object#equals(java.lang.Object)
	 */
	@Override
	public boolean equals(Object obj) {
		if (this == obj) {
			return true;
		}
		if (obj == null) {
			return false;
		}
		if (getClass() != obj.getClass()) {
			return false;
		}
		Capital other = (Capital) obj;
		if (capitalId != other.capitalId) {
			return false;
		}
		return true;
	}

	public String getSQLTypeName() throws SQLException {
		return "CAPITAL_TYP";
	}

	public void readSQL(SQLInput sqlInput, String typeName) throws SQLException {
		setCapitalId(sqlInput.readInt());
		setCapitalName(sqlInput.readString());
	}

	public void writeSQL(SQLOutput sqlOutput) throws SQLException {
		sqlOutput.writeInt(getCapitalId());
		sqlOutput.writeString(getCapitalName());
	}

}

Let's start coding our DAO implementation class.

/**
 * 
 */
package com.spark.spring.data.dao;

import java.util.Collections;
import java.util.Map;

import javax.sql.DataSource;

import oracle.jdbc.driver.OracleTypes;

import org.springframework.jdbc.core.SqlOutParameter;
import org.springframework.jdbc.core.SqlParameter;
import org.springframework.jdbc.core.simple.SimpleJdbcCall;

import com.spark.spring.data.dao.rowmappers.SqlDataStateData;
import com.spark.spring.data.domain.State;

/**
 * @author PavanKumar Mantha
 * 
 */
public class StateDaoImpl {
	SimpleJdbcCall addStateJdbcCall, getStateJdbcCall;

	public StateDaoImpl(DataSource dataSource) {
		this.addStateJdbcCall = new SimpleJdbcCall(dataSource)
				.withCatalogName("pkg_state")
				.withProcedureName("add_state")
				.declareParameters(
						new SqlParameter("in_state", OracleTypes.STRUCT,
								"STATE_TYP"));

		this.getStateJdbcCall = new SimpleJdbcCall(dataSource)
				.withCatalogName("pkg_state")
				.withProcedureName("get_state")
				.declareParameters(
						new SqlOutParameter("out_state", OracleTypes.STRUCT,
								"STATE_TYP", new SqlDataStateData()));
	}

	@SuppressWarnings({ "rawtypes", "unchecked" })
	public void addState(final State state) {
		Map in = Collections.singletonMap("in_state", state);
		addStateJdbcCall.execute(in);
	}

	@SuppressWarnings({ "rawtypes", "unchecked" })
	public State getState(final int state_id) {
		Map in = Collections.singletonMap("in_state_id", state_id);
		State state = getStateJdbcCall.executeObject(State.class, in);
		return state;
	}
}

Now let's code our row mapper class. Observe that inside the State object we get one more object, Capital (object composition), which is unpacked from the nested STRUCT.

/**
 * 
 */
package com.spark.spring.data.dao.rowmappers;

import java.math.BigDecimal;
import java.sql.CallableStatement;
import java.sql.SQLException;

import oracle.sql.STRUCT;

import org.springframework.jdbc.core.SqlReturnType;

import com.spark.spring.data.domain.Capital;
import com.spark.spring.data.domain.State;

/**
 * @author p7164517
 *
 */
public class SqlDataStateData implements SqlReturnType{

	Object[] states;
	Object[] capitals;

	public Object getTypeValue(CallableStatement cs, int parameterIndex, int oraType,
			String typeName) throws SQLException {
		
		State state = new State();
		Capital capital = new Capital();
		
		states = ((STRUCT)cs.getObject(parameterIndex)).getAttributes();
		state.setId(((BigDecimal)states[0]).intValue());
		state.setStateName((String)states[1]);
		
		capitals = ((STRUCT)states[2]).getAttributes();
		capital.setCapitalId(((BigDecimal)capitals[0]).intValue());
		capital.setCapitalName((String)capitals[1]);
		
		state.setCapital(capital);
		return state;
	}

}

Now let's write a helper class that creates the ApplicationContext, looks up the DataSource, and fetches arbitrary beans, as shown below.

/**
 * 
 */
package com.spark.spring.data.util;

import java.sql.Connection;
import java.sql.SQLException;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import javax.sql.DataSource;

import oracle.sql.ARRAY;
import oracle.sql.ArrayDescriptor;

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

/**
 * @author PavanKumar Mantha
 * 
 */
public class SpringDataUtil {

	private static Future<ApplicationContext> applicationContext;

	static {
		ExecutorService executorService = Executors.newFixedThreadPool(1);
		applicationContext = executorService
				.submit(new Callable<ApplicationContext>() {
					public ApplicationContext call() throws Exception {
						return new ClassPathXmlApplicationContext(
								"application-context.xml");
					}

				});
	}

	/**
	 * @return
	 */
	public static ApplicationContext getContext() {
		try {
			return applicationContext.get();
		} catch (Exception e) {
			throw new RuntimeException(e.getMessage());
		}
	}

	/**
	 * @return
	 */
	public static DataSource getOracleDataSource() {
		try {
			return (DataSource) getContext().getBean("orclDataSource");
		} catch (Exception e) {
			throw new RuntimeException();
		}
	}

	/**
	 * @return
	 */
	public static DataSource getMysqleDataSource() {
		try {
			return (DataSource) getContext().getBean("mysqlDataSource");
		} catch (Exception e) {
			throw new RuntimeException();
		}
	}

	/**
	 * @param beanName
	 * @return
	 */
	public static Object getObject(String beanName) {
		return getContext().getBean(beanName);
	}

	/**
	 * @param connection
	 * @param sqlType
	 * @param typeName
	 * @param idsArray
	 * @return
	 * @throws SQLException
	 */
	public static <T> Object getOrclArrayType(Connection connection,
			int sqlType, String typeName, T[] idsArray) throws SQLException {

		ArrayDescriptor arrayDescriptor = ArrayDescriptor.createDescriptor(
				typeName, connection);
		return new ARRAY(arrayDescriptor, connection, idsArray);
	}
}

Now it is time to write application-context.xml and jdbc.properties, as below.

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:cache="http://www.springframework.org/schema/cache"
	xmlns:context="http://www.springframework.org/schema/context"
	xmlns:jdbc="http://www.springframework.org/schema/jdbc" xmlns:jee="http://www.springframework.org/schema/jee"
	xmlns:lang="http://www.springframework.org/schema/lang" xmlns:mvc="http://www.springframework.org/schema/mvc"
	xmlns:task="http://www.springframework.org/schema/task" xmlns:tx="http://www.springframework.org/schema/tx"
	xmlns:util="http://www.springframework.org/schema/util"
	xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
		http://www.springframework.org/schema/cache http://www.springframework.org/schema/cache/spring-cache-3.2.xsd
		http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.2.xsd
		http://www.springframework.org/schema/jdbc http://www.springframework.org/schema/jdbc/spring-jdbc-3.2.xsd
		http://www.springframework.org/schema/jee http://www.springframework.org/schema/jee/spring-jee-3.2.xsd
		http://www.springframework.org/schema/lang http://www.springframework.org/schema/lang/spring-lang-3.2.xsd
		http://www.springframework.org/schema/mvc http://www.springframework.org/schema/mvc/spring-mvc-3.2.xsd
		http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task-3.2.xsd
		http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-3.2.xsd
		http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util-3.2.xsd">

	<context:property-placeholder location="classpath:jdbc.properties" />

	<bean id="orclDataSource" class="oracle.jdbc.pool.OracleDataSource"
		destroy-method="close">
		<property name="URL" value="${orcl.jdbc.url}" />
		<property name="user" value="${orcl.jdbc.username}" />
		<property name="password" value="${orcl.jdbc.password}" />
		<property name="connectionCachingEnabled" value="true" />
	</bean>

	<bean name="mysqlDataSource"
		class="org.springframework.jdbc.datasource.DriverManagerDataSource">
		<property name="driverClassName" value="${mysql.jdbc.driver}" />
		<property name="url" value="${mysql.jdbc.url}" />
		<property name="username" value="${mysql.jdbc.user}" />
		<property name="password" value="${mysql.jdbc.password}" />
	</bean>
	
	<bean id="actorDaoImpl" class="com.spark.spring.data.dao.ActorDaoImpl">
		<constructor-arg ref="orclDataSource"></constructor-arg>
	</bean>
	
	<bean id="stateDaoImpl" class="com.spark.spring.data.dao.StateDaoImpl">
		<constructor-arg ref="orclDataSource"></constructor-arg>
	</bean>

</beans>

jdbc.properties:

orcl.jdbc.url=jdbc:oracle:thin:@localhost:1521:XE
orcl.jdbc.username=scott
orcl.jdbc.password=tiger

mysql.jdbc.driver=com.mysql.jdbc.Driver
mysql.jdbc.url=jdbc:mysql://localhost:3306/test
mysql.jdbc.user=root
mysql.jdbc.password=root

Now, as a final touch, we write a main class to test the application.

/**
 * 
 */
package com.spark.spring.data.test;

import com.spark.spring.data.dao.StateDaoImpl;
import com.spark.spring.data.domain.Capital;
import com.spark.spring.data.domain.State;
import com.spark.spring.data.util.SpringDataUtil;

/**
 * @author PavanKumar Mantha
 * 
 */
public class SpringDataTest {

	/**
	 * @param args
	 */
	public static void main(String[] args) {
		try {
			
			
			StateDaoImpl stateDaoImpl = (StateDaoImpl)SpringDataUtil.getObject("stateDaoImpl");
				
			Capital capital = new Capital();
			capital.setCapitalId(1);
			capital.setCapitalName("Austin");
			
			State state = new State();
			state.setId(1);
			state.setStateName("Texas");
			state.setCapital(capital);
			
			stateDaoImpl.addState(state);
			
			State state2 = stateDaoImpl.getState(1);
			System.out.println("State : "+state2.getStateName()+" Capital:"+state2.getCapital().getCapitalName());
			
		} catch (Exception e) {
			e.printStackTrace();
		}
	}

}

The project structure should resemble the screenshot below once everything is done.
(screenshot: OraCollections project structure)

Happy Oracle Collections and Spring 🙂

Handling Oracle Collections using Spring 3 [Part- 1]

Hi all, these days I got a little busy with a new assignment in my project, but I finally took some time today to write about a very nice concept: Oracle collections and handling them in the Spring Framework. Oracle provides a powerful feature called collections, with types such as table types, object types, and VARRAYs.

Let us create the database structure as shown below.

create or replace 
type actor_type as object (id number(10),name varchar2(50),age number);
=============================================================
 CREATE TABLE "ACTOR" 
   (	"ID" NUMBER(10,0), 
	"NAME" VARCHAR2(50 BYTE), 
	"AGE" NUMBER, 
	 PRIMARY KEY ("ID") );
=============================================================

After we create the object types, it is time to create the database procedures that make use of them. Note that delete_actors takes an actor_id_array collection type, which must also exist in the schema (for example, a nested table of NUMBER).

create or replace 
package PKG_ACTOR AS
  PROCEDURE add_actor (in_actor IN actor_type);
  PROCEDURE get_actor (in_actor_id IN NUMBER, out_actor OUT actor_type);
  PROCEDURE delete_actors (in_actor_ids IN actor_id_array);
END PKG_ACTOR;
==============================================================
create or replace 
package BODY PKG_ACTOR AS
  PROCEDURE add_actor (in_actor IN actor_type) IS
    BEGIN
      INSERT into actor values(in_actor.id,in_actor.name,in_actor.AGE);
      COMMIT;
    END add_actor;
    
    PROCEDURE get_actor (in_actor_id IN NUMBER, out_actor OUT actor_type) IS
    BEGIN
      SELECT actor_type(id, name, age)  into out_actor FROM actor WHERE id = in_actor_id;
    END get_actor;
    
    PROCEDURE delete_actors (in_actor_ids IN actor_id_array) IS
    begin
      FOR i IN 1..in_actor_ids.count loop
        DELETE FROM actor WHERE id = in_actor_ids(i);
        COMMIT;
      END LOOP;
    end delete_actors;
END PKG_ACTOR;
============================================================================

That's all from the database side; let's start handling these collection objects using Spring. Let us first create the domain object that will be serialized to the database, as below.

/**
 * 
 */
package com.spark.spring.data.domain;

/**
 * @author PavanKumar Mantha
 * 
 */
public class Actor {

	private Long id;
	private String name;
	private int age;

	public Long getId() {
		return id;
	}

	public void setId(Long id) {
		this.id = id;
	}

	public String getName() {
		return name;
	}

	public void setName(String name) {
		this.name = name;
	}

	public int getAge() {
		return age;
	}

	public void setAge(int age) {
		this.age = age;
	}

	public String toString() {
		return "Actor: [" + id + "] " + name + " " + age;
	}

	/**
	 * (non-Javadoc)
	 * 
	 * @see java.lang.Object#hashCode()
	 */

	@Override
	public int hashCode() {
		final int prime = 31;
		int result = 1;
		result = prime * result + ((id == null) ? 0 : id.hashCode());
		return result;
	}

	/**
	 * (non-Javadoc)
	 * 
	 * @see java.lang.Object#equals(java.lang.Object)
	 */
	@Override
	public boolean equals(Object obj) {
		if (this == obj) {
			return true;
		}
		if (obj == null) {
			return false;
		}
		if (!(obj instanceof Actor)) {
			return false;
		}
		Actor other = (Actor) obj;
		if (id == null) {
			if (other.id != null) {
				return false;
			}
		} else if (!id.equals(other.id)) {
			return false;
		}
		return true;
	}

}

Now let us create an Oracle type object that interacts directly with the underlying database.

/**
 * 
 */
package com.spark.spring.data.sqltypes;

import java.sql.SQLData;
import java.sql.SQLException;
import java.sql.SQLInput;
import java.sql.SQLOutput;

import com.spark.spring.data.domain.Actor;

/**
 * @author PavanKumar Mantha
 * 
 */
public class ActorType extends Actor implements SQLData {

	private final String sqlType = "ACTOR_TYPE";

	/**
	 * (non-Javadoc)
	 * 
	 * @see java.sql.SQLData#getSQLTypeName()
	 */
	public String getSQLTypeName() throws SQLException {
		return sqlType;
	}

	/**
	 * (non-Javadoc)
	 * 
	 * @see java.sql.SQLData#readSQL(java.sql.SQLInput, java.lang.String)
	 */
	public void readSQL(SQLInput sqlInput, String str) throws SQLException {
		setId(Long.valueOf(sqlInput.readLong()));
		setName(sqlInput.readString());
		setAge(sqlInput.readInt());
	}

	/**
	 * (non-Javadoc)
	 * 
	 * @see java.sql.SQLData#writeSQL(java.sql.SQLOutput)
	 */
	public void writeSQL(SQLOutput sqlOutput) throws SQLException {
		sqlOutput.writeLong(getId().longValue());
		sqlOutput.writeString(getName());
		sqlOutput.writeInt(getAge());
	}

}

After our type object is ready, we shall create the DAO implementation class responsible for persisting the domain object to the database.

/**
 * 
 */
package com.spark.spring.data.dao;

import java.sql.SQLException;
import java.sql.Types;
import java.util.Collections;
import java.util.Map;

import javax.sql.DataSource;

import oracle.jdbc.driver.OracleTypes;

import org.springframework.jdbc.core.SqlOutParameter;
import org.springframework.jdbc.core.SqlParameter;
import org.springframework.jdbc.core.SqlReturnType;
import org.springframework.jdbc.core.simple.SimpleJdbcCall;

import com.spark.spring.data.dao.rowmappers.SqlReturnSqlData;
import com.spark.spring.data.sqltypes.ActorType;
import com.spark.spring.data.util.SpringDataUtil;

/**
 * @author PavanKumar Mantha
 * 
 */
public class ActorDaoImpl {

	SimpleJdbcCall addActorJdbcCall, getActorJdbcCall, deleteActorJdbcCall;

	private static final String ADD_PROC_NAME = "add_actor";
	private static final String GET_PROC_NAME = "get_actor";

	private static final String CATALOG_NAME = "PKG_ACTOR";

	private final String TAB_TYPE = "ACTOR_TYPE";
	private final String ARRAY_TYPE = "ACTOR_ID_ARRAY";

	private final String IN_PARAM = "in_actor";
	private final String OUT_PARAM = "out_actor";

	public ActorDaoImpl(DataSource dataSource) {

		this.addActorJdbcCall = new SimpleJdbcCall(dataSource)
				.withCatalogName(CATALOG_NAME)
				.withProcedureName(ADD_PROC_NAME)
				.declareParameters(
						new SqlParameter(IN_PARAM, OracleTypes.STRUCT, TAB_TYPE));

		this.getActorJdbcCall = new SimpleJdbcCall(dataSource)
				.withCatalogName(CATALOG_NAME)
				.withProcedureName(GET_PROC_NAME)
				.declareParameters(
						new SqlOutParameter(OUT_PARAM, OracleTypes.STRUCT,
								TAB_TYPE,
								(SqlReturnType) new SqlReturnSqlData()));

		this.deleteActorJdbcCall = new SimpleJdbcCall(dataSource)
				.withCatalogName(CATALOG_NAME)
				.withProcedureName("delete_actors")
				.withoutProcedureColumnMetaDataAccess()
				.declareParameters(
						new SqlParameter("in_actor_ids", OracleTypes.ARRAY,
								"ACTOR_ID_ARRAY"));
	}

	/**
	 * @param actorType
	 */
	@SuppressWarnings({ "unchecked", "rawtypes" })
	public void addActor(final ActorType actorType) {
		Map inParams = Collections.singletonMap(IN_PARAM, actorType);
		addActorJdbcCall.execute(inParams);
	}

	/**
	 * @param actorId
	 * @return
	 * @throws SQLException
	 */
	@SuppressWarnings({ "unchecked", "rawtypes" })
	public ActorType getActor(final int actorId) throws SQLException{
		Map map = Collections.singletonMap("in_actor_id", actorId);
		return (ActorType) getActorJdbcCall.executeObject(ActorType.class, map);
	}

	/**
	 * @param ids
	 */
	@SuppressWarnings({ "unchecked", "rawtypes" })
	public void deleteActors(final Long[] ids) {
		Map in = null;
		try {
			in = Collections.singletonMap("in_actor_ids", SpringDataUtil
					.getOrclArrayType(SpringDataUtil.getOracleDataSource()
							.getConnection(), Types.ARRAY, ARRAY_TYPE, ids));
		} catch (SQLException e) {
			e.printStackTrace();
		}
		deleteActorJdbcCall.execute(in);
	}
}

Note: the type names in the above code are case-sensitive, because Oracle stores its metadata in upper case.
Now we have to implement the row mapper class that maps the columns from the database to our Java domain object, as below.

/**
 * 
 */
package com.spark.spring.data.dao.rowmappers;

import java.math.BigDecimal;
import java.sql.CallableStatement;
import java.sql.SQLException;

import oracle.sql.ARRAY;
import oracle.sql.Datum;
import oracle.sql.STRUCT;

import org.springframework.jdbc.core.SqlReturnType;

import com.spark.spring.data.sqltypes.ActorType;

/**
 * @author PavanKumar Mantha
 * @param <T>
 * 
 */
public class SqlReturnSqlData implements SqlReturnType {

	Object[] objects;
	Datum[] datums;

	/**
	 * (non-Javadoc)
	 * @see org.springframework.jdbc.core.SqlReturnType#getTypeValue(java.sql.CallableStatement, int, int, java.lang.String)
	 */
	public Object getTypeValue(CallableStatement cs, int parameterIndex,
			int sqlType, String typeName) throws SQLException {
		
		ActorType actorType = new ActorType();
		objects = ((STRUCT) cs.getObject(parameterIndex)).getAttributes();
		actorType.setId(((BigDecimal) objects[0]).longValue());
		actorType.setName((String) objects[1]);
		actorType.setAge(((BigDecimal) objects[2]).intValue());
		return actorType;
	}

}

Now let's write a helper class that creates the ApplicationContext, looks up the DataSource, and fetches arbitrary beans, as shown below.

/**
 * 
 */
package com.spark.spring.data.util;

import java.sql.Connection;
import java.sql.SQLException;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import javax.sql.DataSource;

import oracle.sql.ARRAY;
import oracle.sql.ArrayDescriptor;

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

/**
 * @author PavanKumar Mantha
 * 
 */
public class SpringDataUtil {

	private static Future<ApplicationContext> applicationContext;

	static {
		ExecutorService executorService = Executors.newFixedThreadPool(1);
		applicationContext = executorService
				.submit(new Callable<ApplicationContext>() {
					public ApplicationContext call() throws Exception {
						return new ClassPathXmlApplicationContext(
								"application-context.xml");
					}

				});
	}

	/**
	 * @return
	 */
	public static ApplicationContext getContext() {
		try {
			return applicationContext.get();
		} catch (Exception e) {
			throw new RuntimeException(e.getMessage());
		}
	}

	/**
	 * @return
	 */
	public static DataSource getOracleDataSource() {
		try {
			return (DataSource) getContext().getBean("orclDataSource");
		} catch (Exception e) {
			throw new RuntimeException();
		}
	}

	/**
	 * @return
	 */
	public static DataSource getMysqleDataSource() {
		try {
			return (DataSource) getContext().getBean("mysqlDataSource");
		} catch (Exception e) {
			throw new RuntimeException();
		}
	}

	/**
	 * @param beanName
	 * @return
	 */
	public static Object getObject(String beanName) {
		return getContext().getBean(beanName);
	}

	/**
	 * @param connection
	 * @param sqlType
	 * @param typeName
	 * @param idsArray
	 * @return
	 * @throws SQLException
	 */
	public static <T> Object getOrclArrayType(Connection connection,
			int sqlType, String typeName, T[] idsArray) throws SQLException {

		ArrayDescriptor arrayDescriptor = ArrayDescriptor.createDescriptor(
				typeName, connection);
		return new ARRAY(arrayDescriptor, connection, idsArray);
	}
}

In the above code we used the ExecutorService interface from java.util.concurrent to load the application context from the classpath asynchronously.

Now it is time to write application-context.xml and jdbc.properties, as below.

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:cache="http://www.springframework.org/schema/cache"
	xmlns:context="http://www.springframework.org/schema/context"
	xmlns:jdbc="http://www.springframework.org/schema/jdbc" xmlns:jee="http://www.springframework.org/schema/jee"
	xmlns:lang="http://www.springframework.org/schema/lang" xmlns:mvc="http://www.springframework.org/schema/mvc"
	xmlns:task="http://www.springframework.org/schema/task" xmlns:tx="http://www.springframework.org/schema/tx"
	xmlns:util="http://www.springframework.org/schema/util"
	xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
		http://www.springframework.org/schema/cache http://www.springframework.org/schema/cache/spring-cache-3.2.xsd
		http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.2.xsd
		http://www.springframework.org/schema/jdbc http://www.springframework.org/schema/jdbc/spring-jdbc-3.2.xsd
		http://www.springframework.org/schema/jee http://www.springframework.org/schema/jee/spring-jee-3.2.xsd
		http://www.springframework.org/schema/lang http://www.springframework.org/schema/lang/spring-lang-3.2.xsd
		http://www.springframework.org/schema/mvc http://www.springframework.org/schema/mvc/spring-mvc-3.2.xsd
		http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task-3.2.xsd
		http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-3.2.xsd
		http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util-3.2.xsd">

	<context:property-placeholder location="classpath:jdbc.properties" />

	<bean id="orclDataSource" class="oracle.jdbc.pool.OracleDataSource"
		destroy-method="close">
		<property name="URL" value="${orcl.jdbc.url}" />
		<property name="user" value="${orcl.jdbc.username}" />
		<property name="password" value="${orcl.jdbc.password}" />
		<property name="connectionCachingEnabled" value="true" />
	</bean>

	<bean name="mysqlDataSource"
		class="org.springframework.jdbc.datasource.DriverManagerDataSource">
		<property name="driverClassName" value="${mysql.jdbc.driver}" />
		<property name="url" value="${mysql.jdbc.url}" />
		<property name="username" value="${mysql.jdbc.user}" />
		<property name="password" value="${mysql.jdbc.password}" />
	</bean>
	
	<bean id="actorDaoImpl" class="com.spark.spring.data.dao.ActorDaoImpl">
		<constructor-arg ref="orclDataSource"></constructor-arg>
	</bean>
</beans>

jdbc.properties:

orcl.jdbc.url=jdbc:oracle:thin:@localhost:1521:XE
orcl.jdbc.username=scott
orcl.jdbc.password=tiger

mysql.jdbc.driver=com.mysql.jdbc.Driver
mysql.jdbc.url=jdbc:mysql://localhost:3306/test
mysql.jdbc.user=root
mysql.jdbc.password=root

Now, as a final touch, we write a main class to test the application.

/**
 * 
 */
package com.spark.spring.data.test;

import com.spark.spring.data.dao.ActorDaoImpl;
import com.spark.spring.data.sqltypes.ActorType;
import com.spark.spring.data.util.SpringDataUtil;

/**
 * @author PavanKumar Mantha
 * 
 */
public class SpringDataTest {

	/**
	 * @param args
	 */
	public static void main(String[] args) {
		try {
			
			ActorDaoImpl actorDaoImpl = (ActorDaoImpl)SpringDataUtil.getObject("actorDaoImpl");
			
			ActorType actorType = new ActorType();
			actorType.setId(1L);
			actorType.setName("Pavan Kumar Mantha");
			actorType.setAge(26);
			
			/*actorDaoImpl.addActor(actorType);
			System.out.println("----------------------- Record inserted -----------------------");*/
			
			ActorType type = actorDaoImpl.getActor(2);
			System.out.println(type.getName());
			
			/*Long[] ids = new Long[1];
			ids[0] = 1L;
			actorDaoImpl.deleteActors(ids);*/
			
			
		} catch (Exception e) {
			e.printStackTrace();
		}
	}

}

Note: uncomment actorDaoImpl.addActor(actorType); to insert the record, and the bottom commented lines to delete a set of records as a batch operation by passing an array of ids.
The project structure should resemble the screenshot below once everything is done.
(screenshot: OraCollections project structure)

Let's see the handling of complex object types in my next post [Part 2].
Happy Oracle Collections and Spring 🙂

Implementing Oracle Coherence Cache in Web Applications

Hi all, in the past I wrote an article on Oracle Coherence cache and a simple implementation of it. For the past couple of months we have been implementing Coherence in our project, and I thought I would describe that implementation here. Let's start with the topic by creating a database table as below.

create table product(product_id varchar2(5),product_name varchar2(50),unit_price varchar2(5),quantity varchar2(5));

Prior to building this example, you need to have Coherence and WebLogic installed (the code was tested with Coherence 3.7 and WebLogic 12c).

Create a dynamic web project in Eclipse and add coherence.jar as a dependent library; it acts as the Coherence server inside the WebLogic application server.
(screenshot: proj_struct, the project layout)

Now let's design the form page used to insert the details.
Note: here I am using the Servlet 3.0 API, and to get the database connection I use @Resource injection (per the JSR specification).

<%@ page language="java" contentType="text/html; charset=ISO-8859-1"
	pageEncoding="ISO-8859-1"%>
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<title>Insert title here</title>
<script scr="js/jquery.js"></script>
<script>
	$(document).ready(function() {
		if (!("autofocus" in document.createElement("input"))) {
			$("#q").focus();
		}
	});
</script>
</head>
<body>
	<form action="writetoCache" method="post">
	<table>
		<tr>
			<td>Product ID:</td>
			<td><input name="prod_id" id="prodid" autofocus required></td>
		</tr>
		<tr>
			<td>Product Name:</td>
			<td><input name="productName" id="name" required></td>
		</tr>
		<tr>
			<td>Unit Price:</td>
			<td><input name="unitPrice" id="unit_price"></td>
		</tr>
		<tr>
			<td>Total Quantity:</td>
			<td><input name="totQty" id="totQty"></td>
		</tr>
		<tr>
			<td>
				<input type="submit" value="SaveDetails">
			</td>
		</tr>
	</table>
		 
	</form>
</body>
</html>

Now let's write the controller for the above JSP; its aim is to write the data to the database and to the cache server as well.

package com.spark.coherence.cache.controller;

import java.io.IOException;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.Map;

import javax.annotation.Resource;
import javax.servlet.RequestDispatcher;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.sql.DataSource;

import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;

/**
 * Servlet implementation class WriteToCacheController
 */
@WebServlet("/writetoCache")
public class WriteToCacheController extends HttpServlet {
	private static final long serialVersionUID = 1L;

	@Resource(name = "oracleDS")
	DataSource dataSource;

	Connection connection;
	PreparedStatement preparedStatement;

	/**
	 * @see HttpServlet#HttpServlet()
	 */
	public WriteToCacheController() {
		super();
		// TODO Auto-generated constructor stub
	}

	/**
	 * @see HttpServlet#doGet(HttpServletRequest request, HttpServletResponse
	 *      response)
	 */
	protected void doGet(HttpServletRequest request,
			HttpServletResponse response) throws ServletException, IOException {
		// TODO Auto-generated method stub
	}

	/**
	 * @see HttpServlet#doPost(HttpServletRequest request, HttpServletResponse
	 *      response)
	 */
	protected void doPost(HttpServletRequest request,
			HttpServletResponse response) throws ServletException, IOException {
		// TODO Auto-generated method stub

		String productId = request.getParameter("prod_id");
		String productName = request.getParameter("productName");
		String unitPrice = request.getParameter("unitPrice");
		String quantity = request.getParameter("totQty");
		String sql = "insert into product values(?,?,?,?)";
		try {
			connection = dataSource.getConnection();
			System.out.println("-------------- Connection Closed Status: "
					+ connection.isClosed() + " ------------------");
			preparedStatement = connection.prepareStatement(sql);
			preparedStatement.setString(1, productId);
			preparedStatement.setString(2, productName);
			preparedStatement.setString(3, unitPrice);
			preparedStatement.setString(4, quantity);

			preparedStatement.execute();
			writeToCache(productId, productName, unitPrice, quantity);
			
			RequestDispatcher requestDispatcher = request.getRequestDispatcher("getRecord.jsp");
			requestDispatcher.forward(request, response);

		} catch (SQLException e) {
			System.out
					.println("--------------------------Exception no record insert into Database !---------------------");
			e.printStackTrace();
		}
		
		
	}

	/**
	 * @param productId
	 * @param productName
	 * @param unitPrice
	 * @param quantity
	 */
	public void writeToCache(String productId, String productName,
			String unitPrice, String quantity) {

		CacheFactory.ensureCluster();
		NamedCache namedCache = CacheFactory.getCache("product");
		System.out.println("connected to cluster: "
				+ CacheFactory.getCluster().getClusterName()
				+ " with cache name " + namedCache.getCacheName());

		Map<String, String> dbObject = new HashMap<String, String>();
		dbObject.put("productId", productId);
		dbObject.put("productName", productName);
		dbObject.put("unitPrice", unitPrice);
		dbObject.put("quantity", quantity);

		// Put only if the key is not cached yet, so an existing entry is not overwritten.
		if (namedCache.get(productId) == null) {
			namedCache.put(productId, dbObject);
			System.out.println("------------- Object got cached! -----------------");
		}
	}
	
}

Once we are done inserting data into the database and the cache, it is time to fetch it, either from the database or from the cache. The logic here is: if the data is available in the cache, the controller pulls it from the cache; if not, it pulls it from the database and also updates the cache with the latest data, so that the next request is served from the cache instead of the database.

<%@page import="java.util.Map"%>
<%@ page language="java" contentType="text/html; charset=ISO-8859-1"
	pageEncoding="ISO-8859-1"%>
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<title>Insert title here</title>
</head>
<body>
	<form action="readCache" method="post">
		<table>
			<tr>
				<td>Product ID:</td>
				<td><input name="prod_id" id="prodid" autofocus required></td>
			</tr>
			<tr>
				<td><input type="submit" value="getDetails"></td>
			</tr>
		</table>
	</form>

	<%
		Map cacheObject = (Map) session.getAttribute("cacheObj");
		if (cacheObject != null) {
	%>
	<table>
		<thead>
			<tr>
				<td>Product ID</td>
				<td>Product Name</td>
				<td>Unit Price</td>
				<td>Quantity Ordered</td>
			</tr>
		</thead>
		<tbody>
			<tr>
				<td><%=cacheObject.get("productId")%></td>
				<td><%=cacheObject.get("productName")%></td>
				<td><%=cacheObject.get("unitPrice")%></td>
				<td><%=cacheObject.get("quantity")%></td>
			</tr>
		</tbody>
	</table>

	<%
		}
	%>
</body>
</html>

ReadFromCacheController.java:

/**
 * 
 */
package com.spark.coherence.cache.controller;

import java.io.IOException;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.Map;

import javax.annotation.Resource;
import javax.servlet.RequestDispatcher;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;
import javax.sql.DataSource;

import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;

/**
 * @author Sony
 * 
 */
@WebServlet("/readCache")
public class ReadFromCacheController extends HttpServlet {

	private static final long serialVersionUID = 1L;

	@Resource(name = "oracleDS")
	DataSource dataSource;

	Connection connection;
	PreparedStatement preparedStatement;
	/* (non-Javadoc)
	 * @see javax.servlet.http.HttpServlet#doGet(javax.servlet.http.HttpServletRequest, javax.servlet.http.HttpServletResponse)
	 */
	@Override
	protected void doGet(HttpServletRequest req, HttpServletResponse resp)
			throws ServletException, IOException {
		// The read form submits via POST; delegate GET to the default handling.
		super.doGet(req, resp);
	}

	/* (non-Javadoc)
	 * @see javax.servlet.http.HttpServlet#doPost(javax.servlet.http.HttpServletRequest, javax.servlet.http.HttpServletResponse)
	 */
	@Override
	protected void doPost(HttpServletRequest request, HttpServletResponse response)
			throws ServletException, IOException {

		CacheFactory.ensureCluster();
		NamedCache namedCache = CacheFactory.getCache("product");

		String productId = request.getParameter("prod_id");
		HttpSession httpSession = request.getSession();
		@SuppressWarnings("unchecked")
		Map<String, String> cacheObject = (Map<String, String>) namedCache.get(productId);

		if (cacheObject != null) {
			System.out.println("----------- Read from cache -------------");
			httpSession.setAttribute("cacheObj", cacheObject);
		} else {
			System.out.println("----------- Read from db and update cache -------------");
			httpSession.setAttribute("cacheObj", readFromDatabase(productId));
		}
		// Both branches render the same page, so forward once here.
		RequestDispatcher requestDispatcher = request.getRequestDispatcher("getRecord.jsp");
		requestDispatcher.forward(request, response);
	}
	
	public Map<String, String> readFromDatabase(String productId) {

		String sql = "select * from product where product_id = ?";
		Map<String, String> dbObject = new HashMap<String, String>();

		try {
			connection = dataSource.getConnection();
			preparedStatement = connection.prepareStatement(sql);
			preparedStatement.setString(1, productId);
			ResultSet resultSet = preparedStatement.executeQuery();

			while (resultSet.next()) {
				dbObject.put("productId", resultSet.getString("product_id"));
				dbObject.put("productName", resultSet.getString("product_name"));
				dbObject.put("unitPrice", resultSet.getString("unit_price"));
				dbObject.put("quantity", resultSet.getString("quantity"));
			}

		} catch (SQLException sqlException) {
			sqlException.printStackTrace();
		} finally {
			try {
				if (preparedStatement != null) preparedStatement.close();
				if (connection != null) connection.close();
			} catch (SQLException ignore) {
			}
		}
		// Only cache something we actually found in the database.
		if (!dbObject.isEmpty()) {
			writeToCache(dbObject, productId);
		}
		return dbObject;
	}
	
	/**
	 * @param dbObject the record just read from the database
	 * @param productId the cache key
	 */
	public void writeToCache(Map<String, String> dbObject,String productId) {

		CacheFactory.ensureCluster();
		NamedCache namedCache = CacheFactory.getCache("product");
		
		if (namedCache.get(productId) == null) {
			namedCache.put(productId, dbObject);
		}
		
	}
}

Now that all the coding is done, here is some project-related configuration. First, I have implemented a ServletContextListener so the application is notified of the contextInitialized and contextDestroyed events; a minimal sketch follows.
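
The listener source is not part of the original listing, so the class below is only a sketch: the class name is hypothetical, and it assumes the listener's job is to join the Coherence cluster when the web application starts and to release it on undeploy.

package com.spark.coherence.cache.listener;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

import com.tangosol.net.CacheFactory;

// Joins the Coherence cluster at startup and shuts it down when the context is destroyed.
@WebListener
public class CoherenceLifecycleListener implements ServletContextListener {

	@Override
	public void contextInitialized(ServletContextEvent sce) {
		CacheFactory.ensureCluster();
		System.out.println("----------- Joined Coherence cluster -----------");
	}

	@Override
	public void contextDestroyed(ServletContextEvent sce) {
		CacheFactory.shutdown();
		System.out.println("----------- Coherence resources released -----------");
	}
}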

Now, in order to connect to our Coherence server, the following two config files are required.

coherence-config.xml

<?xml version="1.0"?>
<!DOCTYPE cache-config SYSTEM "cache-config.dtd">
<cache-config>
    <caching-scheme-mapping>
        <cache-mapping>
            <cache-name>product</cache-name>
            <scheme-name>example-distributed</scheme-name>
        </cache-mapping>
    </caching-scheme-mapping>
 
    <caching-schemes>
        <distributed-scheme>
            <scheme-name>example-distributed</scheme-name>
            <service-name>DistributedCache</service-name>
            <backing-map-scheme>
                <local-scheme />
            </backing-map-scheme>
            <autostart>true</autostart>
        </distributed-scheme>
    </caching-schemes>
</cache-config>

tangosol-coherence-override.xml

<?xml version="1.0" encoding="UTF-8"?>
<coherence
	xmlns="http://xmlns.oracle.com/coherence/coherence-operational-config"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-operational-config http://xmlns.oracle.com/coherence/coherence-operational-config/1.2/coherence-operational-config.xsd">
	<cluster-config>
		<member-identity>
			<cluster-name>MyCluster</cluster-name>
		</member-identity>
		<unicast-listener>
			<address system-property="tangosol.coherence.localhost">192.168.0.100</address>
			<port system-property="tangosol.coherence.localport">8088</port>
		</unicast-listener>
		<multicast-listener>
			<time-to-live system-property="tangosol.coherence.ttl">60</time-to-live>
		</multicast-listener>
	</cluster-config>

	<configurable-cache-factory-config>
		<init-params>
			<init-param>
				<param-type>java.lang.String</param-type>
				<param-value system-property="tangosol.coherence.cacheconfig">
					coherence-config.xml
				</param-value>
			</init-param>
		</init-params>
	</configurable-cache-factory-config>
</coherence>
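
Coherence locates these files through system properties, so when you start the standalone cache server (and the WebLogic managed server hosting the web app) you can point it at them explicitly. A typical launch might look like the following; the classpath entries and file locations are assumptions for your environment:

java -cp coherence.jar:config \
	-Dtangosol.coherence.override=tangosol-coherence-override.xml \
	-Dtangosol.coherence.cacheconfig=coherence-config.xml \
	com.tangosol.net.DefaultCacheServer

Both JVMs (the cache server and the application server) must use the same cluster name and a compatible cache configuration, otherwise the web application ends up in its own one-node cluster.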

Now, as a final step, let's run the application by deploying it to the WebLogic server; the screenshots are below.

landing-page

Now fill in the details and submit the form; it should look as shown.

after-save

If we query a record that is not available in the cache, the controller fetches it from the database and updates the cache, as shown below.
read_and_update

If you query a record that is available in the cache, it is read directly from the cache and the database is not hit.
read_direct_cache

The advantage of implementing a caching layer is that it can drastically improve application performance.
Happy cache 🙂

Handling Oracle Stored Procedures with Arrays or Table Types in Spring

Hello guys, in this post let's discuss how to handle an Oracle procedure that takes custom or array types as input parameters and returns an array type or table type as the out parameter.

This example has been tested with Spring 3.2.x and Oracle 10g XE.
Note: run the scripts below in Oracle to create the required objects before writing the Java code.

-- custom type
create or replace type "KEYVALUEOBJECT" as object (
	name  varchar(255),
	value varchar(255)
);

-- array of KeyValueObject
create or replace type "MY_OBJECT_ARRAY" as table of KeyValueObject;

-- this proc doesn't do much: it simply copies the input array into the output array
create or replace procedure Sample_Proc(
	i_array in  MY_OBJECT_ARRAY,
	o_array out MY_OBJECT_ARRAY)
as
begin
	o_array := MY_OBJECT_ARRAY();
	for i in 1 .. i_array.count loop
		o_array.extend;
		o_array(i) := KeyValueObject(i_array(i).name, i_array(i).value);
	end loop;
end;
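
Before wiring this into Spring, it can help to sanity-check the procedure straight from SQL*Plus. The anonymous block below is my own quick test, not part of the original scripts (enable server output first):

set serveroutput on

declare
	i_arr MY_OBJECT_ARRAY := MY_OBJECT_ARRAY(KeyValueObject('color', 'blue'));
	o_arr MY_OBJECT_ARRAY;
begin
	Sample_Proc(i_arr, o_arr);
	-- the proc echoes its input, so this should print: color = blue
	dbms_output.put_line(o_arr(1).name || ' = ' || o_arr(1).value);
end;
/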

applicationContext.xml

<beans xmlns="http://www.springframework.org/schema/beans"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:p="http://www.springframework.org/schema/p"
	xmlns:jms="http://www.springframework.org/schema/jms" xmlns:context="http://www.springframework.org/schema/context"
	xsi:schemaLocation="
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/jms http://www.springframework.org/schema/jms/spring-jms-3.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">

	<bean id="propertyPlaceholderConfigurer"
		class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
		<property name="locations">
			<list>
				<value>jdbc.properties</value>
			</list>
		</property>
	</bean>

	<bean id="springDataSource"
		class="org.springframework.jdbc.datasource.SingleConnectionDataSource">
		<property name="driverClassName" value="${db.driver}" />
		<property name="url" value="${db.url}" />
		<property name="username" value="${db.user}" />
		<property name="password" value="${db.password}" />
	</bean>

	<bean id="keyValueDao" class="com.spring.jdbc.dao.tab.type.impl.KeyValueDaoImpl">
		<constructor-arg name="dataSource" ref="springDataSource"></constructor-arg>
	</bean>

</beans>
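
The ${db.*} placeholders above are resolved from a jdbc.properties file on the classpath, which the post does not list; a plausible version for Oracle XE (the values are assumptions, adjust them to your setup) is:

db.driver=oracle.jdbc.driver.OracleDriver
db.url=jdbc:oracle:thin:@localhost:1521:xe
db.user=scott
db.password=tiger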

KeyValueDao.java

package com.spring.jdbc.dao;

import com.sparing.jdbc.pojo.KeyValueObject;

/**
 * @author Sony
 *
 */
public interface KeyValueDao {

	public KeyValueObject[] getKeyValueObjects(KeyValueObject[] keyValueObjects);
	
}

KeyValueDaoImpl.java

/**
 * 
 */
package com.spring.jdbc.dao.tab.type.impl;

import javax.sql.DataSource;

import com.sparing.jdbc.pojo.KeyValueObject;
import com.spring.jdbc.dao.KeyValueDao;

/**
 * @author Sony
 *
 */
public class KeyValueDaoImpl extends TableTypeBaseProcedure implements KeyValueDao{

	public KeyValueDaoImpl(DataSource dataSource) {
		super(dataSource);
	}

	@Override
	public KeyValueObject[] getKeyValueObjects(KeyValueObject[] keyValueObjects) {
		
		return execute(keyValueObjects);
		
	}

}

The main business logic lives in this class: it declares the IN and OUT parameters in the same order as the procedure signature, and registers a custom SqlReturnType so the OUT collection can be read back through the connection's type map.
TableTypeBaseProcedure.java

/**
 * 
 */
package com.spring.jdbc.dao.tab.type.impl;

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Types;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import javax.sql.DataSource;

import oracle.sql.ARRAY;

import org.springframework.dao.DataRetrievalFailureException;
import org.springframework.jdbc.core.SqlOutParameter;
import org.springframework.jdbc.core.SqlParameter;
import org.springframework.jdbc.core.SqlReturnType;
import org.springframework.jdbc.object.StoredProcedure;

import com.sparing.jdbc.pojo.KeyValueObject;

/**
 * @author Sony
 * 
 */
public class TableTypeBaseProcedure extends StoredProcedure {

	private static final String PROC_NAME = "Sample_Proc";
	private static final String TAB_TYPE = "KEYVALUEOBJECT";
	
	private static final String MY_ARRAY = "MY_OBJECT_ARRAY";
	private static final String I_ARRAY = "input_array";
	private static final String O_ARRAY = "output_array";


	public TableTypeBaseProcedure(DataSource dataSource) {
		
		super(dataSource, PROC_NAME);
		
		declareParameter(new SqlParameter(I_ARRAY, Types.ARRAY, MY_ARRAY));
		declareParameter(new SqlOutParameter(O_ARRAY, Types.ARRAY, MY_ARRAY,
				new SqlReturnType() {

					@Override
					public Object getTypeValue(CallableStatement cs,
							int paramIndex, int sqlType, String typeName)
							throws SQLException {
						Connection connection = cs.getConnection();
						Map<String, Class<?>> typeMap = connection.getTypeMap();
						typeMap.put(TAB_TYPE, KeyValueObject.class);
						return cs.getObject(paramIndex);
					}
				}));
		compile();
	}

	public KeyValueObject[] execute(KeyValueObject[] keyValueObjects) {
		Map<String, Object> params = new HashMap<>();
		params.put(I_ARRAY, new KeyValueSqlType(keyValueObjects));

		Map<?, ?> result = execute(params);

		if ((!result.containsKey(O_ARRAY) || result.get(O_ARRAY) == null)) {
			return null;
		}

		try {
			Object[] resultArray = (Object[]) ((ARRAY) result.get(O_ARRAY))
					.getArray();

			return Arrays.copyOf(resultArray, resultArray.length,
					KeyValueObject[].class);
		} catch (SQLException e) {
			throw new DataRetrievalFailureException("Unable to retrieve array",
					e);
		}
	}

}

This is a helper class that converts the KeyValueObject[] array into an Oracle ARRAY for the IN parameter.
KeyValueSqlType.java

/**
 * 
 */
package com.spring.jdbc.dao.tab.type.impl;

import java.sql.Connection;
import java.sql.SQLException;

import oracle.sql.ARRAY;
import oracle.sql.ArrayDescriptor;

import org.springframework.jdbc.core.support.AbstractSqlTypeValue;

import com.sparing.jdbc.pojo.KeyValueObject;

/**
 * @author Sony
 *
 */
public class KeyValueSqlType extends AbstractSqlTypeValue{

	private KeyValueObject[] keyValueObjects;
	
	public KeyValueSqlType(KeyValueObject[] keyValueObjects) {
		this.keyValueObjects = keyValueObjects;
	}
	
	@Override
	protected Object createTypeValue(Connection connection, int sqlType,
			String typeName) throws SQLException {

		// createDescriptor is the documented factory method for ArrayDescriptor.
		ArrayDescriptor arrayDescriptor = ArrayDescriptor.createDescriptor(typeName, connection);
		return new ARRAY(arrayDescriptor, connection, keyValueObjects);
	}

}

The main domain object that flows in and out of the database. It implements SQLData so the JDBC driver can marshal it to and from the KEYVALUEOBJECT type.
KeyValueObject.java

/**
 * 
 */
package com.sparing.jdbc.pojo;

import java.sql.SQLData;
import java.sql.SQLException;
import java.sql.SQLInput;
import java.sql.SQLOutput;

import org.apache.commons.lang.builder.ToStringBuilder;
import org.apache.commons.lang.builder.ToStringStyle;

/**
 * @author Sony
 * 
 */
public class KeyValueObject implements SQLData {

	private String name;
	private String value;

	/**
	 * @return the name
	 */
	public String getName() {
		return name;
	}

	/**
	 * @param name
	 *            the name to set
	 */
	public void setName(String name) {
		this.name = name;
	}

	/**
	 * @return the value
	 */
	public String getValue() {
		return value;
	}

	/**
	 * @param value
	 *            the value to set
	 */
	public void setValue(String value) {
		this.value = value;
	}

	@Override
	public String getSQLTypeName() throws SQLException {
		return "KeyValueObject";
	}

	@Override
	public void readSQL(SQLInput stream, String typeName) throws SQLException {

		name = stream.readString();
		value = stream.readString();
	}

	@Override
	public void writeSQL(SQLOutput stream) throws SQLException {

		stream.writeString(name);
		stream.writeString(value);
	}

	@Override
	public String toString() {
		return ToStringBuilder.reflectionToString(this,
				ToStringStyle.SHORT_PREFIX_STYLE);
	}

}
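
To round things off, here is a small driver that exercises the DAO end to end. This class is not part of the original post: the class name is mine, and it assumes applicationContext.xml and jdbc.properties are on the classpath.

package com.spring.jdbc.main;

import org.springframework.context.support.ClassPathXmlApplicationContext;

import com.sparing.jdbc.pojo.KeyValueObject;
import com.spring.jdbc.dao.KeyValueDao;

public class StoredProcArrayDemo {

	public static void main(String[] args) {
		ClassPathXmlApplicationContext context =
				new ClassPathXmlApplicationContext("applicationContext.xml");
		try {
			KeyValueDao keyValueDao = (KeyValueDao) context.getBean("keyValueDao");

			// Build a one-element input array and push it through Sample_Proc.
			KeyValueObject kvo = new KeyValueObject();
			kvo.setName("color");
			kvo.setValue("blue");

			KeyValueObject[] result =
					keyValueDao.getKeyValueObjects(new KeyValueObject[] { kvo });

			// The procedure echoes its input, so this should print:
			// KeyValueObject[name=color,value=blue]
			if (result != null) {
				for (KeyValueObject out : result) {
					System.out.println(out);
				}
			}
		} finally {
			context.close();
		}
	}
}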

Happy Coding 🙂
Happy Oracle Stored Procedures with Arrays or Table Types in Spring 🙂