Oracle Fusion Middleware – “Concepts of Oracle Coherence”

Hi all, if you are dealing with large applications with heavy read and write loads, then every call to the database costs you dearly from a performance perspective. With that in mind, cache-driven applications are getting more and more attention these days. In this tutorial I will be talking about Oracle Coherence.

Note: This article assumes that you have already downloaded Coherence and that you know how to run a cache server.

These days it is very common to pre-populate the cache before the application starts using the data, and that is exactly what this example does.

Oracle Coherence revolves around the concept of named caches. The first thing you need to do in your code when working with Coherence is to obtain a reference to a named cache you want to work with. In order to do this, you
need to use the CacheFactory class, which exposes the getCache method as one of its public members.
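
For example, obtaining a cache reference is a one-liner; the name used below is the cache this tutorial works with later, and Coherence creates the cache on first use if it does not already exist:

NamedCache cache = CacheFactory.getCache("dist-employee-cache");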

Below is the project structure:

[Screenshot: coh-proj-struct]

First of all, let us write the configuration files.

Step 1: applicationContext.xml – this file gives Spring the means to instantiate our beans. In it I have defined the DataSource bean.

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">

	<bean id="dataSource"
		class="org.springframework.jdbc.datasource.DriverManagerDataSource">
		<property name="driverClassName" value="oracle.jdbc.driver.OracleDriver" />
		<property name="url" value="jdbc:oracle:thin:@localhost:1521:xe" />
		<property name="username" value="scott" />
		<property name="password" value="tiger" />
	</bean>
</beans>

Let us start configuring our custom cache by creating coherence-cache-config.xml.

<cache-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xmlns="http://xmlns.oracle.com/coherence/coherence-cache-config"
	xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-cache-config http://xmlns.oracle.com/coherence/coherence-cache-config/1.0/coherence-cache-config.xsd">	<!-- The defaults element defines factory-wide default settings. -->

	<defaults>
		<serializer system-property="tangosol.coherence.serializer" />
		<socket-provider system-property="tangosol.coherence.socketprovider" />
	</defaults>

	<caching-scheme-mapping>
		<cache-mapping>
			<cache-name>dist-*</cache-name>
			<scheme-name>example-distributed</scheme-name>
			<init-params>
				<init-param>
					<param-name>back-size-limit</param-name>
					<param-value>8MB</param-value>
				</init-param>
			</init-params>
		</cache-mapping>

		<cache-mapping>
			<cache-name>*</cache-name>
			<scheme-name>example-distributed</scheme-name>
		</cache-mapping>
	</caching-scheme-mapping>

	<caching-schemes>
		<!-- Distributed caching scheme. -->
		<distributed-scheme>
			<scheme-name>example-distributed</scheme-name>
			<service-name>DistributedCache</service-name>
			<serializer>
				<instance>
					<class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
					<init-params>
						<init-param>
							<param-type>String</param-type>
							<param-value>custom-pof-config.xml</param-value>
						</init-param>
					</init-params>
				</instance>
			</serializer>
			<backing-map-scheme>
				<read-write-backing-map-scheme>
					<internal-cache-scheme>
						<local-scheme>
						</local-scheme>
					</internal-cache-scheme>
				</read-write-backing-map-scheme>
			</backing-map-scheme>
			<autostart>true</autostart>
		</distributed-scheme>
	</caching-schemes>
</cache-config>
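
To make a cache server pick this file up, point the tangosol.coherence.cacheconfig system property at it when starting the node. A minimal sketch, assuming coherence.jar and the config file are on the classpath:

// Equivalent to passing -Dtangosol.coherence.cacheconfig=coherence-cache-config.xml
// on the cache server command line.
System.setProperty("tangosol.coherence.cacheconfig", "coherence-cache-config.xml");
com.tangosol.net.DefaultCacheServer.main(new String[0]);

Also note that the cache name dist-employee-cache used later in this tutorial matches the dist-* mapping above, so it is served by the example-distributed scheme.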

For every bean that we write, we need to add an entry to the file below, custom-pof-config.xml.

<?xml version="1.0"?>
<pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
	xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config http://xmlns.oracle.com/coherence/coherence-pof-config/1.0/coherence-pof-config.xsd">
	<user-type-list>
		<!-- include all "standard" Coherence POF user types -->
		<include>coherence-pof-config.xml</include>
		<user-type>
			<type-id>1000</type-id>
			<class-name>com.spark.coherence.pof.beans.EmployeeBean</class-name>
		</user-type>
	</user-type-list>
</pof-config>

We also need to provide our own tangosol-coherence-override.xml, pointing at our cache configuration file, as shown below.

<?xml version='1.0'?>
<coherence
	xmlns="http://xmlns.oracle.com/coherence/coherence-operational-config"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-operational-config http://xmlns.oracle.com/coherence/coherence-operational-config/1.2/coherence-operational-config.xsd">
	<cluster-config>
		<!-- <member-identity> <cluster-name>my-coherance-cluster</cluster-name> 
			</member-identity> -->

		<multicast-listener>
			<address>224.12.1.0</address>
			<port>12100</port>
			<time-to-live>60</time-to-live>
		</multicast-listener>
	</cluster-config>

	<configurable-cache-factory-config>
		<init-params>
			<init-param>
				<param-type>java.lang.String</param-type>
				<param-value system-property="tangosol.coherence.cacheconfig">
					coherence-cache-config.xml
				</param-value>
			</init-param>
		</init-params>
	</configurable-cache-factory-config>
</coherence>
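
As a quick sanity check that the override is picked up, you can join the cluster and print its name. A small sketch; since the member-identity element above is commented out, expect a generated cluster name:

// Join (or start) the cluster using the operational override on the classpath.
com.tangosol.net.Cluster cluster = com.tangosol.net.CacheFactory.ensureCluster();
System.out.println(cluster.getClusterName());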

Once done with the configuration files, let us start writing the JDBC connection factory class.

/**
 * 
 */
package com.spark.coherence.jdbc.connection.factory;

import java.sql.Connection;
import java.sql.SQLException;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import javax.sql.DataSource;

import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

/**
 * @author Sony
 *
 */
public class CoherenceJdbcConnectionFactory {

	private static Future<ApplicationContext> future;
	private static ExecutorService executorService = Executors.newFixedThreadPool(1);
	
	static{
		future = executorService.submit(new Callable<ApplicationContext>() {
			@Override
			public ApplicationContext call() throws Exception {
				return new ClassPathXmlApplicationContext("applicationContext.xml");
			}
		});
	}
	
	/**
	 * @return java.sql.Connection
	 * @throws InterruptedException
	 * @throws ExecutionException
	 * @throws BeansException
	 * @throws SQLException
	 */
	public static Connection getJdbcConnection() throws InterruptedException, ExecutionException, BeansException, SQLException{
		ApplicationContext applicationContext = future.get();
		return ((DataSource)applicationContext.getBean("dataSource")).getConnection();
	}
	
	/**
	 * @return javax.sql.DataSource
	 * @throws InterruptedException
	 * @throws ExecutionException
	 * @throws BeansException
	 * @throws SQLException
	 */
	public static DataSource getJdbcDataSource() throws InterruptedException, ExecutionException, BeansException, SQLException{
		ApplicationContext applicationContext = future.get();
		return ((DataSource)applicationContext.getBean("dataSource"));
	}
	
	private static ExecutorService getCurrentExecutorService(){
		return executorService;
	}
	
	public static void shutDownCurrentExecutorService(){
		getCurrentExecutorService().shutdown();
	}
}
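
Note that the Spring context is bootstrapped asynchronously in the static block, so the first call to future.get() blocks until the context is ready and later calls return immediately. Typical usage looks like this sketch (inside a method that declares or handles the checked exceptions):

// Obtain the Spring-managed DataSource, or a raw JDBC connection.
DataSource dataSource = CoherenceJdbcConnectionFactory.getJdbcDataSource();
Connection connection = CoherenceJdbcConnectionFactory.getJdbcConnection();

// Shut the helper's executor down once the application no longer needs it.
CoherenceJdbcConnectionFactory.shutDownCurrentExecutorService();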

Now let us create the POJO classes. There are two classes here: Entity.java and EmployeeBean.java. Entity.java is a generic base class whose key field is used as the cache key for EmployeeBean objects.

/**
 * 
 */
package com.spark.coherence.pof.beans;

import java.io.IOException;

import com.tangosol.io.pof.PofReader;
import com.tangosol.io.pof.PofWriter;
import com.tangosol.io.pof.PortableObject;

/**
 * @author Sony
 * 
 */
public class Entity<T> implements PortableObject{

	public T key;

	/**
	 * @return the key
	 */
	public T getKey() {
		return key;
	}

	/**
	 * @param key
	 *            the key to set
	 */
	public void setKey(T key) {
		this.key = key;
	}

	@SuppressWarnings("unchecked")
	@Override
	public void readExternal(PofReader pofReader) throws IOException {
		this.key = (T) pofReader.readObject(0);
	}

	@Override
	public void writeExternal(PofWriter pofWriter) throws IOException {
		pofWriter.writeObject(0, key);
	}

}

Next comes EmployeeBean.java, which extends Entity:

/**
 * 
 */
package com.spark.coherence.pof.beans;

import java.io.IOException;
import java.util.Date;

import com.tangosol.io.pof.PofReader;
import com.tangosol.io.pof.PofWriter;

/**
 * @author Sony
 * 
 */
public class EmployeeBean extends Entity<Integer> {

	private int employeeId;
	private String employeeName;
	private String job;
	private int managerId;
	private Date hireDate;
	private int salary;
	private int commission;
	private int deptNo;

	public EmployeeBean() {
		// POF requires a public no-arg constructor for deserialization.
	}

	public EmployeeBean(int employeeId) {
		super.setKey(employeeId);
		this.employeeId = employeeId;
	}

	/**
	 * @return the employeeId
	 */
	public int getEmployeeId() {
		return employeeId;
	}

	/**
	 * @param employeeId
	 *            the employeeId to set
	 */
	public void setEmployeeId(int employeeId) {
		this.employeeId = employeeId;
	}

	/**
	 * @return the employeeName
	 */
	public String getEmployeeName() {
		return employeeName;
	}

	/**
	 * @param employeeName
	 *            the employeeName to set
	 */
	public void setEmployeeName(String employeeName) {
		this.employeeName = employeeName;
	}

	public String getJob() {
		return job;
	}

	public void setJob(String job) {
		this.job = job;
	}

	public int getManagerId() {
		return managerId;
	}

	public void setManagerId(int managerId) {
		this.managerId = managerId;
	}

	public Date getHireDate() {
		return hireDate;
	}

	public void setHireDate(Date hireDate) {
		this.hireDate = hireDate;
	}

	public int getSalary() {
		return salary;
	}

	public void setSalary(int salary) {
		this.salary = salary;
	}

	public int getCommission() {
		return commission;
	}

	public void setCommission(int commission) {
		this.commission = commission;
	}

	public int getDeptNo() {
		return deptNo;
	}

	public void setDeptNo(int deptNo) {
		this.deptNo = deptNo;
	}

	@Override
	public void readExternal(PofReader pofReader) throws IOException {
		this.employeeId = pofReader.readInt(0);
		this.employeeName = pofReader.readString(1);
		this.job = pofReader.readString(2);
		this.managerId = pofReader.readInt(3);
		this.hireDate = pofReader.readDate(4);
		this.salary = pofReader.readInt(5);
		this.commission = pofReader.readInt(6);
		this.deptNo = pofReader.readInt(7);
		// Restore the inherited key: it is not written as a separate POF field,
		// so rebuild it from employeeId after deserialization.
		super.setKey(this.employeeId);
	}

	@Override
	public void writeExternal(PofWriter pofWriter) throws IOException {
		pofWriter.writeInt(0, employeeId);
		pofWriter.writeString(1, employeeName);
		pofWriter.writeString(2, job);
		pofWriter.writeInt(3, managerId);
		pofWriter.writeDate(4, hireDate);
		pofWriter.writeInt(5, salary);
		pofWriter.writeInt(6, commission);
		pofWriter.writeInt(7, deptNo);
	}

}
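
Before involving the cluster, it is worth checking the POF serialization in isolation, since the indexes used in readExternal and writeExternal must match exactly. A round-trip sketch, assuming custom-pof-config.xml is on the classpath (Binary and ExternalizableHelper live in com.tangosol.util):

ConfigurablePofContext pofContext = new ConfigurablePofContext("custom-pof-config.xml");
EmployeeBean before = new EmployeeBean(7782);
before.setEmployeeName("CLARK");

// Serialize to a Binary and back using the same POF context the cache uses.
Binary binary = ExternalizableHelper.toBinary(before, pofContext);
EmployeeBean after = (EmployeeBean) ExternalizableHelper.fromBinary(binary, pofContext);
System.out.println(after.getEmployeeName()); // should print CLARK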

Let us now write the logic to fetch employees from the Oracle database and put them into a Map.

/**
 * 
 */
package com.spark.coherence.jdbc.dao.impl;

import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import javax.sql.DataSource;

import org.springframework.dao.DataAccessException;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.ResultSetExtractor;

import com.spark.coherence.pof.beans.EmployeeBean;

/**
 * @author Sony
 *
 */
public class EmployeeDao {

	private JdbcTemplate jdbcTemplate;

	public EmployeeDao(DataSource dataSource) {
		this.jdbcTemplate = new JdbcTemplate(dataSource);
	}

	public Map<Integer, EmployeeBean> getAllEmployees() {
		String sql = "select * from emp";

		Map<Integer, EmployeeBean> map = this.jdbcTemplate.query(sql,
				new ResultSetExtractor<Map<Integer, EmployeeBean>>() {

					Map<Integer, EmployeeBean> map = new HashMap<>();

					@Override
					public Map<Integer, EmployeeBean> extractData(
							ResultSet resultSet) throws SQLException,
							DataAccessException {

						while (resultSet.next()) {
							// Build one EmployeeBean per row; empno doubles as the cache key.
							EmployeeBean employeeBean = new EmployeeBean(
									resultSet.getInt("empno"));
							employeeBean.setEmployeeName(resultSet
									.getString("ename"));
							employeeBean.setJob(resultSet.getString("job"));
							employeeBean.setCommission(resultSet.getInt("comm"));
							employeeBean.setDeptNo(resultSet.getInt("deptno"));
							employeeBean.setHireDate(resultSet
									.getDate("hiredate"));
							employeeBean.setManagerId(resultSet.getInt("mgr"));
							employeeBean.setSalary(resultSet.getInt("sal"));

							map.put(employeeBean.getKey(), employeeBean);
						}

						return map;
					}
				});

		return map;
	}
}

Let us create a CacheRepository class from which we get the NamedCache instance as shown below.

/**
 * 
 */
package com.spark.coherence.cache.repository;

import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;

/**
 * @author Sony
 *
 */
public class CacheRepository {

	public static NamedCache getEmployeeCache(){
		return CacheFactory.getCache("dist-employee-cache");
	}
}

Now it is time to call the service layer and put the data into the cache.

/**
 * 
 */
package com.spark.coherence.cache.service.impl;

import java.sql.SQLException;
import java.util.Map;
import java.util.concurrent.ExecutionException;

import org.springframework.beans.BeansException;

import com.spark.coherence.cache.repository.CacheRepository;
import com.spark.coherence.jdbc.connection.factory.CoherenceJdbcConnectionFactory;
import com.spark.coherence.jdbc.dao.impl.EmployeeDao;
import com.spark.coherence.pof.beans.EmployeeBean;
import com.tangosol.net.NamedCache;


/**
 * @author Sony
 *
 */
public class CacheService {

	public void preLoadCache(){
		try {
			Map<Integer, EmployeeBean> map = new EmployeeDao(CoherenceJdbcConnectionFactory.getJdbcDataSource()).getAllEmployees();
			NamedCache namedCache = CacheRepository.getEmployeeCache();
			namedCache.putAll(map);
		} catch (BeansException | InterruptedException | ExecutionException
				| SQLException e) {
			e.printStackTrace();
		}
	}
}
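
One design note: namedCache.putAll(map) pushes the whole pre-loaded map to the cluster in bulk, which takes far fewer network round trips than calling put once per employee.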

That is it, we are done with the coding part. It is now time to test the application by running the main method.

/**
 * 
 */
package com.spark.main;

import java.util.ArrayList;
import java.util.Collection;

import com.spark.coherence.cache.repository.CacheRepository;
import com.spark.coherence.cache.service.impl.CacheService;
import com.spark.coherence.jdbc.connection.factory.CoherenceJdbcConnectionFactory;
import com.spark.coherence.pof.beans.EmployeeBean;
import com.tangosol.net.NamedCache;


/**
 * @author Sony
 *
 */
public class CoherenceTest {

	/**
	 * @param args
	 */
	@SuppressWarnings({ "unused", "unchecked" })
	public static void main(String[] args) {
		NamedCache namedCache = CacheRepository.getEmployeeCache();
		System.out.println(namedCache.getCacheService().getInfo().getServiceName());
		
		new CacheService().preLoadCache();
		EmployeeBean employeeBean = (EmployeeBean)namedCache.get(7782);
		System.out.println(employeeBean.getEmployeeName());

		Collection<Integer> ids = new ArrayList<>();
		ids.add(7654);
		ids.add(7698);
		
		Collection<EmployeeBean> employeeBeans = namedCache.getAll(ids).values();
		for (EmployeeBean employeeBean2 : employeeBeans) {
			System.out.println(employeeBean2.getEmployeeName());
		}
		
		CoherenceJdbcConnectionFactory.shutDownCurrentExecutorService();
	}

}
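
With the classic SCOTT.EMP demo data (assuming the standard rows are present), this should print the cache service name, then CLARK for key 7782, followed by MARTIN and BLAKE for keys 7654 and 7698.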

Please observe the main class carefully: I have demonstrated the two public methods get(Object) and getAll(Collection). In my next post I will demonstrate how to use CacheListeners to know when cache entries are inserted, updated or deleted.

Happy Coherence and Happy Coding 🙂


Using a Bean Data Control in Oracle ADF 11g

Hi All, here is a small tutorial on Oracle ADF, showing how to use Data Controls. Basically, there are three different types of data controls in ADF:

  • Bean Data Control
  • URL service Data Control
  • PlaceHolder Data Control

In this tutorial I am going to elaborate on the Bean Data Control. It shows a simple databound application using JavaBeans and Oracle ADF Faces.

Step 1: Create a New Application and Project

1. From the main menu, choose File > New. In the New Gallery, expand the General category and select Applications. Then in the Items list, select Custom Application and click OK.

[Screenshot: new_project]

2. To follow along with the example, enter DataBoundADF as the application name and click Next.
3. Enter Model as the project name.
4. Click Finish.

The Projects panel in the Application Navigator should look like this:
[Screenshot: Model]

Step 2: Create a Simple JavaBean Class

1. In the Application Navigator, right-click the project you just created and choose New > General > Java Class, then click OK.

[Screenshot: new_javaclass]

2. In the Create Java Class dialog, enter Contact as the class name, and com.spark.adf.tutorial as the package name. Accept the default values and click OK.

3. In the source editor, add code to create a simple JavaBean class.

package com.spark.adf.tutorial;

/**
 * @author Pavan Kumar Mantha
 */
public class Contact {

    private String name;
    private String email;

    public Contact() {
        super();
    }

    public Contact(String name, String email) {
        this.name = name;
        this.email = email;
    }

    /**
     *
     * @param name
     */
    public void setName(String name) {
        this.name = name;
    }

    /**
     *
     * @return
     */
    public String getName() {
        return name;
    }

    public void setEmail(String email) {
        this.email = email;
    }

    public String getEmail() {
        return email;
    }
}

Click Save All to save your work.

In the Application Navigator, you should see Contact.java in the com.spark.adf.tutorial package in the Application Sources folder.

Step 3: Create a Service Class

1. In the Application Navigator, right-click the Model project and choose New > General > Java Class, then click OK.

2. In the Create Java Class dialog, enter AddressBook as the class name. Accept the package name as com.spark.adf.tutorial.service, and the remaining default values, then click OK.

3. In the source editor, add code to create a collection Java class. Delete all the generated code and replace it with the following:

package com.spark.adf.tutorial.service;


import com.spark.adf.tutorial.Contact;

import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

/**
 * @author Pavan Kumar Mantha
 */
public class AddressBook {

    List<Contact> contacts = new ArrayList<Contact>();

    public AddressBook() {
        super();
        contacts.add(new Contact("Steve", "steve@acme.org"));
        contacts.add(new Contact("Charles", "cyoung@acme.org"));
        contacts.add(new Contact("Karl", "kheint@acme.org"));
        contacts.add(new Contact("Mik", "mik_meir@acme.org"));
        contacts.add(new Contact("Yvonne", "yvonne_yvonne@acme.org"));
        contacts.add(new Contact("Sung", "superstar001@acme.org"));
        contacts.add(new Contact("Shailyn", "spatellina@acme.org"));
        contacts.add(new Contact("John", "jjb@acme.org"));
        contacts.add(new Contact("Ricky", "rmartz@acme.org"));
        contacts.add(new Contact("Shaoling", "shaoling@acme.org"));
        contacts.add(new Contact("Olga", "olga077@acme.org"));
        contacts.add(new Contact("Ron", "regert@acme.org"));
        contacts.add(new Contact("Juan", "jperen@acme.org"));
        contacts.add(new Contact("Uday", "udaykum@acme.org"));
        contacts.add(new Contact("Amin", "amin@acme.org"));
        contacts.add(new Contact("Sati", "sparek@acme.org"));
        contacts.add(new Contact("Kal", "kalyane.kushnane@acme.org"));
        contacts.add(new Contact("Prakash", "prakash01@acme.org"));
    }

    public List<Contact> findAllContacts() {
        return contacts;
    }

    public List<Contact> findContactByName(String name) {

        String namePattern = ".*" + (name != null ? name.toUpperCase() : "") + ".*";
        List<Contact> l_contacts = new ArrayList<Contact>();
        for (Contact contact : contacts) {
            if (Pattern.matches(namePattern, contact.getName().toUpperCase())) {
                l_contacts.add(contact);
            }
        }
        return l_contacts;
    }
}
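
The findContactByName method does a simple case-insensitive substring match via a regular expression. A quick standalone check, as a hypothetical test class that is not part of the ADF project:

public class AddressBookTest {
    public static void main(String[] args) {
        AddressBook book = new AddressBook();
        // ".*KA.*" matches Karl, Kal and Prakash in the sample data.
        System.out.println(book.findContactByName("ka").size()); // prints 3
    }
}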

4. Click Save All to save your work.

Step 4: Create a Data Control for the Service Class
1. In the Application Navigator, right-click AddressBook.java and choose Create Data Control.

[Screenshot: DataControls]

2. In the Bean Data Control Interface Chooser dialog, click OK without selecting an option.

JDeveloper adds the data control definition file (DataControls.dcx) to the project and opens the file in the overview editor.

[Screenshot: dcx_overview]

3. In the Application Navigator, expand the Data Controls panel, then expand AddressBook.

[Screenshot: DataControls]

Step 5: Create a JSF Page

1. From the main menu, choose File > New > General > Projects > Custom Project, then click OK.

[Screenshot: new_genproj]

2. Enter View as the project name. Then select ADF Faces from the Available list and shuttle it to the Selected list.

3. Click Finish.
You should see the View project in the Application Navigator.

4. In the Application Navigator, right-click the View project and choose New > Web Tier > JSF/Facelets > Page, then click OK.

5. In the Create JSF Page dialog, enter ContactList.jsf as the file name. Make sure Facelets is the selected document type.

6. On the Page Layout page, select Blank Page. On the Managed Bean page, select Do Not Automatically Expose UI Components in a Managed Bean.

7. Click OK.
By default JDeveloper displays the new JSF Facelets page in the visual editor.

8. In the Component Palette, on the ADF Faces page, in the Layout panel, drag Panel Stretch Layout and drop it on the blank page in the visual editor.

When you drag the component to the visual editor, you should see a target rectangle with the name Form on the page; this means the component you are dragging will be inserted inside that target component.

9. Click Save All to save your work.

Step 6: Bind an ADF Faces Table Component to the Service Bean
1. In the Data Controls panel, expand AddressBook, then expand findAllContacts().

[Screenshot: dcpalette3]

2. Click and drag Contact to the center facet on the page in the visual editor. From the Create context menu, choose Table > ADF Read-only Table.

[Screenshot: createcontextmenu]

3. In the Edit Table Columns dialog, select Enable Sorting.

[Screenshot: edittablecols]

4. Accept the default values and click OK.
The page in the visual editor should look similar to this:

[Screenshot: tableineditor]

5. In the Application Navigator, right-click ContactList.jsf and choose Run.

If the Create Default Domain dialog displays, enter the default password, for example weblogic1, in the Password and Confirm Password fields, then click OK.

The page in the browser should look similar to this:
[Screenshot: tableinbrowser1]

I will show how to bind to a parameterized method and a parameter in my next post.
Happy ADF data binding. 🙂