PrimeFaces 5 Example

Here is an example of how to use the JSF framework to develop a full-blown Java EE application. JSF is a specification originally from Sun, and popular component libraries such as PrimeFaces, RichFaces, and IceFaces are built on top of it. The advantage of using JSF is that UI components are bound directly to managed bean properties (value injection). In this example I want to show how to push data to a database and retrieve it back to the front end. The example is built with PrimeFaces 5.1, Maven, and MySQL.

Let us start with the Maven build file.
pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<groupId>com.spark.jsf.primefaces</groupId>
	<artifactId>PrimeFacesTutorial</artifactId>
	<version>0.0.1-SNAPSHOT</version>
	<packaging>war</packaging>
	<name>PrimeFacesTutorial</name>
	<description>PrimeFacesTutorial</description>

	<repositories>
		<repository>
			<id>prime-repo</id>
			<name>PrimeFaces Maven Repository</name>
			<url>http://repository.primefaces.org</url>
			<layout>default</layout>
		</repository>
	</repositories>

	<properties>
		<java.version>1.6</java.version>
		<servlet.version>3.0.1</servlet.version>
		<jsf.version>2.2.4</jsf.version>
		<primefaces.version>5.1</primefaces.version>
	</properties>

	<dependencies>
		<!-- Servlet -->
		<dependency>
			<groupId>javax.servlet</groupId>
			<artifactId>javax.servlet-api</artifactId>
			<version>${servlet.version}</version>
		</dependency>
		<!-- Faces Implementation -->
		<dependency>
			<groupId>com.sun.faces</groupId>
			<artifactId>jsf-impl</artifactId>
			<version>${jsf.version}</version>
		</dependency>
		<!-- Faces Library -->
		<dependency>
			<groupId>com.sun.faces</groupId>
			<artifactId>jsf-api</artifactId>
			<version>${jsf.version}</version>
		</dependency>
		<!-- Primefaces Version 5 -->
		<dependency>
			<groupId>org.primefaces</groupId>
			<artifactId>primefaces</artifactId>
			<version>${primefaces.version}</version>
		</dependency>
		<!-- JSP Library -->
		<dependency>
			<groupId>javax.servlet.jsp</groupId>
			<artifactId>javax.servlet.jsp-api</artifactId>
			<version>2.3.1</version>
		</dependency>
		<!-- JSTL Library -->
		<dependency>
			<groupId>javax.servlet</groupId>
			<artifactId>jstl</artifactId>
			<version>1.1.2</version>
		</dependency>

		<dependency>
			<groupId>mysql</groupId>
			<artifactId>mysql-connector-java</artifactId>
			<version>5.1.9</version>
		</dependency>
	</dependencies>

	<build>
		<sourceDirectory>src</sourceDirectory>
		<plugins>
			<plugin>
				<artifactId>maven-compiler-plugin</artifactId>
				<version>3.1</version>
				<configuration>
					<source>${java.version}</source>
					<target>${java.version}</target>
				</configuration>
			</plugin>
			<plugin>
				<artifactId>maven-war-plugin</artifactId>
				<version>2.3</version>
				<configuration>
					<warSourceDirectory>webapp</warSourceDirectory>
					<failOnMissingWebXml>false</failOnMissingWebXml>
				</configuration>
			</plugin>
		</plugins>
		<finalName>primefaces-tutorial</finalName>
	</build>
</project>

web.xml

<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xmlns="http://java.sun.com/xml/ns/javaee"
	xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd"
	id="WebApp_ID" version="3.0">
	<display-name>PrimeFaces Web Application</display-name>

	<!-- Change to "Production" when you are ready to deploy -->
	<context-param>
		<param-name>javax.faces.PROJECT_STAGE</param-name>
		<param-value>Development</param-value>
	</context-param>

	<context-param>
		<description>State saving method: 'client' or 'server' (=default). See
			JSF Specification 2.5.2</description>
		<param-name>javax.faces.STATE_SAVING_METHOD</param-name>
		<param-value>client</param-value>
	</context-param>

	<context-param>
		<param-name>javax.servlet.jsp.jstl.fmt.localizationContext</param-name>
		<param-value>resources.application</param-value>
	</context-param>

	<!-- Welcome page -->
	<welcome-file-list>
		<welcome-file>login.xhtml</welcome-file>
	</welcome-file-list>

	<!-- JSF mapping -->
	<servlet>
		<servlet-name>Faces Servlet</servlet-name>
		<servlet-class>javax.faces.webapp.FacesServlet</servlet-class>
		<load-on-startup>1</load-on-startup>
	</servlet>

	<!-- Map these files with JSF -->
	<servlet-mapping>
		<servlet-name>Faces Servlet</servlet-name>
		<url-pattern>/faces/*</url-pattern>
	</servlet-mapping>
	<servlet-mapping>
		<servlet-name>Faces Servlet</servlet-name>
		<url-pattern>*.jsf</url-pattern>
	</servlet-mapping>
	<servlet-mapping>
		<servlet-name>Faces Servlet</servlet-name>
		<url-pattern>*.faces</url-pattern>
	</servlet-mapping>
	<servlet-mapping>
		<servlet-name>Faces Servlet</servlet-name>
		<url-pattern>*.xhtml</url-pattern>
	</servlet-mapping>

</web-app>

faces-config.xml

<?xml version="1.0" encoding="UTF-8"?>
<faces-config
    xmlns="http://xmlns.jcp.org/xml/ns/javaee"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/web-facesconfig_2_2.xsd"
    version="2.2">
	

</faces-config>

With the configuration files in place, let us write the page from which the operations are performed. The stylesheet referenced below is expected under webapp/resources/css/custom-styles.css.
login.xhtml

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml"
	xmlns:ui="http://java.sun.com/jsf/facelets"
	xmlns:f="http://java.sun.com/jsf/core"
	xmlns:h="http://java.sun.com/jsf/html"
	xmlns:p="http://primefaces.org/ui">

<h:head>
	<h:outputStylesheet name="css/custom-styles.css" />
</h:head>
<h:body>
	<h:form>
		<p:panelGrid columns="2">
			<p:outputLabel value="FirstName"></p:outputLabel>
			<p:inputText value="#{loginBean.firstName}"></p:inputText>

			<p:outputLabel value="LastName"></p:outputLabel>
			<p:inputText value="#{loginBean.lastName}"></p:inputText>

			<p:outputLabel value="Email"></p:outputLabel>
			<p:inputText value="#{loginBean.emailText}"></p:inputText>

			<p:outputLabel value="Password"></p:outputLabel>
			<p:password value="#{loginBean.password}"></p:password>

			<p:commandButton action="#{loginBean.saveValues()}" value="Submit"></p:commandButton>
		</p:panelGrid>

		<p:dataTable var="loginBean" value="#{loginBean.loginBeans}">
			<p:column headerText="First Name">
				<h:outputText value="#{loginBean.firstName}" />
			</p:column>

			<p:column headerText="Last Name">
				<h:outputText value="#{loginBean.lastName}" />
			</p:column>

			<p:column headerText="Email Id">
				<h:outputText value="#{loginBean.emailText}" />
			</p:column>

			<p:column headerText="Password">
				<h:outputText value="#{loginBean.password}" />
			</p:column>
		</p:dataTable>
	</h:form>

</h:body>
</html>

Let us write a DTO to hold the data.
LoginBeanVO.java

/**
 * 
 */
package com.spark.jsf.vo;

import javax.faces.bean.ManagedBean;
import javax.faces.bean.SessionScoped;

/**
 * @author Sony
 *
 */
@ManagedBean(name="loginVO")
@SessionScoped
public class LoginBeanVO {

	private String firstName;
	private String lastName;
	private String emailText;
	private String password;
	/**
	 * @return the firstName
	 */
	public String getFirstName() {
		return firstName;
	}
	/**
	 * @param firstName the firstName to set
	 */
	public void setFirstName(String firstName) {
		this.firstName = firstName;
	}
	/**
	 * @return the lastName
	 */
	public String getLastName() {
		return lastName;
	}
	/**
	 * @param lastName the lastName to set
	 */
	public void setLastName(String lastName) {
		this.lastName = lastName;
	}
	/**
	 * @return the emailText
	 */
	public String getEmailText() {
		return emailText;
	}
	/**
	 * @param emailText the emailText to set
	 */
	public void setEmailText(String emailText) {
		this.emailText = emailText;
	}
	/**
	 * @return the password
	 */
	public String getPassword() {
		return password;
	}
	/**
	 * @param password the password to set
	 */
	public void setPassword(String password) {
		this.password = password;
	}
	/* (non-Javadoc)
	 * @see java.lang.Object#hashCode()
	 */
	@Override
	public int hashCode() {
		final int prime = 31;
		int result = 1;
		result = prime * result
				+ ((emailText == null) ? 0 : emailText.hashCode());
		result = prime * result
				+ ((firstName == null) ? 0 : firstName.hashCode());
		result = prime * result
				+ ((lastName == null) ? 0 : lastName.hashCode());
		result = prime * result
				+ ((password == null) ? 0 : password.hashCode());
		return result;
	}
	/* (non-Javadoc)
	 * @see java.lang.Object#equals(java.lang.Object)
	 */
	@Override
	public boolean equals(Object obj) {
		if (this == obj)
			return true;
		if (obj == null)
			return false;
		if (getClass() != obj.getClass())
			return false;
		LoginBeanVO other = (LoginBeanVO) obj;
		if (emailText == null) {
			if (other.emailText != null)
				return false;
		} else if (!emailText.equals(other.emailText))
			return false;
		if (firstName == null) {
			if (other.firstName != null)
				return false;
		} else if (!firstName.equals(other.firstName))
			return false;
		if (lastName == null) {
			if (other.lastName != null)
				return false;
		} else if (!lastName.equals(other.lastName))
			return false;
		if (password == null) {
			if (other.password != null)
				return false;
		} else if (!password.equals(other.password))
			return false;
		return true;
	}
	
	
}

Here comes the major part of our code: the backing bean that controls our JSF page. Several concepts are involved in the code below. We achieve dependency injection with the @ManagedProperty annotation, which injects the DAO and DTO objects into the backing bean, and we use the ExternalContext object to redirect back to the JSF page from the managed bean.
LoginBean.java

/**
 * 
 */
package com.spark.jsf.managed.beans;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import javax.annotation.PostConstruct;
import javax.faces.application.FacesMessage;
import javax.faces.bean.ManagedBean;
import javax.faces.bean.ManagedProperty;
import javax.faces.bean.SessionScoped;
import javax.faces.context.ExternalContext;
import javax.faces.context.FacesContext;
import javax.servlet.http.HttpServletRequest;

import org.primefaces.context.RequestContext;

import com.spark.jsf.dao.impl.LoginSaveDAOImpl;
import com.spark.jsf.vo.LoginBeanVO;
import com.spark.security.EncryptPassword;

/**
 * @author Sony
 *
 */
@ManagedBean(name = "loginBean")
@SessionScoped
public class LoginBean {

	private String firstName;
	private String lastName;
	private String emailText;
	private String password;
	private List<LoginBeanVO> loginBeans = new ArrayList<LoginBeanVO>();

	@ManagedProperty(value = "#{loginVO}")
	private LoginBeanVO loginBeanVO;

	@ManagedProperty(value = "#{loginSaveDao}")
	private LoginSaveDAOImpl loginSaveDAOImpl;

	/**
	 * @return the firstName
	 */
	public String getFirstName() {
		return firstName;
	}

	/**
	 * @param firstName the firstName to set
	 */
	public void setFirstName(String firstName) {
		this.firstName = firstName;
	}

	/**
	 * @return the lastName
	 */
	public String getLastName() {
		return lastName;
	}

	/**
	 * @param lastName the lastName to set
	 */
	public void setLastName(String lastName) {
		this.lastName = lastName;
	}

	/**
	 * @return the emailText
	 */
	public String getEmailText() {
		return emailText;
	}

	/**
	 * @param emailText the emailText to set
	 */
	public void setEmailText(String emailText) {
		this.emailText = emailText;
	}

	/**
	 * @return the password
	 */
	public String getPassword() {
		return password;
	}

	/**
	 * @param password the password to set
	 */
	public void setPassword(String password) {
		this.password = password;
	}

	/**
	 * @return the loginSaveDAOImpl
	 */
	public LoginSaveDAOImpl getLoginSaveDAOImpl() {
		return loginSaveDAOImpl;
	}

	/**
	 * @param loginSaveDAOImpl
	 *            the loginSaveDAOImpl to set
	 */
	public void setLoginSaveDAOImpl(LoginSaveDAOImpl loginSaveDAOImpl) {
		this.loginSaveDAOImpl = loginSaveDAOImpl;
	}

	/**
	 * @return the loginBeanVO
	 */
	public LoginBeanVO getLoginBeanVO() {
		return loginBeanVO;
	}

	/**
	 * @param loginBeanVO
	 *            the loginBeanVO to set
	 */
	public void setLoginBeanVO(LoginBeanVO loginBeanVO) {
		this.loginBeanVO = loginBeanVO;
	}

	/**
	 * @return the loginBeans
	 */
	public List<LoginBeanVO> getLoginBeans() {
		return loginBeans;
	}

	/**
	 * @param loginBeans the loginBeans to set
	 */
	public void setLoginBeans(List<LoginBeanVO> loginBeans) {
		this.loginBeans = loginBeans;
	}
	
	@PostConstruct
	public void getLoginDetails(){
		
		loginBeans = loginSaveDAOImpl.getLoginBeans();
	}

	public void saveValues() {

		loginBeanVO.setFirstName(firstName);
		loginBeanVO.setLastName(lastName);
		loginBeanVO.setEmailText(emailText);
		EncryptPassword encryptPassword = new EncryptPassword("");
		String encryptedPassword = encryptPassword.encrypt(password);

		loginBeanVO.setPassword(encryptedPassword);
		boolean status = loginSaveDAOImpl.saveCredentialsToDatabase(loginBeanVO);
		
		if(status){
			FacesMessage facesMessage = new FacesMessage(FacesMessage.SEVERITY_INFO, "Info", "User details saved successfully !");
			RequestContext.getCurrentInstance().showMessageInDialog(facesMessage);
			populateLoginBeanList();
			ExternalContext ec = FacesContext.getCurrentInstance().getExternalContext();
		    try {
				ec.redirect(((HttpServletRequest) ec.getRequest()).getRequestURI());
			} catch (IOException e) {
				e.printStackTrace();
			}
		}else{
			FacesMessage facesMessage = new FacesMessage(FacesMessage.SEVERITY_ERROR, "Error", "Failed to Save user, might be user with same email id already exist !");
			RequestContext.getCurrentInstance().showMessageInDialog(facesMessage);
		}
		
	}
	
	public void populateLoginBeanList(){
		loginBeans = loginSaveDAOImpl.getLoginBeans();
	}

}

Here is our DAO layer logic.
LoginSaveDAOImpl.java

/**
 * 
 */
package com.spark.jsf.dao.impl;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

import javax.faces.bean.ManagedBean;
import javax.faces.bean.SessionScoped;

import com.spark.jsf.db.DBUtil;
import com.spark.jsf.vo.LoginBeanVO;

/**
 * @author Sony
 *
 */
@ManagedBean(name= "loginSaveDao")
@SessionScoped
public class LoginSaveDAOImpl {

	List<LoginBeanVO> loginBeanVOs;
	
	public boolean saveCredentialsToDatabase(LoginBeanVO loginBeanVO) {
		boolean status = false;
		System.out.println(loginBeanVO.getFirstName() + "  "
				+ loginBeanVO.getLastName() + "  " + loginBeanVO.getPassword());
		Connection connection = DBUtil.getDBConnection();
		System.out.println("Got the connection object: " + connection);
		String sql = "insert into login(first_name,last_name,email_id,password) values(?,?,?,?)";
		if (!checkUserWithEmailExist(connection, loginBeanVO.getEmailText())) {
			try {
				PreparedStatement preparedStatement = connection
						.prepareStatement(sql);
				preparedStatement.setString(1, loginBeanVO.getFirstName());
				preparedStatement.setString(2, loginBeanVO.getLastName());
				preparedStatement.setString(3, loginBeanVO.getEmailText());
				preparedStatement.setString(4, loginBeanVO.getPassword());
				preparedStatement.execute();
				System.out.println("--------------- Values Saved to Database ---------------");
				status = true;
			} catch (SQLException e) {
				e.printStackTrace();
			}
		} else {
			System.out
					.println("------------ user already exist with the same email id ---------------");
			status = false;
		}
		return status;
	}

	private boolean checkUserWithEmailExist(Connection connection,
			String emailText) {
		String sql = "select * from login where email_id = ?";
		boolean userExist = false;
		try {
			PreparedStatement preparedStatement = connection
					.prepareStatement(sql);
			preparedStatement.setString(1, emailText);
			ResultSet resultSet = preparedStatement.executeQuery();
			while (resultSet.next()) {
				userExist = true;
			}
		} catch (SQLException e) {
			e.printStackTrace();
		}
		return userExist;
	}
	
	public List<LoginBeanVO> getLoginBeans(){
		loginBeanVOs = new ArrayList<LoginBeanVO>();
		String sql = "select * from login";
		Connection connection = DBUtil.getDBConnection();
		try {
			PreparedStatement preparedStatement = connection
					.prepareStatement(sql);
			ResultSet resultSet = preparedStatement.executeQuery();
			while (resultSet.next()) {
				LoginBeanVO beanVO = new LoginBeanVO();
				beanVO.setFirstName(resultSet.getString("first_name"));
				beanVO.setLastName((resultSet.getString("last_name")));
				beanVO.setEmailText((resultSet.getString("email_id")));
				beanVO.setPassword((resultSet.getString("password")));
				loginBeanVOs.add(beanVO);
			}
		} catch (SQLException e) {
			e.printStackTrace();
		}
		return loginBeanVOs;
	}
}
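
For reference, the DAO above expects a login table in the test schema that DBUtil (shown further below) connects to. The column names come from the insert statement; the column types and sizes below are assumptions. A minimal one-off sketch that creates the table over the same JDBC connection:

package com.spark.jsf.db;

import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

/**
 * One-off helper to create the login table used by LoginSaveDAOImpl.
 * Column names match the DAO's SQL; types and sizes are assumptions.
 */
public class CreateLoginTable {

	public static void main(String[] args) throws SQLException {
		Connection connection = DBUtil.getDBConnection();
		Statement statement = connection.createStatement();
		statement.executeUpdate("create table if not exists login ("
				+ "first_name varchar(100), "
				+ "last_name varchar(100), "
				+ "email_id varchar(100), "
				+ "password varchar(255))");
		statement.close();
		System.out.println("login table is ready");
	}
}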

The specialty of this example is that I have implemented the PBEWithMD5AndDES algorithm to encrypt and decrypt the passwords for security purposes.
SecurityUtils.java

/**
 * 
 */
package com.spark.security;

import java.security.spec.AlgorithmParameterSpec;
import java.security.spec.KeySpec;

import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.PBEParameterSpec;

/**
 * @author Sony
 *
 */
public class SecurityUtils {

	public static Cipher dCipher, eCipher;
	public SecurityUtils(String helperString) {
		// 8-byte salt
		byte[] salt = {(byte) 0xA9, (byte) 0x9B, (byte) 0xC8, (byte) 0x32,
                (byte) 0x56, (byte) 0x34, (byte) 0xE3, (byte) 0x03};
		
		// iteration count
		int iterationCount = 19;
		try {
			// generate a secret key and encrypt with DES using a pass phrase
			
			KeySpec keySpec = new PBEKeySpec(helperString.toCharArray(), salt, iterationCount);
			SecretKey secretKey = SecretKeyFactory.getInstance("PBEWithMD5AndDES").generateSecret(keySpec);
			
			eCipher = Cipher.getInstance(secretKey.getAlgorithm());
			dCipher = Cipher.getInstance(secretKey.getAlgorithm());
			
			AlgorithmParameterSpec algorithmParameterSpec = new PBEParameterSpec(salt, iterationCount);
			
			eCipher.init(Cipher.ENCRYPT_MODE, secretKey, algorithmParameterSpec);
			dCipher.init(Cipher.DECRYPT_MODE, secretKey, algorithmParameterSpec);
		} catch (Exception e) {
			e.printStackTrace();
		}
	}
}

EncryptPassword.java

/**
 * 
 */
package com.spark.security;

import java.io.UnsupportedEncodingException;

import javax.crypto.BadPaddingException;
import javax.crypto.IllegalBlockSizeException;

/**
 * @author Sony
 *
 */
public class EncryptPassword extends SecurityUtils{

	public EncryptPassword(String helperString) {
		super(helperString);
	}
	
	public String encrypt(String password){
		 try {
             // Encode the string into bytes using utf-8
             byte[] utf8 = password.getBytes("UTF8");
             // Encrypt
             byte[] enc = eCipher.doFinal(utf8);
             // Encode bytes to base64 to get a string
             return new sun.misc.BASE64Encoder().encode(enc);

      } catch (BadPaddingException e) {
    	  e.printStackTrace();
      } catch (IllegalBlockSizeException e) {
    	  e.printStackTrace();
      } catch (UnsupportedEncodingException e) {
    	  e.printStackTrace();
      }
      return null;
	}
}

DecryptPassword.java

/**
 * 
 */
package com.spark.security;

/**
 * @author Sony
 *
 */
public class DecryptPassword extends SecurityUtils {

	public DecryptPassword(String helperString) {
		super(helperString);
	}

	public String decrypt(String encryptedPassword) {
		try {
			// Decode base64 to get bytes
			byte[] dec = new sun.misc.BASE64Decoder()
					.decodeBuffer(encryptedPassword);
			// Decrypt
			byte[] utf8 = dCipher.doFinal(dec);
			// Decode using utf-8
			return new String(utf8, "UTF8");
		} catch (Exception e) {
			e.printStackTrace();
		}
		return null;
	}

}
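
For clarity, here is a hypothetical round-trip check of the two helpers (any shared pass phrase works; in the backing bean above an empty string is passed). Note that these classes rely on the internal sun.misc Base64 classes, so they compile on JDK 8 and earlier; on newer JDKs java.util.Base64 would be the replacement.

package com.spark.security;

/**
 * Hypothetical round-trip test for EncryptPassword / DecryptPassword.
 */
public class SecurityUtilsTest {

	public static void main(String[] args) {
		String passPhrase = "my-pass-phrase"; // assumption: any shared phrase
		String encrypted = new EncryptPassword(passPhrase).encrypt("secret123");
		String decrypted = new DecryptPassword(passPhrase).decrypt(encrypted);

		System.out.println("Encrypted : " + encrypted);
		System.out.println("Decrypted : " + decrypted); // prints secret123
	}
}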

Let us write a helper class that creates a connection object and returns it to the DAO layer.
DBUtil.java

/**
 * 
 */
package com.spark.jsf.db;

import java.sql.Connection;
import java.sql.DriverManager;

/**
 * @author Sony
 *
 */
public class DBUtil {

	private static Connection connection = null;
	
	private DBUtil(){
		try {
			Class.forName("com.mysql.jdbc.Driver");
			connection = DriverManager.getConnection("jdbc:mysql://localhost:3306/test", "root", "root");
		} catch (Exception e) {
			e.printStackTrace();
		}
	}
	
	public static Connection getDBConnection(){
		if(connection == null){
			return new DBUtil().getConnection();
		}else{
			return connection;
		}
	}

	/**
	 * @return the connection
	 */
	public Connection getConnection() {
		return connection;
	}

	/**
	 * @param connection the connection to set
	 */
	public  void setConnection(Connection connection) {
		this.connection = connection;
	}
	
}
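
One caveat with the helper above: it caches a single Connection for the lifetime of the application and keeps returning it even after it has been closed or dropped by MySQL. A hedged variant of getDBConnection that re-creates the connection when it is no longer usable (requires an import of java.sql.SQLException; Connection.isValid is available from JDBC 4 / Java 6 onward):

	public static Connection getDBConnection() {
		try {
			// re-create the connection if it was never opened or has gone stale
			if (connection == null || connection.isClosed() || !connection.isValid(2)) {
				new DBUtil(); // the constructor (re)initializes the static connection field
			}
		} catch (SQLException e) {
			e.printStackTrace();
		}
		return connection;
	}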

Finally, the output is as below.

(Screenshot: ui — the login form and saved records table rendered in the browser)


Pushing Data to Apache Solr Using SpringData

For the past few days I have been working with Apache Solr to index data in one of my projects, and today I want to elaborate on the concept. This post assumes that you already have Apache Solr installed on your system; if not, please install the latest (> 4.1) version from here. Follow the instructions on the Solr site to get the Solr server up and running. Once Solr is running, you can open the admin dashboard at http://localhost:8983/solr.

Now it is time for us to dive into the Spring code in order to push data to the Solr server. This project is written on top of Spring 4, Spring Data Solr, Solr 4, Maven, and Eclipse Luna.

First let us create a Maven project and put the code below in your pom.xml file; the project structure looks as below.

(Screenshot: SolrProjectStructure)

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<groupId>com.spark.solr</groupId>
	<artifactId>SolrSpringIntegration</artifactId>
	<version>0.0.1-SNAPSHOT</version>
	<packaging>war</packaging>
	<name>SolrSpringIntegration</name>
	<description>SolrSpringIntegration</description>
	<properties>
		<jdk.version>1.6</jdk.version>
		<spring.version>4.0.4.RELEASE</spring.version>
		<spring.security.version>3.2.3.RELEASE</spring.security.version>
		<jstl.version>1.2</jstl.version>
		<spring-data-solr.version>1.3.0.RELEASE</spring-data-solr.version>
	</properties>

	<dependencies>

		<!-- Spring dependencies -->
		<dependency>
			<groupId>org.springframework</groupId>
			<artifactId>spring-core</artifactId>
			<version>${spring.version}</version>
		</dependency>

		<dependency>
			<groupId>org.springframework</groupId>
			<artifactId>spring-web</artifactId>
			<version>${spring.version}</version>
		</dependency>

		<dependency>
			<groupId>org.springframework</groupId>
			<artifactId>spring-webmvc</artifactId>
			<version>${spring.version}</version>
		</dependency>

		<dependency>
			<groupId>org.springframework</groupId>
			<artifactId>spring-oxm</artifactId>
			<version>${spring.version}</version>
		</dependency>

		<dependency>
			<groupId>org.springframework</groupId>
			<artifactId>spring-jdbc</artifactId>
			<version>${spring.version}</version>
		</dependency>

		<!-- SOLR -->
		<dependency>
			<groupId>org.springframework.data</groupId>
			<artifactId>spring-data-solr</artifactId>
			<version>${spring-data-solr.version}</version>
		</dependency>

		<!-- jstl and servlet for jsp page -->
		<dependency>
			<groupId>javax.servlet</groupId>
			<artifactId>javax.servlet-api</artifactId>
			<version>3.0.1</version>
		</dependency>
		<dependency>
			<groupId>jstl</groupId>
			<artifactId>jstl</artifactId>
			<version>${jstl.version}</version>
		</dependency>

		<!-- json lib -->
		<dependency>
			<groupId>net.sf.json-lib</groupId>
			<artifactId>json-lib</artifactId>
			<version>2.4</version>
			<classifier>jdk15</classifier>
		</dependency>

		<!-- mysql dependency -->
		<dependency>
			<groupId>mysql</groupId>
			<artifactId>mysql-connector-java</artifactId>
			<version>5.1.9</version>
		</dependency>

		<dependency>
			<groupId>org.slf4j</groupId>
			<artifactId>slf4j-log4j12</artifactId>
			<version>1.7.7</version>
		</dependency>


	</dependencies>
	<build>
		<finalName>spark-solr</finalName>
		<plugins>
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-compiler-plugin</artifactId>
				<version>3.1</version>
				<configuration>
					<source>1.7</source>
					<target>1.7</target>
					<encoding>UTF-8</encoding>
				</configuration>
			</plugin>
		</plugins>
	</build>
</project>

Now it is time to start with the configuration. Since we are using Spring as the core framework, I will make use of Spring's Java configuration capability, so all my configuration is plain Java code without XML files. Let us start with the database config first.
DBConfig.java

/**
 * 
 */
package com.spark.solr.web.config;

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

/**
 * @author Sony
 *
 */
@Configuration
public class DBConfig {

	@Value("${db.driverClass}")
	String driverClassName;
	@Value("${db.url}")
	String url;
	@Value("${db.username}")
	String username;
	@Value("${db.password}")
	String password;
	
	@Bean
	public DataSource getDataSource(){
		DriverManagerDataSource driverManagerDataSource = new DriverManagerDataSource();
		driverManagerDataSource.setDriverClassName(driverClassName);
		driverManagerDataSource.setUrl(url);
		driverManagerDataSource.setUsername(username);
		driverManagerDataSource.setPassword(password);
		
		return driverManagerDataSource;
	}
	
	@Bean
	public JdbcTemplate getJdbcTemplate(DataSource dataSource){
		return new JdbcTemplate(dataSource);
	}
}

The above code is responsible for injecting the database properties, creating the DataSource, and finally injecting the DataSource into the JdbcTemplate.
Now let us write the Solr configuration integrated with Spring.
SolrConfiguration.java

/**
 * 
 */
package com.spark.solr.web.config;

import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.solr.core.SolrTemplate;

/**
 * @author Sony
 *
 */
@Configuration
public class SolrConfiguration {

	@Value("${solr.server.host}")
	private String solrUrl;
	
	@Bean
	public SolrTemplate getSolrTemplate(SolrServer server){
		SolrTemplate solrTemplate = new SolrTemplate(server);
		return solrTemplate;
	}
	
	@Bean
	public SolrServer getSolrServer(){
		System.out.println("--------->"+solrUrl);
		SolrServer solrServer = new HttpSolrServer(solrUrl);
		return solrServer;
	}
	
}

The above class is responsible for creating the SolrServer object and injecting it into SolrTemplate, through which we perform operations against the Solr server. Now, to make our project web-capable, let us code a WebApplicationInitializer as below, which acts like web.xml; this works only with a Servlet 3.0 container (Tomcat 7 and above).
WebAppInitializer.java

/**
 * 
 */
package com.spark.solr.web.config;

import javax.servlet.ServletContext;
import javax.servlet.ServletException;
import javax.servlet.ServletRegistration.Dynamic;

import org.springframework.web.WebApplicationInitializer;
import org.springframework.web.context.support.AnnotationConfigWebApplicationContext;
import org.springframework.web.servlet.DispatcherServlet;

/**
 * @author Sony
 *
 */
public class WebAppInitializer implements WebApplicationInitializer{

	public void onStartup(ServletContext servletContext)
			throws ServletException {
		
		AnnotationConfigWebApplicationContext annotationConfigWebApplicationContext = new AnnotationConfigWebApplicationContext();
		annotationConfigWebApplicationContext.register(WebApplicationConfig.class);
		annotationConfigWebApplicationContext.setServletContext(servletContext);
		
		Dynamic dynamic = servletContext.addServlet("dispatcher", new DispatcherServlet(annotationConfigWebApplicationContext));
		dynamic.addMapping("/solr/*");
		dynamic.setLoadOnStartup(1);
		
		
	}

}

Below is the global configuration class; it reads the property file, imports the other config classes, and enables detection of the different annotations.
WebApplicationConfig.java

/** 
 * 
 */
package com.spark.solr.web.config;

import org.springframework.beans.factory.config.PropertyPlaceholderConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import org.springframework.core.io.ClassPathResource;
import org.springframework.web.servlet.config.annotation.EnableWebMvc;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;

/**
 * @author Sony
 *
 */
@Configuration
@EnableWebMvc
@ComponentScan(basePackages = "com.spark.solr")
@Import({ SolrConfiguration.class, DBConfig.class })
public class WebApplicationConfig extends WebMvcConfigurerAdapter {

	@Bean
	public PropertyPlaceholderConfigurer getPropertyPlaceholderConfigurer() {
		PropertyPlaceholderConfigurer placeholderConfigurer = new PropertyPlaceholderConfigurer();
		placeholderConfigurer.setLocation(new ClassPathResource(
				"application.properties"));
		placeholderConfigurer.setIgnoreUnresolvablePlaceholders(true);
		return placeholderConfigurer;
	}

}

Now the core concept of Spring Data Solr comes in. Spring Data Solr, part of the larger Spring Data family, provides easy configuration and access to an Apache Solr search server from Spring applications. It offers both low-level and high-level abstractions for interacting with the store.

Derived queries and annotations for Solr-specific functionality allow easy integration into existing applications, and Spring Data provides repositories to handle the different Solr operations.
EntityRepository.java

/**
 * 
 */
package com.spark.solr.web.repository;

import org.springframework.data.solr.repository.SolrCrudRepository;

import com.spark.solr.web.model.Entity;

/**
 * @author Sony
 *
 */
public interface EntityRepository extends SolrCrudRepository<Entity, Integer>{

}
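
Beyond the CRUD operations inherited from SolrCrudRepository, derived query methods can be declared on the repository and Spring Data Solr builds the Solr query from the method name. The method below is only an illustration and is not used elsewhere in this post:

package com.spark.solr.web.repository;

import java.util.List;

import org.springframework.data.solr.repository.SolrCrudRepository;

import com.spark.solr.web.model.Entity;

public interface EntityRepository extends SolrCrudRepository<Entity, Integer> {

	// derived query: searches the indexed lastName field
	List<Entity> findByLastName(String lastName);
}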

Now let us write our service layer and its implementation, followed by the DAO and its implementation classes.
EntityService.java

/**
 * 
 */
package com.spark.solr.web.service;

import java.util.List;

/**
 * @author Sony
 *
 */
public interface EntityService {
	public void saveDocument();
	public void saveDocument(Object object);
	public void saveDocument(List<Object> objects);
}

EntityServiceImpl.java

/**
 * 
 */
package com.spark.solr.web.service.impl;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.solr.core.SolrTemplate;
import org.springframework.data.solr.repository.support.SolrRepositoryFactory;
import org.springframework.stereotype.Service;

import com.spark.solr.web.dao.EntityDAO;
import com.spark.solr.web.model.Entity;
import com.spark.solr.web.repository.EntityRepository;
import com.spark.solr.web.service.EntityService;

/**
 * @author Sony
 *
 */
@Service(value = "entityServiceImpl")
public class EntityServiceImpl implements EntityService {

	@Autowired
	SolrTemplate solrTemplate;

	@Autowired
	EntityDAO entityDao;

	EntityRepository entityRepository;
	
	/* (non-Javadoc)
	 * @see com.spark.solr.web.service.EntityService#saveDocument(java.lang.Object)
	 */
	@Override
	public void saveDocument(Object object) {
		// not used in this example

	}

	/* (non-Javadoc)
	 * @see com.spark.solr.web.service.EntityService#saveDocument(java.util.List)
	 */
	@Override
	public void saveDocument(List<Object> objects) {
		// not used in this example
	}
	
	/* (non-Javadoc)
	 * @see com.spark.solr.web.service.EntityService#saveDocument()
	 */
	@Override
	public void saveDocument() {
		getLogger().info("------------- Before fetching the details from database-----------");
		List<Map<String, Object>> values = entityDao.getDataFromDB();
		List<Entity> entities = new ArrayList<Entity>();
		
		for (Map<String, Object> map : values) {
			
			Entity entity = new Entity();
			
			entity.setEmployeeId(Integer.valueOf(map.get("EmployeeID").toString()));
			entity.setLastName(map.get("LastName").toString());
			entity.setFirstName(map.get("FirstName").toString());
			entity.setTitle(map.get("Title").toString());
			entity.setTitleOfCourtesy(map.get("TitleOfCourtesy").toString());
			entities.add(entity);
		}
		getLogger().info("------------- Before pushing documents to Solr server -----------");
		for (Entity entity : entities) {
			getEntityRepository().save(entity);
		}
		getLogger().info("------------- documents push to solr server done ! -----------");
	}
	
	/**
	 * @return
	 */
	public EntityRepository getEntityRepository(){
		return new SolrRepositoryFactory(solrTemplate.getSolrServer()).getRepository(EntityRepository.class);
	}

	public Logger getLogger(){
		return LoggerFactory.getLogger(EntityServiceImpl.class);
	}

}

EntityDAO.java

package com.spark.solr.web.dao;

import java.util.List;
import java.util.Map;


public interface EntityDAO {

	public List<Map<String, Object>> getDataFromDB();
}

EntityDAOImpl.java

/**
 * 
 */
package com.spark.solr.web.dao.impl;

import java.util.List;
import java.util.Map;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

import com.spark.solr.web.dao.EntityDAO;
import com.spark.solr.web.db.DBQueries;
import com.spark.solr.web.service.impl.EntityServiceImpl;

/**
 * @author Sony
 *
 */
@Repository(value="entityDao")
public class EntityDAOImpl implements EntityDAO {
	
	@Autowired
	private JdbcTemplate jdbcTemplate;

	/* (non-Javadoc)
	 * @see com.spark.solr.web.dao.EntityDAO#getDataFromDB()
	 */
	@Override
	public List<Map<String, Object>> getDataFromDB() {
		getLogger().info("----------- Querying database for the records -------------");
		getLogger().info("----------- sql query : "+DBQueries.GET_ALL_EMPLOYEES);
		List<Map<String, Object>> values = jdbcTemplate.queryForList(DBQueries.GET_ALL_EMPLOYEES);
		return values;
	}
	
	public Logger getLogger(){
		return LoggerFactory.getLogger(EntityDAOImpl.class);
	}

}

Now, to make a better abstraction between the DAO layer and the queries used in that layer, I implemented a separate utility class as below.
DBQueries.java

/**
 * 
 */
package com.spark.solr.web.db;

/**
 * @author Sony
 *
 */
public class DBQueries {

	public static final String GET_ALL_EMPLOYEES = "select * from employee";
	
}
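
The GET_ALL_EMPLOYEES query assumes an employee table in the test schema with at least the columns read in EntityServiceImpl above (EmployeeID, LastName, FirstName, Title, TitleOfCourtesy), much like a trimmed-down Northwind Employees table. A hedged sketch that creates it through the JdbcTemplate configured earlier (column types and sizes are assumptions):

package com.spark.solr.web.db;

import org.springframework.jdbc.core.JdbcTemplate;

/**
 * One-off helper to create the employee table queried above.
 * Column names match what EntityServiceImpl reads; types and sizes are assumptions.
 */
public class CreateEmployeeTable {

	public static void createIfMissing(JdbcTemplate jdbcTemplate) {
		jdbcTemplate.execute("create table if not exists employee ("
				+ "EmployeeID int primary key, "
				+ "LastName varchar(100), "
				+ "FirstName varchar(100), "
				+ "Title varchar(100), "
				+ "TitleOfCourtesy varchar(50))");
	}
}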

Finally, our entity model looks as below.
Entity.java

/**
 * 
 */
package com.spark.solr.web.model;

import org.apache.solr.client.solrj.beans.Field;
import org.springframework.data.annotation.Id;

/**
 * @author Sony
 *
 */
public class Entity {

	@Id
	@Field(value="id")
	private int employeeId;
	@Field(value="lastName")
	private String lastName;
	@Field(value="firstName")
	private String firstName;
	@Field(value="title")
	private String title;
	@Field(value="titleOfCourtesy")
	private String titleOfCourtesy;

	/**
	 * @return the employeeId
	 */
	public int getEmployeeId() {
		return employeeId;
	}

	/**
	 * @param employeeId
	 *            the employeeId to set
	 */
	public void setEmployeeId(int employeeId) {
		this.employeeId = employeeId;
	}

	/**
	 * @return the lastName
	 */
	public String getLastName() {
		return lastName;
	}

	/**
	 * @param lastName
	 *            the lastName to set
	 */
	public void setLastName(String lastName) {
		this.lastName = lastName;
	}

	/**
	 * @return the firstName
	 */
	public String getFirstName() {
		return firstName;
	}

	/**
	 * @param firstName
	 *            the firstName to set
	 */
	public void setFirstName(String firstName) {
		this.firstName = firstName;
	}

	/**
	 * @return the title
	 */
	public String getTitle() {
		return title;
	}

	/**
	 * @param title
	 *            the title to set
	 */
	public void setTitle(String title) {
		this.title = title;
	}

	/**
	 * @return the titleOfCourtesy
	 */
	public String getTitleOfCourtesy() {
		return titleOfCourtesy;
	}

	/**
	 * @param titleOfCourtesy
	 *            the titleOfCourtesy to set
	 */
	public void setTitleOfCourtesy(String titleOfCourtesy) {
		this.titleOfCourtesy = titleOfCourtesy;
	}

}

Now let us look at the resource files that I used in this project: 1. application.properties and 2. log4j.properties.
Please put the files below in the “src/main/resources” folder of your Maven project.
application.properties

solr.server.host=http://localhost:8983/solr

db.driverClass=com.mysql.jdbc.Driver
db.url=jdbc:mysql://localhost:3306/test
db.username=root
db.password=root

log4j.properties

log4j.rootLogger = INFO, rollingFile

log4j.appender.rollingFile=org.apache.log4j.RollingFileAppender
log4j.appender.rollingFile.File=D:/spring-solr-dev.log
log4j.appender.rollingFile.MaxFileSize=2MB
log4j.appender.rollingFile.MaxBackupIndex=2
log4j.appender.rollingFile.layout = org.apache.log4j.PatternLayout
log4j.appender.rollingFile.layout.ConversionPattern=%p %t %c - %m%n

Note: the log file is written to the D: drive as configured in log4j.properties; please change the location if you need to.

Now that we are done with the coding part, it is time for deployment. From the command prompt, navigate to the project location and run mvn clean package; this builds the project with all the required dependencies, and the final artifact is the spark-solr.war file.
Deploy this WAR file under Tomcat's webapps directory and put the URL below in your browser.
http://localhost:8080/spark-solr/solr/controller/load

Once we hit the URL, the controller calls the service layer; the service layer invokes the DAO layer to fetch the data from the database and return it to the service layer. In the service layer we have the Solr repository object, and using it we push the data to Solr.
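
The controller behind /controller/load is not listed in this post; below is a minimal sketch of what it could look like. The class name and package are assumptions; the mapping matches the URL above together with the /solr/* dispatcher mapping from WebAppInitializer.

package com.spark.solr.web.controller;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseBody;

import com.spark.solr.web.service.EntityService;

/**
 * Hypothetical controller that triggers the push to Solr.
 * Resolved as GET /spark-solr/solr/controller/load.
 */
@Controller
@RequestMapping("/controller")
public class SolrLoadController {

	@Autowired
	private EntityService entityService;

	@RequestMapping("/load")
	@ResponseBody
	public String load() {
		entityService.saveDocument();
		return "Documents pushed to Solr";
	}
}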

After the data is pushed, we can check it in Solr using the Solr admin page, as below.
(Screenshot: SolrAdmin)

Using my previous post, we can pull the data from Solr and show it on a custom UI.
Happy coding 🙂 Happy Solr 🙂