Wednesday, April 3, 2019

Spring Boot MyBatis Batch Insert with Annotations

Spring Boot has changed the way Java developers use the Spring framework for writing Java applications. MyBatis (formerly iBATIS) is a powerful, lightweight persistence framework, and it is the one I use most of the time.

Let's see how to do a MyBatis batch insert using annotations.

Gradle build file

dependencies {
    compile 'org.json:json:20131018'
    compile 'org.glassfish.jersey.core:jersey-client:2.27'
    compile 'org.glassfish.jersey.inject:jersey-hk2:2.27'
    implementation 'org.mybatis.spring.boot:mybatis-spring-boot-starter:2.0.0'
    compileOnly 'org.projectlombok:lombok'
    runtimeOnly 'mysql:mysql-connector-java'
    annotationProcessor 'org.projectlombok:lombok'
    testImplementation 'org.springframework.boot:spring-boot-starter-test'
}
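
The starter also needs a DataSource to build the SqlSessionFactory. A minimal application.properties sketch, assuming a local MySQL schema; the URL, schema name, and credentials below are placeholders:

spring.datasource.url=jdbc:mysql://localhost:3306/scheduler
spring.datasource.username=dbuser
spring.datasource.password=dbpassword
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver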

Mapper Class

package com.scheduler.mapper;

import com.scheduler.model.Resource;
import org.apache.ibatis.annotations.Insert;
import org.apache.ibatis.annotations.Mapper;
import org.apache.ibatis.annotations.Param;
import org.apache.ibatis.annotations.Select;
import java.util.List;

@Mapper
public interface ResourceMapper {

    @Select("select * from resource where account_id = #{accountId}")
    List<Resource> findAll(@Param("accountId") long accountId);

    @Insert("INSERT IGNORE INTO resource(id, account_id, region_id) VALUES(#{id},#{accountId}, #{regionId})")
    void insert(Resource resource);

}
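
The mapper and DAO reference a com.scheduler.model.Resource class that is not shown in the original post. A minimal sketch of what it could look like, assuming only the three columns used by the insert (the field names are assumptions):

package com.scheduler.model;

import lombok.Data;

// Hypothetical model matching the columns used in ResourceMapper (id, account_id, region_id).
@Data
public class Resource {
    private long id;
    private long accountId;
    private long regionId;
}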


Dao Class

package com.scheduler.dao;
import com.scheduler.mapper.ResourceMapper;
import com.scheduler.model.Resource;
import lombok.extern.java.Log;
import org.apache.ibatis.session.ExecutorType;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

import java.util.List;
import java.util.logging.Level;
@Log
@Repository
public class ResourceDao {
    private ResourceMapper resourceMapper;
    private SqlSessionFactory sqlSessionFactory;
    public ResourceDao(ResourceMapper resourceMapper, SqlSessionFactory sqlSessionFactory) {
        this.resourceMapper = resourceMapper;
        this.sqlSessionFactory = sqlSessionFactory;
    }
    public List<Resource> findAll(Long accountId) {
        return this.resourceMapper.findAll(accountId);
    }
    @Transactional
    public void insert(List<Resource> newResources) {
        // Open a session with the BATCH executor so the inserts are queued and sent together.
        try (SqlSession sqlSession = this.sqlSessionFactory.openSession(ExecutorType.BATCH)) {
            ResourceMapper batchMapper = sqlSession.getMapper(ResourceMapper.class);
            for (Resource resource : newResources) {
                batchMapper.insert(resource);
            }
            sqlSession.commit();
        } catch (Exception e) {
            log.log(Level.WARNING, "error occurred while adding data", e);
        }
    }

}


By default MyBatis uses a SimpleExecutor (ExecutorType.SIMPLE). To do batched inserts we need the BatchExecutor instead, which is what we ask for with this.sqlSessionFactory.openSession(ExecutorType.BATCH); the queued statements are then sent to the database when the session is committed.
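
A quick way to exercise the batch insert is from any Spring bean, for example a CommandLineRunner. The runner below is only an illustrative sketch; the class name and sample values are not part of the original post.

package com.scheduler;

import com.scheduler.dao.ResourceDao;
import com.scheduler.model.Resource;
import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;

import java.util.Arrays;

// Hypothetical runner that feeds a small list into the batch insert.
@Component
public class ResourceLoader implements CommandLineRunner {

    private final ResourceDao resourceDao;

    public ResourceLoader(ResourceDao resourceDao) {
        this.resourceDao = resourceDao;
    }

    @Override
    public void run(String... args) {
        Resource first = new Resource();
        first.setId(1L);
        first.setAccountId(100L);
        first.setRegionId(1L);

        Resource second = new Resource();
        second.setId(2L);
        second.setAccountId(100L);
        second.setRegionId(2L);

        // Both statements are queued on the BATCH session and sent together on commit.
        resourceDao.insert(Arrays.asList(first, second));
    }
}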

Friday, February 12, 2016

Too many open files

If you are working with an application that runs on a Linux-based OS and does a lot of I/O, you may have encountered the error "Too many open files (24)".

What is this error
On Linux-based operating systems, resource limits are specified per user and per process to ensure fair usage of resources and for security reasons. If a user or process tries to exceed a specified limit, the OS prevents it.

How to see these limits
Using the ulimit command we can examine these parameters for the current shell session.

[user@localhost ~]$ ulimit -a
core file size           (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 7281
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 4096
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
To see these details for a specific process, we can look under the /proc folder using the relevant process id.
[user@localhost ~]$ sudo cat /proc/989/limits
Limit                     Soft Limit           Hard Limit           Units
Max cpu time              unlimited            unlimited            seconds
Max file size             unlimited            unlimited            bytes
Max data size             unlimited            unlimited            bytes
Max stack size            8388608              unlimited            bytes
Max core file size        0                    unlimited            bytes
Max resident set          unlimited            unlimited            bytes
Max processes             7281                 7281                 processes
Max open files            1024                 4096                 files
Max locked memory         65536                65536                bytes
Max address space         unlimited            unlimited            bytes
Max file locks            unlimited            unlimited            locks
Max pending signals       7281                 7281                 signals
Max msgqueue size         819200               819200               bytes
Max nice priority         0                    0
Max realtime priority     0                    0
Max realtime timeout      unlimited            unlimited            us
To see the number of file descriptors currently used by a process:

[user@localhost ~]$ sudo ls -al /proc/1181/fd | wc -l
17

The lsof command sometimes does not give an accurate count, because it lists every file associated with the process, including memory-mapped files such as .so libraries.
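
The same count can also be taken from inside a Java application. A minimal sketch, assuming a Linux system where /proc/self/fd is available:

import java.io.File;

// Counts the file descriptors currently open in this JVM process (Linux only).
public class OpenFileCount {

    public static void main(String[] args) {
        File fdDir = new File("/proc/self/fd");
        String[] fds = fdDir.list();
        if (fds == null) {
            System.out.println("/proc/self/fd is not available on this system");
            return;
        }
        System.out.println("Open file descriptors: " + fds.length);
    }
}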

How to increase these limits

At the OS level

Add an entry like the following to /etc/sysctl.conf (fs.file-max sets the system-wide maximum number of open file handles) and reload the kernel variables.
[user@localhost ~]$ cat /etc/sysctl.conf
# System default settings live in /usr/lib/sysctl.d/00-system.conf.
# To override those settings, enter new settings here, or in an /etc/sysctl.d/<name>.conf file
#
# For more information, see sysctl.conf(5) and sysctl.d(5).
fs.file-max=20000
[user@localhost ~]$ sudo sysctl -p
[sudo] password for user:
[user@localhost ~]$ sysctl fs.file-max
fs.file-max = 20000
At the user level, add the new configuration to the /etc/security/limits.conf file using the following format (the soft limit must not exceed the hard limit).
user       soft    nofile   8000
user       hard    nofile   10000


Thursday, February 11, 2016

Java trust store and key store

To secure communication over the internet we can use Secure Sockets Layer (SSL), which relies on public-key cryptography. Public-key cryptography is based on the concept of a key pair consisting of two keys: data encrypted with one key can be decrypted only with the corresponding other key.

Keytool Program
The keytool program is a security tool included in the bin directory of the JDK. It can be used to manage public-key cryptography key pairs and certificate databases. There are mainly two types of databases.

KeyStore
Used when a Java program acts as a server. It contains private keys and their corresponding certificates (a keystore can be created with keytool, as sketched after these definitions).

TrustStore
Contains trusted public certificates and is used when the program acts as a client. When a server presents its certificate, the client checks whether it can trust it, that is, whether it was signed by a trusted Certificate Authority whose public certificate is present in the client's trust store.
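
When the Java program is the server side, its keystore can be created with the same keytool program. A minimal sketch; the alias, key size, and file name below are just examples:

keytool -genkeypair -alias myserver -keyalg RSA -keysize 2048 -validity 365 -keystore KeyStore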

Creating a trust store with CA certificates
[user@localhost certificates]$ ls -al
total 8
-rw-rw-r--. 1 user user 1360 Feb 11  2016 DigiCertGlobalRootCA.crt
-rw-rw-r--. 1 user user  806 Apr 29  2009 ThawteServerCA.cer
[user@localhost certificates]$ keytool  -import -file DigiCertGlobalRootCA.crt -alias DigiCertGlobalRootCA -keystore TrustStore
Enter keystore password:
Re-enter new password:
Owner: CN=DigiCert Global Root CA, OU=www.digicert.com, O=DigiCert Inc, C=US
Issuer: CN=DigiCert Global Root CA, OU=www.digicert.com, O=DigiCert Inc, C=US
Serial number: 83be056904246b1a1756ac95991c74a
Valid from: Fri Nov 10 05:30:00 IST 2006 until: Mon Nov 10 05:30:00 IST 2031
Certificate fingerprints:
         MD5:  79:E4:A9:84:0D:7D:3A:96:D7:C0:4F:E2:43:4C:89:2E
         SHA1: A8:98:5D:3A:65:E5:E5:C4:B2:D7:D6:6D:40:C6:DD:2F:B1:9C:54:36
         SHA256: 43:48:A0:E9:44:4C:78:CB:26:5E:05:8D:5E:89:44:B4:D8:4F:96:62:BD:26:DB:25:7F:89:34:A4:43:C7:01:61
         Signature algorithm name: SHA1withRSA
         Version: 3

Extensions:

#1: ObjectId: 2.5.29.35 Criticality=false
AuthorityKeyIdentifier [
KeyIdentifier [
0000: 03 DE 50 35 56 D1 4C BB   66 F0 A3 E2 1B 1B C3 97  ..P5V.L.f.......
0010: B2 3D D1 55                                        .=.U
]
]

#2: ObjectId: 2.5.29.19 Criticality=true
BasicConstraints:[
  CA:true
  PathLen:2147483647
]

#3: ObjectId: 2.5.29.15 Criticality=true
KeyUsage [
  DigitalSignature
  Key_CertSign
  Crl_Sign
]

#4: ObjectId: 2.5.29.14 Criticality=false
SubjectKeyIdentifier [
KeyIdentifier [
0000: 03 DE 50 35 56 D1 4C BB   66 F0 A3 E2 1B 1B C3 97  ..P5V.L.f.......
0010: B2 3D D1 55                                        .=.U
]
]

Trust this certificate? [no]:  yes
Certificate was added to keystore
[user@localhost certificates]$ keytool  -import -file ThawteServerCA.cer -alias ThawteServerCA -keystore TrustStore
Enter keystore password:
Owner: EMAILADDRESS=server-certs@thawte.com, CN=Thawte Server CA, OU=Certification Services Division, O=Thawte Consulting cc, L=Cape Town, ST=Western Cape, C=ZA
Issuer: EMAILADDRESS=server-certs@thawte.com, CN=Thawte Server CA, OU=Certification Services Division, O=Thawte Consulting cc, L=Cape Town, ST=Western Cape, C=ZA
Serial number: 34a4fff630af4ca53c331742a1946675
Valid from: Thu Aug 01 06:30:00 IST 1996 until: Sat Jan 02 05:29:59 IST 2021
Certificate fingerprints:
         MD5:  EE:FE:61:69:65:6E:F8:9C:C6:2A:F4:D7:2B:63:EF:A2
         SHA1: 9F:AD:91:A6:CE:6A:C6:C5:00:47:C4:4E:C9:D4:A5:0D:92:D8:49:79
         SHA256: 87:C6:78:BF:B8:B2:5F:38:F7:E9:7B:33:69:56:BB:CF:14:4B:BA:CA:A5:36:47:E6:1A:23:25:BC:10:55:31:6B
         Signature algorithm name: SHA1withRSA
         Version: 3

Extensions:

#1: ObjectId: 2.5.29.19 Criticality=true
BasicConstraints:[
  CA:true
  PathLen:2147483647
]

Trust this certificate? [no]:  yes
Certificate was added to keystore
[user@localhost certificates]$ keytool  -list -keystore TrustStore
Enter keystore password:

Keystore type: JKS
Keystore provider: SUN

Your keystore contains 2 entries

digicertglobalrootca, Feb 11, 2016, trustedCertEntry,
Certificate fingerprint (SHA1): A8:98:5D:3A:65:E5:E5:C4:B2:D7:D6:6D:40:C6:DD:2F:B1:9C:54:36
thawteserverca, Feb 11, 2016, trustedCertEntry,
Certificate fingerprint (SHA1): 9F:AD:91:A6:CE:6A:C6:C5:00:47:C4:4E:C9:D4:A5:0D:92:D8:49:79
[user@localhost certificates]$
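
A Java client can be pointed at this trust store through the standard javax.net.ssl system properties. The sketch below is only illustrative; the trust store path, password, and URL are placeholders:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import javax.net.ssl.HttpsURLConnection;

// Uses the custom trust store for all SSL connections made by this JVM.
public class TrustStoreClient {

    public static void main(String[] args) throws Exception {
        System.setProperty("javax.net.ssl.trustStore", "/home/user/certificates/TrustStore");
        System.setProperty("javax.net.ssl.trustStorePassword", "changeit");

        HttpsURLConnection connection =
                (HttpsURLConnection) new URL("https://www.example.com/").openConnection();
        try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(connection.getInputStream()))) {
            System.out.println(reader.readLine());
        }
    }
}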

Monday, August 17, 2015

Filter by package and type

This is an attempt to filter classes by package and type at run time. It finally succeeded with the help of online resources and several attempts.

import java.io.File;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Level;
import java.util.logging.Logger;

public class ClassCheck {

    private static final Logger LOGGER = Logger.getLogger(ClassCheck.class.getName());
    private static final char DOT = '.';
    private static final char SLASH = '/';
    private static final String CLASS_SUFFIX = ".class";
    private static final String BAD_PACKAGE_ERROR = "Unable to get resources from path '%s'. Are you sure the package '%s' exists?";
    private ClassCheck() {}

    /**
     * @param scannedPackage package to be scanned
     * @param type           optional types; when supplied, only classes assignable to one of them are returned
     * @return list of classes found in the package
     */
    public static List<Class<?>> find(String scannedPackage, Class... type) {

        URL scannedUrl = Thread.currentThread().getContextClassLoader().getResource(scannedPackage.replace(DOT, SLASH));
        if (scannedUrl == null) {
            throw new IllegalArgumentException(String.format(BAD_PACKAGE_ERROR, scannedPackage.replace(DOT, SLASH), scannedPackage));
        }
        File scannedDir = new File(scannedUrl.getFile());
        List<Class<?>> classes = new ArrayList<>();
        for (File file : scannedDir.listFiles()) {
            classes.addAll(find(file, scannedPackage, type));
        }
        return classes;
    }

    public static List<Class<?>> find(File file, String scannedPackage, Class... type) {
        List<Class<?>> classes = new ArrayList<>();
        String resource = scannedPackage + DOT + file.getName();
        if (file.isDirectory()) {
            for (File child : file.listFiles()) {
                classes.addAll(find(child, resource, type));
            }
        } else if (resource.endsWith(CLASS_SUFFIX)) {
            int endIndex = resource.length() - CLASS_SUFFIX.length();
            String className = resource.substring(0, endIndex);
            try {
                if (type.length > 0) {
                    for (Class c : type) {
                        Class toCheck = Class.forName(className);
                        if (c!=toCheck && c.isAssignableFrom(toCheck)) {
                            classes.add(toCheck);
                            break;
                        }
                    }
                } else {
                    classes.add(Class.forName(className));
                }
            } catch (ClassNotFoundException e) {
                LOGGER.log(Level.WARNING, "Class not found", e);
            }
        }
        return classes;
    }
}
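
Usage is straightforward. A small sketch, assuming a package com.example.tasks containing some Runnable implementations (both names are placeholders):

import java.util.List;

public class ClassCheckDemo {

    public static void main(String[] args) {
        // Find every class under com.example.tasks that implements Runnable.
        List<Class<?>> tasks = ClassCheck.find("com.example.tasks", Runnable.class);
        for (Class<?> task : tasks) {
            System.out.println(task.getName());
        }
    }
}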

CHEF Server with jclouds

CHEF solo is a simple standalone application that can be used to manage a single instance. But if you need to manage multiple instances and want better control over an organisation's cookbooks and data bags, CHEF server is a good choice.

Environment

CHEF server
An unlicensed version of CHEF server can be downloaded from (link) their site and supports up to 25 nodes. There is also a hosted service provided by CHEF that gives access to one of their servers by registering at manage.chef.io, which is what is used in this article.

jclouds
When it comes to controlling a CHEF server from Java, jclouds is a good choice, and it also provides a number of adapters for communicating with other cloud services. A private key is needed to authenticate to the CHEF server.

Activity

In this example a Java client connects to the CHEF server and lists the available data bags.

    import com.google.common.base.Charsets;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.Set;
    import java.util.logging.Level;
    import java.util.logging.Logger;
    import org.jclouds.ContextBuilder;
    import org.jclouds.enterprisechef.EnterpriseChefApi;

    public class Main {

        public static void main(String[] args) {

            String client = "ubuntu";
            String organization = "ubuntu_sl";
            String pemFile = System.getProperty("user.home") + "/.chef/" + client + ".pem";
            String credential = null;
            EnterpriseChefApi chefApi;
            try {
                credential = new String(Files.readAllBytes(Paths.get(pemFile)), Charsets.UTF_8);
            } catch (IOException ex) {
                Logger.getLogger(Main.class.getName()).log(Level.SEVERE, null, ex);
            }
            chefApi = ContextBuilder.newBuilder("enterprisechef")
                    .endpoint("https://api.opscode.com/organizations/" + organization)
                    .credentials(client, credential).buildApi(EnterpriseChefApi.class);
            Set<String> databags = chefApi.listDatabags();
            for (String databag : databags) {
                System.out.println(" ******************** " + databag);
            }
            try {
                chefApi.close();
            } catch (IOException ex) {
                Logger.getLogger(Main.class.getName()).log(Level.SEVERE, null, ex);
            }
        }
    }

Sunday, July 26, 2015

Getting started with CHEF

CHEF is a configuration management tool that comes in handy when managing thousands of software components across production and development environments. It is similar in functionality to Puppet but takes a more procedural approach to getting the job done. This example uses CHEF solo.

CHEF glossary

CHEF cookbook
        An artifact that contains everything needed to complete a scenario, including recipes, data bags, templates, and dependencies on other cookbooks

CHEF Server
        Contains all the cookbooks and data bags and acts as the central management hub

CHEF Client
        Acts as the agent of the CHEF server on each node

WorkStation
        Where the user communicates with the CHEF server

Data Bag
        A JSON data structure used to store data

Environment setup

This example uses a plain Ubuntu Server 14.04.2 installation. To install CHEF solo:
        root@ubuntu2:~# sudo apt-get install chef

Configuring CHEF solo
The default configuration file for chef-solo is /etc/chef/solo.rb
    CHEF solo.rb
        checksum_path "/var/chef/checksums"
        cookbook_path [
            "/var/chef/cookbooks",
            "/var/chef/site-cookbooks"
        ]
        data_bag_path "/var/chef/data_bags"
        environment_path "/var/chef/environments"
        file_backup_path "/var/chef/backup"
        file_cache_path "/var/chef/cache"
        role_path "/var/chef/roles"
        log_level :debug
        log_location "/var/chef/logs/chef.log"

Creating a cookbook
Go to the site-cookbooks folder and create a directory called example with the following file structure under it

        root@ubuntu2:/var/chef/site-cookbooks/example# tree
            .
            +-- recipes
            |   +-- default.rb
            +-- templates
                +-- default
                    +-- apache.conf.erb

Now we need to enter the execution instructions for our cookbook, which will create a configuration file using a template and a data bag
        default.rb
        item = data_bag_item("config", data_bag("config").last)

        template '/tmp/apache.conf' do
            source 'apache.conf.erb'
            owner 'root'
            group 'root'
            mode '644'
            variables(:config => item)
        end

Then we need to create the template file for the recipe
        apache.conf.erb
        <VirtualHost <%= @config['ip']%>:<%= @config['port']%>>
            DocumentRoot /www/<%= @config['folder']%>
            ServerName <%= @config['hostName']%>
        </VirtualHost>

In the data bag directory, create a directory called config and two JSON files under it
        root@ubuntu2:/var/chef/data_bags# tree
            .
            +-- config
                +-- config1.json
                +-- config2.json
      
        ex :
            {
              "id": "config1",
              "ip": "10.10.10.1",
              "port": "8881",
              "folder": "sample1",
              "hostName": "host1"
            }
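
The recipe picks whichever item name data_bag("config") returns last. A hypothetical config2.json with the same shape (all values are placeholders):

            {
              "id": "config2",
              "ip": "10.10.10.2",
              "port": "8882",
              "folder": "sample2",
              "hostName": "host2"
            }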

Now we can run our cookbook by typing chef-solo -o 'recipe[example]'