Wednesday, 29 August 2018

Git Workflow - Sample and Git Command Reference - Basics

Sample git workflow:

1) Checkout a new branch from "dev" named after the story you will work on. Branch should be named "feature/{storyNumber}-{story-name}"
2) Perform code changes in small commits. Refer to related story and tasks in the commit messages.
3) When finished (and locally tested), pull and merge again from branch "dev" into your feature branch. Fix potential conflicts locally.
4) After making sure all is good to go, let team members know during the next stand-up. Push your branch and submit a pull request so that your branch gets merged back into the "dev" branch.

At a later stage, the dev branch will be merged into master and deployed.



git add . // to add newly created / modified files

git commit -m "<commit message>"

git pull origin <branch_name>

git push origin <branch_name>





git status -> Shows the current status of the branch you are working on.




git checkout <fileName> discards the local changes to that file

git checkout -f ---> force checkout, discarding all local modifications



git stash ----> stashes your uncommitted changes away in a separate location

git stash list --> shows all the stashed change sets

git commit -a -m "message" - single command to stage modified/deleted tracked files and commit with a message (newly created untracked files still need git add)


To create a branch

  - git branch <branch_name>
 
  To push a local branch to the remote
 
  - git push -u origin <branch_name>
 
 
  You can also use git flow to create a branch
 
   - git flow init
 
      Command to create a branch using git flow
      git flow feature start <name of the branch>
 
  git flow feature publish // to publish the branch to remote
 


  To discard all local changes -> git checkout -f
 







To merge the dev branch into the master branch

- Verify you are on the dev branch

1) git checkout master

2) git merge dev

3) git push

4) git checkout dev (to switch back to the dev branch)

Monday, 27 August 2018

JUnit test case sample code snippet to perform save/persist using JPA

Following is the JUnit sample code snippet to perform save/persist on the database using JPA

 @Test
    public void addApplication() throws Exception {

       // Constructing mock Json for post operation
        String mockApplicationJson = "{\"name\":\"Test Application\",\"description\":\"A test application.\",\"owner\":\"Kotresh Matada\"}";

        //Create a post request with an accept header for application/json
        RequestBuilder requestBuilder = MockMvcRequestBuilders
                .post("/tza/application/")
                .accept(MediaType.APPLICATION_JSON).content(mockApplicationJson)
                .contentType(MediaType.APPLICATION_JSON);

        MvcResult result = mockMvc.perform(requestBuilder).andReturn();

        MockHttpServletResponse response = result.getResponse();

        //Assert that the return status is CREATED
        assertEquals(HttpStatus.CREATED.value(), response.getStatus());

        //Get the location from response header and assert that it contains the URI of the created resource
        assertEquals("http://localhost/application/1",
                response.getHeader(HttpHeaders.LOCATION));
    }

Sunday, 26 August 2018

Aspect Oriented Programming(AOP) - Spring - Basics

Aspects
 - Reusable blocks of code that are injected into your application at runtime
 - Powerful tools for adding behavior
 - Solve cross-cutting concerns in one place

Common Applications of Aspects
  - Logging
  - Transaction management
  - Caching
  - Security

What is a Cross-Cutting Concern?

   - Evaluate business requirements and look for words like every or always.
   - Look for system requirements that apply to multiple business requirements.

 When thinking about the term cross-cutting concern we need to look at what it really means.

The first place that we want to look at is a set of business requirements that have words like every or always. These tend to indicate that we have a simple set of requirements that applies to many different use cases. Another place, and one that I find more often than not can be solved with Aspecting, is system level requirements. If we have a system level requirement that says any time a user logs in, log all of the actions that they perform, or things like every time we execute a database method, we want specific logging written out to our system logs.

These are cross-cutting concerns that apply to system requirements and these are great places to solve that concern with an Aspect. Now, I just talked a little bit about that logging routine, but let's talk a little bit more about why that logging routine becomes so important. If we were to write a logging routine that executes every time data is accessed from the database, we would have to copy and paste that essential block of code every single time we need it.

That violates the principle of don't repeat yourself. So, using Aspects removes that code duplication. Another problem with using this sort of duplicated code is that now we're mixing concerns of our application method. If we have a method whose sole purpose is to go to the database and load the customer from the database, and then we add into it the security concerns and the logging concerns, we're now mixing into our database access method other concerns that don't apply to database access itself.

Why do we need Aspects?

 - Imagine a logging routine that applies to every service method.
 - To avoid code duplication - don't repeat yourself (DRY)
 - Mixing of concerns is not best practice
 - This helps to keep the application logic clean and maintainable

Spring Aspects
  - Leverages AspectJ for aspecting
  - Bytecode modification (runtime weaving)
  - Dynamic proxy based

Parts of a Spring Aspect
  - Join Point : A join point is a point in the program where execution of an Aspect is targeted. So this is your method, or your line of code, or your annotation that the Aspect is going to target.
  - Point Cut : The pointcut is the expression that identifies that join point through some sort of regular-expression-style matching.
  - Advice : The advice is the code that you actually execute at a join point that was selected by a pointcut. So the advice is your cross-cutting concern routine that we are applying to a join point in our application.
  - Aspect : An Aspect is a module that contains all of your pointcuts, as well as all of your advice, that is then injected at the run time of your application.


Define AOP pointcuts
  Pointcut Syntax : designator(r p.c.m(arg))
  r - return type
  p - package
  c - class
  m - method
  arg - 0 or more arguments

Common Designators
   execution : expression for matching method execution
                The execution pointcut is one of the most common. You could write a pointcut that evaluates method names or patterns to execute your advice - for instance, execute the advice for all methods in the data package that start with get and take any number of args.
 
   within : expression for matching within certain types
            You could use the within designator to specify that you want your advice applied to all types within a certain package, for instance, com.google.common.service.

   target : expression for matching a specific type
            The target designator allows you to say: apply some advice to a specific type, say, the customer service class.

   @annotation : expression for matching a specific annotation (see the aspect sketch below)
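
As a minimal sketch (assuming the AspectJ annotations and Spring AOP are on the classpath; the package and class names here are hypothetical), an aspect combining a pointcut and advice might look like this:

import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.aspectj.lang.annotation.Pointcut;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class LoggingAspect {

    // Pointcut: any method named get* in the (hypothetical) com.example.data package,
    // with any return type and any number of arguments
    @Pointcut("execution(* com.example.data.*.get*(..))")
    public void dataGetters() {}

    // Advice: the cross-cutting logging routine executed at each matched join point
    @Before("dataGetters()")
    public void logBefore(JoinPoint joinPoint) {
        System.out.println("About to execute: " + joinPoint.getSignature());
    }
}

With Java configuration, aspect support is typically switched on by adding @EnableAspectJAutoProxy to a @Configuration class.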

Saturday, 25 August 2018

Spring Life Cycle

First of all, thank you to Frank P Moley (https://www.lynda.com/Frank-P-Moley/) for his explanation of the Spring life cycle. I am trying to document the information for my reference and for whoever else it helps.

Spring Life Cycle - has the following 3 phases
  1) Initialization phase
  2) Use phase
  3) Destruction phase

There are three distinct phases of a Spring lifecycle. The first is the initialization phase. Most of the work that we do within the Spring framework, through the configuration of the application context directly impacts the initialization phase.

The use phase is actually where your application spends most of its time within the lifecycle itself. For a typical application, over 99% of its time is actually in the use phase of the Spring lifecycle. It really is a very simple case of IoC. The destruction phase is entered when you invoke close on the application context.

Initialization Phase :

  - Begins with creation of the ApplicationContext
  - BeanFactory initialization phase
  - Bean initialization and instantiation

The initialization phase of a Spring application's lifecycle begins with the creation of the ApplicationContext. When you're running a web application, the ApplicationContext is actually started as part of the Servlet initialization phase itself. The Servlet calls in and starts the ApplicationContext, and indeed the ServletContext is what actually has the handle to the ApplicationContext. The first part of the initialization phase, after the ApplicationContext starts, is the BeanFactory initialization phase.

But know that the ApplicationContext, again, is the wrapper for the BeanFactory, but the BeanFactory itself has to be brought up correctly. And that is done during the BeanFactory initialization phase of the Spring Bean lifecycle. After the BeanFactory is primed and ready to run we'll get into individual bean initialization and instantiation operations. There's all kinds of places here that we can impact the behavior of an individual bean and during the initialization phase most if not all of those are set.


The first thing that happens, is as I mentioned, the BeanFactory is loaded. And that BeanFactory initialization occurs in two distinct points.

The bean definitions themselves are loaded into the BeanFactory through their metadata and then there's a process by which we actually do post-processing on the bean definitions themselves, called BeanFactory post-processing. Once the BeanFactory itself is initialized and ready to be used, then we iterate through all of the beans in that factory and go through a process that occurs on each individual bean. Each bean is instantiated, set with appropriate setters, bean post-processors are then executed that happen before the initializer.

The initializer itself occurs. And then bean post-processors that happen after the initializer.  But once this whole lifecycle is executed and ready to run, we then proceed into the use phase of our application.

The initialization phase is again divided as follows

1) Loading Bean definitions: 

  •  The bean definitions are loaded into the BeanFactory from all sources, i.e. Java configuration, XML configuration, and component scanning/autoconfiguration.
  •  The bean id is used to create the index for the bean in the bean factory.
  •  The bean factory at this point only contains references; no class has been instantiated yet.

2) Init : Post-process bean definitions / BeanFactory post-processors

    - Perform work on the entire BeanFactory
    - Can modify or transform any bean in the factory prior to instantiation
    - The most familiar example is the "PropertySourcesPlaceholderConfigurer" :-
      The PropertySourcesPlaceholderConfigurer takes property files, parses them, and injects the property values into the bean before it's ever instantiated

 3) Bean Instantiation

   - Beans are instantiated in the factory using constructors.
   - Done in the correct order to ensure dependencies are created first.
   - Handle to class instance remains in the bean factory for the lifecycle  of the application for singletons
   - By default, all beans are instantiated eagerly
   - In order for a bean to be lazily instantiated, it not only needs to be annotated as lazy, but it also has to have nothing else in the bean factory that uses it as a dependency. If a bean is a dependency of another bean, it cannot be lazily instantiated.
   - Beans can be marked as lazy, but the ApplicationContext reserves the right to ignore that and instantiate them eagerly anyway
   - When this phase is completed, the bean pointer is still referenced in the BeanFactory; objects have been constructed but they are not available for use yet.

4) Setters

 - Once all of the beans have been instantiated and all of the required dependencies injected through those constructor-level injections, we now have fully qualified classes ready to be used. However, Spring then goes through each of those classes and modifies them through setter injection.
- Setters are called
- Autowiring occurs (non-constructor based).
- Once this phase completes, beans are fully initialized and all the dependencies are injected, but the beans are still not ready for use.

5) Bean Post-Processing

  - Final point of configuration manipulation
  - Each bean may have additional behaviors added
  - Two types of extensible and generic processing : before and after initializer
  - Methods annotated with @PostConstruct are called here
  - There is also a BeanPostProcessor interface, and this interface allows you to inject common behavior into a specific bean or a class of beans. Now, it's important to note that, even if you're dealing with a class of beans, you're still operating on each bean individually. There are two types of methods that you can have: before initializer and after initializer, also known as pre-init and post-init methods. The framework itself also leverages a lot of these BeanPostProcessors (see the sketch after this list).

  - When this phase is complete, you will have beans that are fully instantiated and initialized. All of the dependencies will be injected and all behavior will be added to each and every bean. At this point, all of our beans are actually ready to use.
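
A minimal sketch of these two hooks (the class and bean names are hypothetical; assumes the javax.annotation and Spring beans APIs used here with Spring 5):

import javax.annotation.PostConstruct;

import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.stereotype.Component;

@Component
public class CustomerService {

    @PostConstruct
    public void init() {
        // initializer callback: runs after dependency injection, during bean post-processing
        System.out.println("CustomerService initialized");
    }
}

@Component
class LoggingBeanPostProcessor implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
        // pre-init hook: called for every bean before its initializer
        return bean;
    }

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        // post-init hook: called for every bean after its initializer
        return bean;
    }
}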


The Initialization phase : Differences based on configuration types
 
  Java Configuration
   - Instantiation and setters are merged
   - Each method with @Bean annotation is executed in proper order
  
  AutoConfiguration
   - Instantiation of all beans scanned
   - @Autowired constructors
   - Then, during setter injection, @Autowired setters and fields
    
  XML Configuration
    - Instantiation of all beans and constructor arg injection
    - Property injection


Use Phase

 - Most of the time is spent in this phase
 - ApplicationContext serves proxies to the original class
 - ApplicationContext maintains a handle to each bean (singleton)
 - Spring provides the ApplicationContextAware interface - it gives your class a handle to the ApplicationContext; not a very common interface to use, but available during the use phase (a sketch follows below)
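
A minimal sketch of a bean using ApplicationContextAware (the class name is hypothetical):

import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.stereotype.Component;

@Component
public class ContextInspector implements ApplicationContextAware {

    private ApplicationContext applicationContext;

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        // Spring calls this during initialization, handing the bean the context it lives in
        this.applicationContext = applicationContext;
    }

    public String[] listBeanNames() {
        // example use during the use phase
        return applicationContext.getBeanDefinitionNames();
    }
}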


Destruction Phase

 -The destruction phase in a Spring application begins when close is called on the application context. Now, this can either be a manual call to close or when the framework in which the application itself is running calls close. As the application context itself is starting to close, any method that is annotated with @PreDestroy or the corresponding XML notation for a PreDestroy method, that method itself is executed at that point.
 - An important thing to note is that the beans themselves are not destroyed at this point. In the Java world, the only thing that can destroy a class instance is the garbage collector, so calling close on the application context makes every bean contained within it go out of scope and allows it to be garbage collected during normal processing.

When close is called, the application context goes out of scope and all handles to it are released. Prototype beans are not impacted by the destruction phase either. Because a prototype bean no longer is handled by the application context once it's constructed, they go out of scope immediately when the application class itself no longer needs them. And as I previously mentioned, garbage collection is the only thing that can destroy an instance of a bean.
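
A minimal sketch of a @PreDestroy hook and of triggering the destruction phase by closing the context (the class names are hypothetical):

import javax.annotation.PreDestroy;

import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.stereotype.Component;

@Component
public class ConnectionHolder {

    @PreDestroy
    public void release() {
        // runs when close() is called on the ApplicationContext,
        // before the bean goes out of scope and becomes eligible for garbage collection
        System.out.println("Releasing connections");
    }
}

class DestructionDemo {
    public static void main(String[] args) {
        AnnotationConfigApplicationContext context =
                new AnnotationConfigApplicationContext(ConnectionHolder.class);
        // ... use phase ...
        context.close(); // begins the destruction phase; @PreDestroy methods execute here
    }
}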

Wednesday, 22 August 2018

Java API Design / Coding Standards Resources and Blog Information

Here I am trying to document/create a repository of useful blogs about Java coding standards and good API design with Java


1) What I learned from doing 1000 code reviews  by Steven Heidel

https://stevenheidel.medium.com/what-i-learned-from-doing-1000-code-reviews-fe28d4d11c71


2) API Design with Java 8 by Per-Åke Minborg

dzone article

Basics of Spring Bean Scopes

Bean Scopes

1) Singleton
  - The default scope of every bean is singleton.
  - One instance per context definition
  - Be careful with static data

2) Prototype
  - New instance every time it is referenced
  - Definition is stored in the factory, instances are not
  - Very useful for transient data or types that flex based on application
    state

  3) Session
   - Applies to web environment only
   - One instance of bean per user session
   - Definition stored in bean factory, instance is not

  4) Request
    - Applies to web environment only
    - One instance per request
    - Definition stored in bean factory, instance is not

The example below shows configuring a bean named "Worker" with prototype scope

        @Bean
        @Scope("prototype")
        public Worker worker() {
            return new Worker(greetingText);
        }
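
As a usage sketch (assuming the worker() bean above is declared in a hypothetical AppConfig @Configuration class), each lookup of a prototype bean returns a new instance:

import org.springframework.context.annotation.AnnotationConfigApplicationContext;

public class ScopeDemo {
    public static void main(String[] args) {
        AnnotationConfigApplicationContext context =
                new AnnotationConfigApplicationContext(AppConfig.class);

        Worker w1 = context.getBean(Worker.class);
        Worker w2 = context.getBean(Worker.class);

        // prototype scope: each getBean() call returns a new instance
        System.out.println(w1 == w2); // prints false

        context.close();
    }
}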

Tuesday, 21 August 2018

Spring Introduction

Spring
 
     - It is an open source framework for writing Java applications for both Enterprise and internet application development
 
      - Lightweight framework : you only need to bring onto the class path those jars that you're actually going to leverage.

      - No need for a heavy application server: there is no need to run your Java application on a heavy application server. You don't need WebLogic or WebSphere to run a Spring application because Tomcat or Jetty will provide everything that you need.


Inversion Of Control

       - Container maintains your class dependencies.
       - Objects injected at run time, not compile time.
       - An object accepts all of the dependencies for construction instead of constructing them itself. In doing this, you actually manage all of the construction of dependency objects in a single place. This reduces the amount of code that you have to write, and it also reduces the replication of code (see the sketch below).
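
A minimal constructor-injection sketch (the service and repository types are hypothetical):

import org.springframework.stereotype.Repository;
import org.springframework.stereotype.Service;

interface OrderRepository {
}

@Repository
class JdbcOrderRepository implements OrderRepository {
    // hypothetical data-access implementation
}

@Service
public class OrderService {

    private final OrderRepository orderRepository;

    // The container constructs the OrderRepository bean and injects it here at runtime;
    // OrderService never constructs its own dependencies.
    public OrderService(OrderRepository orderRepository) {
        this.orderRepository = orderRepository;
    }
}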

Advantages of IOC

     - Reduction of noise in your code :  When dealing with dependency injection, because I'm not copy and pasting that construction code over and over, my code can focus on the business logic and not all of this construction noise.
     - Reduces object coupling : Since an object that I am creating doesn't need to know how to create all of its dependencies, that coupling is dramatically reduced.


Application Context

 The Application Context is the heart of an application leveraging the Spring Framework.

Application Context is a read-only wrapper of  BeanFactory, and all of your run time interactions with the BeanFactory or any beans contained in that factory is through this Application Context. The Application Context provides the metadata for all beans created, and also provides a mechanism for creating beans in the correct order. The Application Context is the Inversion of Control Container and all of your Dependency Injections occur here.

    - Provides all facilities for injection of beans at startup and run time.
    - Most of utilizing Spring is actually configuring the IoC container.
    - Application context handles all singleton beans.
    - A Spring application can have one or more ApplicationContexts
    - Web containers always have multiple
     - A child context can see beans in its parent context, but not the other way around
   
Major Benefits of  Java Configuration
      
      - Native language syntax.
      - Compile time checking of configuration.
      - Easier IDE integration.

- Java configuration stems from a class annotated with @Configuration.
- Beans are created as methods of the class, where the method is annotated with @Bean
- Constant values can be injected with @Value (see the sketch below)
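
A minimal sketch of such a configuration class (the bean type and property name are hypothetical):

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// hypothetical bean type
class Greeter {
    private final String greetingText;
    Greeter(String greetingText) { this.greetingText = greetingText; }
    String greet() { return greetingText; }
}

@Configuration
public class AppConfig {

    // constant value injected with @Value; "Hello" is the fallback if the property is not set
    @Value("${app.greeting:Hello}")
    private String greetingText;

    // each @Bean method defines one bean; the method name becomes the bean id
    @Bean
    public Greeter greeter() {
        return new Greeter(greetingText);
    }

    public static void main(String[] args) {
        // bootstrapping the IoC container from this configuration class
        AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(AppConfig.class);
        System.out.println(context.getBean(Greeter.class).greet());
        context.close();
    }
}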


To import multiple configurations into a single configuration class:

Step 1 : Import the other config class using the @Import annotation. Ex: @Import(DataConfig.class)
Step 2 : Autowire the dependencies using the @Autowired annotation
           Ex:
                 @Autowired
                 private CustomerRepository customerRepository;

                 @Autowired
                 private SalesOrderRepository salesOrderRepository;

  Bean definitions of SalesOrderRepository and CustomerRepository are defined in the DataConfig.java configuration class (see the sketch below).
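
A minimal sketch of how the two configuration classes fit together (the repository types are simplified placeholders; only the @Import/@Autowired wiring matters here):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;

// simplified placeholder repository types
class CustomerRepository {}
class SalesOrderRepository {}

@Configuration
class DataConfig {

    @Bean
    public CustomerRepository customerRepository() {
        return new CustomerRepository();
    }

    @Bean
    public SalesOrderRepository salesOrderRepository() {
        return new SalesOrderRepository();
    }
}

@Configuration
@Import(DataConfig.class)   // pulls DataConfig's bean definitions into this configuration
public class ApplicationConfig {

    @Autowired
    private CustomerRepository customerRepository;

    @Autowired
    private SalesOrderRepository salesOrderRepository;

    // beans defined here can now be built from the imported repositories
}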


Monday, 13 August 2018

Apache Maven Build Tool Learnings - Java

How to separate unit and integration tests?

- maven-surefire-plugin : designed to handle unit tests (JUnit)
- maven-failsafe-plugin : designed to handle integration tests


           Reference blog for complete information:
           How to separate integration tests from unit tests




Following is the process to run Unit and Integration Tests separately

1) Under the properties section of the pom.xml file, define the following properties
<skipTests>false</skipTests>
<skipITs>${skipTests}</skipITs>
<skipUTs>${skipTests}</skipUTs>

2) Under the build section, add the maven-failsafe-plugin (for integration tests) and the maven-surefire-plugin (for unit tests) as follows

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>2.13</version>
    <configuration>
        <skipTests>${skipTests}</skipTests>
        <skipITs>${skipITs}</skipITs>
    </configuration>
    <executions>
        <execution>
            <id>failsafe-integration-tests</id>
            <phase>integration-test</phase>
            <goals>
                <goal>integration-test</goal>
                <goal>verify</goal>
            </goals>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <trimStackTrace>false</trimStackTrace>
        <skipTests>${skipUTs}</skipTests>
    </configuration>
</plugin>

3) Below are the commands to skip/run the tests separately

i) mvn install -DskipUTs : skips unit tests
ii) mvn install -DskipITs : skips integration tests
iii) mvn install -DskipTests : skips both unit and integration tests

4) We need to name the integration tests as per the maven-failsafe-plugin naming conventions, for example *IT.java




Following are the steps to skip tests by default while keeping the ability to re-enable them from the command line with the surefire plugin.

1) In the properties section of the pom.xml, add a property

<skipTests>true</skipTests>

2) Modify the maven-surefire-plugin under the build section as follows

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <trimStackTrace>false</trimStackTrace>
        <includes>
            <include>**/*Test.java</include>
        </includes>
        <excludes>
            <exclude>**/*IntegrationTest.java</exclude>
            <exclude>**/*IT.java</exclude>
        </excludes>
        <skipTests>${skipTests}</skipTests>
    </configuration>
</plugin>

3) When we run mvn clean install or mvn test, unit tests will not run by default

4) We can use "mvn install -DskipTests=false" to run the tests from the command line

Saturday, 11 August 2018

Read values from a properties file in a Maven project - Java

If the file is placed under target/classes after compiling, then it is already in a directory that is part of the build path. The directory src/main/resources is the Maven default directory for such resources, and it is automatically placed to the build path by the Eclipse Maven plugin (M2E). So, there is no need to move your properties file.
The other question is how to retrieve such resources. Resources on the build path are automatically on the class path of the running Java program. Considering this, you should always load such resources with a class loader.

References : Stackoverflow


Example code:
String resourceName = "myconf.properties"; // could also be a constant
ClassLoader loader = Thread.currentThread().getContextClassLoader();
Properties props = new Properties();
try(InputStream resourceStream = loader.getResourceAsStream(resourceName)) {
    props.load(resourceStream);
}
// use props here ...

Wednesday, 8 August 2018

Docker Containers v/s Virtual Machines

Docker

  - Application Delivery Technology
  - Build an application with a Docker Image
  - Ship an application with Docker Hub
  - Run an application with a Docker Container
  - Avoid a single point of failure





Docker Compose
   - Defining and running multi-container applications
   - Configuration defined in one or more files:
       docker-compose.yml (default)
       docker-compose.override.yml (default)
   - Multiple files specified using -f
   - All paths relative to the base configuration file
   - Great for dev, staging, and CI

 Docker Swarm
  - Native clustering for Docker
  - Provides a unified interface to a pool of Docker hosts
  - Fully integrated with Machine and Compose
  - Serves the standard Docker API
  - 1.2 - Ready for production
    - Reschedule containers when a node fails
    - Better node management


Differences between Docker containers and virtual machines can be measured based on operating system support, security, portability, and performance.

A comparison diagram (not reproduced here) shows the detailed differences.




References : https://www.docker.com/captains/arun-gupta