Wednesday, March 25, 2015

Understanding folder structure of a custom extension in WSO2 Enterprise Store

Many find it difficult to get familiar with the Caramel framework used by ES (Enterprise Store). This simple guide explains the basic folder structure you need to know in order to add your own custom extension to ES.

Under ES install dir/repository/deployment/server/jaggeryapps, you can find two folders named publisher and store. This is the starting point for your custom extension, where you need to decide whether the extension should go under publisher, store, or both. Say you need to add a new extension named 'et1' under publisher; go to publisher/extensions/assets and create a new folder named et1.

Next, inside your et1 folder, you need to have a folder structure as follows.
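For example, it might look like this (an illustrative sketch; the contents of each folder are described below):

et1
├── asset.js
├── apis
└── themes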

Any .jag APIs you create need to go into the 'apis' folder, and you can list those APIs in asset.js. The section most find confusing is themes, so we will look into it. A possible structure of the themes folder is shown below.
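Assuming the theme folder is named 'default' (the usual name), it might look like this:

themes
└── default
    ├── css
    ├── images
    ├── js
    ├── partials
    └── helpers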
As the folder names suggest, you need to place resources such as CSS files and images under the respective folders listed. The js folder holds any JavaScript/jQuery implementations (.js files) you require. The folder named 'partials' is where you place your view (HTML) content; these files carry the '.hbs' extension. In a .hbs file you can include another .hbs file. For example, if you have two .hbs files named createNode.hbs and createList.hbs, and you need to add the content of createNode.hbs inside createList.hbs, add the following in createList.hbs.

 {{> createNode .}}

This renders the content of createNode.hbs at that position in createList.hbs.

The next important thing is knowing how to link specific resources (images, CSS, JS) to a certain view (.hbs). This is done using files inside the 'helpers' folder. A helper file lists the sets of resource files required, as shown below. But how does ES know which helper file belongs to which .hbs view? Easy: the helper file just needs to have the same name as the relevant .hbs file. For example, say you need to link the resources needed for createNode.hbs.

1. Create a file named createNode.js under the 'helpers' folder.
2. Inside createNode.js, add your resources as below.

var resources = function (page, meta) {
    return {
        js: ['test.js', 'jquery.js'],                         // picked up from the theme's js folder
        css: ['bootstrap-select.min.css', 'createNode.css']   // picked up from the theme's css folder
    };
};


This is just the very basic idea of an extension's structure. But once you understand how to add resources to pages and include content inside pages, you are good to go!


Monday, March 23, 2015

Setting up Apache JAMES mail server

Apache JAMES is a free mail server that you can use to test email capabilities over SMTP, POP3, etc. It is quite easy to set up and use for testing purposes.

You can go through the following steps to install and configure the JAMES mail server.

1. Download Apache JAMES from http://james.apache.org/download.cgi.
2. Unzip the downloaded archive, go to the bin folder, and execute run.sh (on Linux/Mac).
3. Remember to execute the script as root as below, or you might run into permission denied exceptions.
sudo -E ./run.sh
Note: the -E (preserve environment) option preserves the JAVA_HOME variable (otherwise you might end up with a "JAVA_HOME not set" error).

4. Once the script is executed, the server starts up and you will see its startup output.

5. Next go to JAMES installation dir/apps/james/SAR-INF/config.xml and uncomment the section below.

<dnsserver>
   <servers>
      <server>127.0.0.1</server>
   </servers>
</dnsserver>

(Note: if you need to set up a different domain, set the domain name inside <servername></servername>, and add the relevant DNS servers in the section above.)

6. Open another terminal tab and type the command below.
telnet localhost 4555
Note: run.sh from the earlier step must still be running; do not close it.

7. You will now be connected to the JAMES remote administration interface.

Log in with root/root credentials; you can then use the adduser command (adduser username password) to add a new user under localhost.
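For example (the user name and password below are illustrative; listusers shows the existing accounts and quit closes the session):

adduser emma emmapass
listusers
quit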

8. Now you can use an email address such as emma@localhost (mails are received on SMTP port 25) and test your email functionality. Once you send an email, the following path will contain the mail object: JAMES dir/apps/james/var/mail/inbox/emma. To view the emails that were sent, set up an email client such as Mozilla Thunderbird. You could refer to http://kb.site5.com/email/email-software/mozilla-thunderbird/thunderbird-how-to-save-or-remove-emails-from-the-server/
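To quickly check that mails are accepted even without a mail client, you can also speak SMTP directly over telnet (assuming the emma@localhost account created above; the sender address is illustrative):

telnet localhost 25
HELO localhost
MAIL FROM:<admin@localhost>
RCPT TO:<emma@localhost>
DATA
Subject: test mail

Hello Emma
.
QUIT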

Wednesday, March 18, 2015

Correlation with a BPEL process

Correlation is basically how the BPEL engine matches an inbound message to a specific process instance. This comes in handy in use cases where you use the same id across different operations; the BPEL engine needs to work out which incoming message belongs to which process instance.

For example, say we create a purchase order through process1 and update that order through process2 using the purchase order id. In such a case, the BPEL engine needs to select the exact process instance that was created before and hand the process2 call to it. When multiple purchase orders are in flight, choosing the correct process instance can be an issue. This is where correlation comes into play.

BPEL correlation sets solve this by defining properties with aliases for one or more message types. We will look at how this can be done in a BPEL process. We are going to use WSO2 BPS (Business Process Server) to create the BPEL process, so you will need the following resources.

Prerequisites
1. WSO2 BPS
2. Eclipse (Kepler)
Follow the guides provided in the links to download and install the resources.

Once the prerequisites are met, in Eclipse go to File -> New -> Other -> BPEL 2.0 and create a new BPEL project. Next, right-click on the project, choose New -> Other -> BPEL 2.0, and create a new BPEL process file inside the created project. When this is done you will be provided with a template canvas to draw your BPEL process. Create a BPEL process as below.
Next go to the generated .wsdl file and, from the design view, add a new operation (right-click -> Add Operation) named 'update' as shown below. After that, generate a binding for the new operation by right-clicking -> Generate Binding Content.

Go back to your .bpel file and click on the first Reply step. In the Properties tab of that element, go to the Details section, select 'process' as the operation, and set the output variable as below (double-click on testResponseMessage so that the variable field is auto-updated).

Next go to the 'Receive' element and open its Properties -> Details section. Fill in the details so that the new operation 'update' is selected as the operation and updateRequest is set as the variable. On save, it will ask whether you need to initialize the variable; click yes.
Go to the final 'ReplyOutput' step, open its Properties -> Details section, and fill in the necessary details so that the final output is set to the updateResponse output of the update operation.

The above are the basic steps you would follow when creating a BPEL process: setting where the reply and receive steps should send/store the request/response. Now we move on to creating a correlation set and using it at each receive step.

1. Select the first 'ReceiveInput' element and, to your right, create a new correlation set. Once it is created, click on Add -> New. You will be presented with a dialog: set the name property to 'id', select Simple Type from the radio buttons, click Browse, and from the provided list select the type string.


2. You have now created a message property of type string. Next you need to map this message property to the relevant part of the messages arriving at your receive steps. For that you need to create property aliases. Click on New under the Aliases section as shown above. Select the 'testResponseMessage' variable and create a new alias. In the same way, create another alias for the 'updateRequest' input variable as well.

Since you have created a correlation set and the required aliases, you now need to set this correlation set on the relevant receive steps. By doing that you tell the BPEL engine to map the relevant alias value to the created correlation property, for example to map the value from the 'testResponseMessage' variable to the correlation property 'id'.

3. Select the first 'ReceiveInput' element, open 'Correlations' in the Properties tab, and click on Add. Make sure to set the 'Initiation' property to yes on this element; that is, this step initiates the correlation property in the set.
4. You also need to set the correlation set on the next 'Receive' step. Click on it, go to 'Correlations' in the Properties tab, and click on Add. Here you need to set the Initiation property to no, because this step must use the id that was already initiated at the first receive step. (A rough sketch of the markup this produces is shown below.)
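Under the hood, these designer steps translate into correlation markup in the generated WSDL and BPEL. Roughly, it looks like the following simplified sketch (namespaces omitted; the generated names and XPath queries in your project will differ):

<!-- in the WSDL: the correlation property and an alias per message -->
<vprop:property name="id" type="xsd:string"/>
<vprop:propertyAlias propertyName="tns:id" messageType="tns:testResponseMessage" part="payload">
   <vprop:query><!-- XPath to the value that identifies the instance --></vprop:query>
</vprop:propertyAlias>
<vprop:propertyAlias propertyName="tns:id" messageType="tns:updateRequest" part="payload">
   <vprop:query><!-- XPath to the same value in the update request --></vprop:query>
</vprop:propertyAlias>

<!-- in the BPEL: the correlation set, initiated at the first receive and reused at the second -->
<correlationSets>
   <correlationSet name="CorrelationSet" properties="tns:id"/>
</correlationSets>

<receive name="receiveInput" operation="process" ...>
   <correlations>
      <correlation set="CorrelationSet" initiate="yes"/>
   </correlations>
</receive>

<receive name="Receive" operation="update" variable="updateRequest">
   <correlations>
      <correlation set="CorrelationSet" initiate="no"/>
   </correlations>
</receive>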

Now that you have set the correlation properties, click on the first Assign element and go to its Details section. Here we take the input received at the first receive step and compose a message, so that you can see correlation at work when running the process.
Click on New in the Details section, and under the From property select the type Expression and enter the line below.
concat($input.payload/tns:input,"initiated process")
In the To section, select 'testResponseMessage' -> result variable.

Go to the next 'Assign1' step and, in its Details section, create a new expression similar to the above but with the following line. Here we take the output of the first operation and set it as the response of the update operation.
concat($result.payload/tns:result, "updated process") 
In the To section, select 'updateResponse' -> out variable.

Finally, right-click on your project, go to New -> Other, and add a deployment descriptor file. Select 'testPort' as the associated port as below and save.
Export this project as a .zip and upload it to the BPS BPEL list. After deploying, you can see the process in the list. Go to the 'Try It' interface and send the following values as input under the operation 'process'. You will see the following output.

input: request1    output: request1 initiated process.
input: request2    output: request2 initiated process.

Now go to the operation 'update' and provide the same inputs, which will give you the following output values.
input: request1    output: request1 updated process.
input: request2    output: request2 updated process.

By analyzing the output you can see that the BPEL engine was able to select the correct process instance for each update call. This was a simple example; you can look into more advanced examples that explain further uses of correlation in BPEL processes.

Tuesday, March 17, 2015

Decoupling tasks with an Executor Service

In many implementations you might need to run certain tasks in the background, in an asynchronous manner, so that your main flow is not held up. The java.util.concurrent.ExecutorService interface provides this asynchronous behavior and is typically backed by a thread pool.

Look at a scenario as below, where the usage of ExecutorService would be useful.
1. Customer makes payment for a specific order.
2. Payment is accepted by the bank.
3. Email notifications are sent to the customer regarding the payment acceptance.
4. Order is sent in for delivery. 

In a use case like the above, you want step 4 to proceed right after step 2. But if we do all of this in a single thread, step 4 would have to wait until step 3 is completed; imagine a scenario where notifications have to be sent to a large number of customers. Instead, we can hand the step 3 task to an executor service, which decouples that step from the original flow so that both can run in parallel, as in the sketch below.
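A minimal sketch of this idea (the class, method names, and single-thread choice below are illustrative, not taken from any particular framework):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class OrderProcessor {

    // background thread dedicated to notification emails
    private final ExecutorService notificationExecutor = Executors.newSingleThreadExecutor();

    public void processOrder(final String orderId, final String customerEmail) {
        acceptPayment(orderId);                        // step 2: must complete before delivery

        // step 3: hand the email off to the executor so it runs asynchronously
        notificationExecutor.execute(new Runnable() {
            public void run() {
                sendPaymentAcceptedMail(customerEmail);
            }
        });

        dispatchForDelivery(orderId);                  // step 4: does not wait for the email
    }

    // placeholder implementations for the sketch
    private void acceptPayment(String orderId) { /* ... */ }
    private void sendPaymentAcceptedMail(String to) { /* ... */ }
    private void dispatchForDelivery(String orderId) { /* ... */ }
}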

Basic steps of an executor service
You can create an executor service in the following ways, depending on your use case, using one of the factory methods of the Executors class.

1. ExecutorService service1 = Executors.newSingleThreadExecutor();

This creates a single thread to process the decoupled tasks. If the first task is not completed when the second task arrives, the second task is queued until the previous one is completed.

2. ExecutorService service2 = Executors.newFixedThreadPool(5);

Here you are creating a thread pool of 5 threads, and tasks delegated to this executor service can be executed by any one of these threads.

3. ScheduledExecutorService service3 = Executors.newScheduledThreadPool(5);

This executor service is similar to the above, except that it can schedule the tasks delegated to it to run after a delay or periodically, as shown below.
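For example (a small sketch, assuming java.util.concurrent.TimeUnit is imported; the delays are arbitrary and the Runnables are placeholders):

ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(5);

// run once, 10 seconds from now
scheduler.schedule(new Runnable() {
    public void run() {
        System.out.println("Delayed task");
    }
}, 10, TimeUnit.SECONDS);

// run repeatedly every 30 seconds, after an initial 5 second delay
scheduler.scheduleAtFixedRate(new Runnable() {
    public void run() {
        System.out.println("Periodic task");
    }
}, 5, 30, TimeUnit.SECONDS);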

Once you create an executor service, there are a few methods you can use to delegate tasks.
1. executorService.execute(Runnable)
This method takes a Runnable object and executes it asynchronously. For example,
 
executorService.execute(new Runnable() {
    public void run() {
        System.out.println("Asynchronous task");
    }
});
 
2. executorService.submit(Runnable)
This method is similar to the above, except that it returns a Future object, unlike the execute method. For example,

Future future = executorService.submit(new Runnable() {
    public void run() {
        System.out.println("Asynchronous task");
    }
});

future.get();  // blocks until the task completes
 
future.get() blocks until the Runnable completes and then returns null, since a Runnable produces no result.
3. executorService.submit(Callable)
This method can be used if you need a result back after the task execution. For example,

Future<Object> future = executorService.submit(new Callable<Object>() {
    public Object call() {
        return "test";
    }
});

System.out.println(future.get());

In this example future.get() returns "test", which is then printed.

The next important step to remember is to shut down the executor service once you have completed your task execution, so that its threads do not keep running.

executorService.shutdown() stops accepting new tasks and shuts the service down once the already submitted tasks have completed. If you need an immediate shutdown, you can use the executorService.shutdownNow() method instead, which attempts to stop the running tasks.
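A common pattern is to combine the two: request a graceful shutdown, wait a while, and only then force it (a sketch; the timeout is arbitrary and TimeUnit comes from java.util.concurrent):

executorService.shutdown();                        // stop accepting new tasks
try {
    // wait for already submitted tasks to finish
    if (!executorService.awaitTermination(30, TimeUnit.SECONDS)) {
        executorService.shutdownNow();             // force-cancel whatever is still running
    }
} catch (InterruptedException e) {
    executorService.shutdownNow();
    Thread.currentThread().interrupt();
}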

Thursday, March 12, 2015

How to keep a GitHub fork repository up to date?

You might come across many instances where you have to fork a GitHub repository and submit your changes as pull requests to the original repository. If you commit changes every now and then, you need to make sure that your local repository stays up to date with the original repository. Otherwise, if other users have made commits and you do not have their recent changes, you might get merge conflicts once you open a pull request. To avoid this, you need to follow two steps.

1. Configure a remote pointing to the original repository.
2. Sync your local repository with original repo.

Configure a remote pointing to the original repository
You can configure a remote as below.

1. Open the terminal and go to your local repository location, then type the command below, replacing the URL with the URL of the original repository.

git remote add upstream https://github.com/originalRepoName.git

2. Next, you can verify that the upstream remote was added by running the following command.
git remote -v

This command should list the remote you added in the previous step.
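Typical output looks something like this, where origin points to your fork (the user and repository names here are just placeholders):

origin    https://github.com/yourUserName/yourFork.git (fetch)
origin    https://github.com/yourUserName/yourFork.git (push)
upstream  https://github.com/originalRepoName.git (fetch)
upstream  https://github.com/originalRepoName.git (push)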

Sync your local repository with original repo
After you have created the upstream remote, you can update your local repository with the following commands.

1. The command below fetches all the changes/updates from the original repository.
  git fetch upstream

2. Next checkout your local master branch.
 git checkout master 

3. Next, merge the changes from the upstream repo into your local repo. This syncs your forked local repository without losing your local changes.
git merge upstream/master 

There can be situations where your local commits have to be overridden so that your branch exactly matches the upstream. In that case you can reset your branch to the upstream state (note that this discards your local commits):

git fetch upstream
git reset --hard upstream/master

Alternatively, if you simply want to combine the fetch and merge from steps 1 and 3 into one command, you can run:

git pull upstream master



Wednesday, March 4, 2015

How to access default H2 database in WSO2 products

WSO2 products ship with a default embedded H2 database, which holds the product's data unless you switch to another database such as MySQL. Especially when debugging, there are instances where you need to inspect and query your database tables. So this is how you can access the default H2 database. The example below uses API Manager, but you can follow similar steps for other products as well.


1. Go to <APIM HOME>\repository\conf and open carbon.xml.
2. Enable the following configuration in carbon.xml.

<H2DatabaseConfiguration>
    <property name="web" />
    <property name="webPort">8082</property>
    <property name="webAllowOthers" />
</H2DatabaseConfiguration>

3. Restart the server.
4. Go to http://localhost:8082 and provide the following values for the listed fields to access the DB.

JDBC URL: jdbc:h2:repository/database/WSO2AM_DB
Username: wso2carbon
Password: wso2carbon
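Once logged in, you can run ordinary SQL in the H2 console. For example, to list the tables and then inspect one of them (AM_SUBSCRIBER is just an example table from the API Manager schema):

SHOW TABLES;
SELECT * FROM AM_SUBSCRIBER;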

Service chaining with WSO2 ESB

There might be scenarios where you need to make a BE (backend) call, take its response, and pass it to another BE call. Usually, when you use the send mediator in your in sequence, the response is received by your out sequence. If you need to change this behavior, you need service chaining.

This can easily be achieved with the 'receive' attribute of the send mediator in WSO2 ESB. As an example, let us look at the Synapse configuration below. In it, I'm using a scheduled task to inject a quote message into a proxy every 5 seconds. In the in sequence, the injected message is sent to the StockQuoteEpr endpoint, which returns a response. That response is delivered to CustomSequence instead of the out sequence. For more information you can refer to https://docs.wso2.com/display/ESB460/Send+Mediator

<?xml version="1.0" encoding="UTF-8"?>
<definitions xmlns="http://ws.apache.org/ns/synapse">
   <registry provider="org.wso2.carbon.mediation.registry.WSO2Registry">
      <parameter name="cachableDuration">15000</parameter>
   </registry>

   <proxy name="testProxy"
          transports="https http"
          startOnLoad="true"
          trace="disable">
      <description/>
      <target>
         <inSequence>
          <send receive="CustomSequence">
            <endpoint key="StockQuoteEpr"/> 
          </send>
         </inSequence>
      </target> 
   </proxy>

   <endpoint name="StockQuoteEpr">
     <address uri="http://localhost:9000/services/SimpleStockQuoteService"/>
   </endpoint>
   
   <sequence name="CustomSequence">
      <log level="full">
         <property name="MESSAGE" value="============MSG========="/>
      </log>
   </sequence>

   <sequence name="fault">
      <log level="full">
         <property name="MESSAGE" value="Executing default 'fault' sequence"/>
         <property name="ERROR_CODE" expression="get-property('ERROR_CODE')"/>
         <property name="ERROR_MESSAGE" expression="get-property('ERROR_MESSAGE')"/>
      </log>
      <drop/>
   </sequence>

   <sequence name="main">
      <in>
         <log level="full"/>
         <filter source="get-property('To')" regex="http://localhost:9000.*">
            <send/>
         </filter>
      </in>
      <out>
         <send/>
      </out>
      <description>The main sequence for the message mediation</description>
   </sequence>

   <task name="InjectToProxyTask"
         class="org.apache.synapse.startup.tasks.MessageInjector"
         group="synapse.simple.quartz">
      <trigger count="5" interval="5"/>
      <property xmlns:task="http://www.wso2.org/products/wso2commons/tasks"
                name="soapAction"
                value="getQuote"/>
      <property xmlns:task="http://www.wso2.org/products/wso2commons/tasks"
                name="format"
                value="soap11"/>
      <property xmlns:task="http://www.wso2.org/products/wso2commons/tasks"
                name="injectTo"
                value="proxy"/>
      <property xmlns:task="http://www.wso2.org/products/wso2commons/tasks" name="message">
         <m0:getQuote xmlns:m0="http://services.samples">
            <m0:request>
               <m0:symbol>IBM</m0:symbol>
            </m0:request>
         </m0:getQuote>
      </property>
      <property xmlns:task="http://www.wso2.org/products/wso2commons/tasks"
                name="proxyName"
                value="testProxy"/>
   </task>
   
</definitions>
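In this sample, CustomSequence only logs the first backend's response. To actually chain that response to a second backend call, you could extend CustomSequence with another send mediator; SecondServiceEpr below is a hypothetical named endpoint you would define yourself:

<sequence name="CustomSequence">
   <log level="full">
      <property name="MESSAGE" value="=====FIRST RESPONSE====="/>
   </log>
   <!-- forward the first response as the request to the second backend -->
   <send>
      <endpoint key="SecondServiceEpr"/>
   </send>
</sequence>

If the second response needs further processing, you can again set a receive attribute on this send mediator and continue the chain in yet another sequence.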