
May 12, 2021

Lightning Web Component


What is Lightning Web Component?
Lightning components can be built using two programming models, namely the Aura component model and the Lightning Web Component model. For end users, both look the same.
Lightning Web Components is a new framework for building Lightning components. For Lightning Web Components development you need to set up Salesforce DX and VS Code, as Lightning Web Components are currently not available in the Developer Console.

In 2014, Salesforce launched the Lightning Component framework, supported by the Aura programming framework. Since web standards at the time offered only limited capabilities for building large-scale web applications, Aura came with its own component-driven model that allowed developers to build large-scale client applications on the web.

Since then, the web stack has seen an unprecedented level of innovation and standardization that transformed it from a rudimentary page-rendering platform into a full web development platform. Salesforce chose to leverage what the web now offers, leave the heavy lifting to the browser, and build on top of it, which led to the creation of Lightning Web Components.

Introduction:
Lightning Web Components (LWC) is a modern, lightweight framework built on the latest web standards. An LWC is a custom DOM (Document Object Model) element created through reusable code and used to generate dynamic interfaces without a proprietary JavaScript framework or library. This makes development quick and seamless, saving developers a great deal of time and effort on the web stack. Let's look at some of its remarkable features:
  1. Improved component performance, as most of the code is handled natively by the browser engine and web stack.
  2. Ability to compose applications from smaller chunks of code, since the crucial elements required to create a component are part of the native browser engine and web stack.
  3. Increased robustness of applications built using LWCs, as they conform to the modern web standards mentioned above.
  4. Interoperability: Lightning Web Components and Aura components can be used together in an application with no visible difference to end users.

Web Stack Transformation:
Let's have a look at the web stack transformation that led Salesforce to this new way of building Lightning components (LWC).
Web Stack Transformation


Prior to 2019, there used to be a layer between the web stack and the UI. This mid-layer mostly consisted of framework JavaScript, much of which is now covered by modern web standards.

As browsers have matured, the web stack itself has gained the power to create UI components, so a mid-layer between the code and the browser, which costs speed and performance, is no longer required. That mid-layer is the main reason developers struggle with Aura.

Aura-based Lightning components are built with HTML and JavaScript on top of the Aura framework, whereas Lightning Web Components (LWC) are built directly on the web stack.

Rendering an LWC is fast, as the browser no longer needs to download framework JavaScript and wait for the engine to compile it before rendering the component.

Lightning Web Components supports the same browsers as Lightning Experience.

Note: Development of LWC components is not supported in the Developer Console.


January 17, 2021

How to exclude files from deployment - Mule4

Hello Friends,

In this article, I want to show you how you can skip or exclude flows from being initialized and started when deploying Mule 4 applications locally or in the cloud. I will also show you how to exclude files from the packaged JAR.

Problem: 

In MuleSoft development there is a high chance of ending up with a large number of flow/sub-flow files, which leads to:
  • Long build and deployment times.
  • A bulky package.

These issues are especially painful when we need a quick fix and want to test code locally or in a cloud environment.


Let's check the scenario below. As we can see, the project has three files under the src/main/mule package, and each file contains its own flows, as highlighted in the picture below:



mule4-flows



In Mule 4, we have the "mule-artifact.json" file, where we can declare under the "configs" tag the files whose flows should be initialized and started. By default, if we don't declare any file names, the Mule runtime checks, initializes, and starts the flows from all files.
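
As a rough sketch, a mule-artifact.json that limits initialization to a single configuration file might look like the following (the minMuleVersion value and the file name helloworld.xml are illustrative only):

{
  "minMuleVersion": "4.3.0",
  "configs": [
    "helloworld.xml"
  ]
}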


mule-artifact-json.png


The logs below show that only "HelloWorldFlow" is initialized and started during deployment.

flow-logs




So, if we don't declare any files to include, the logs below show that the Mule runtime initialized and started all flows from all files.
flow-logs



The configuration above helps you exclude flows from being initialized and started when deploying the application locally or in the cloud, which reduces the total deployment time of Mule apps. However, it does not let you exclude files or folders from the packaged JAR.


How to exclude files or folders from packaging jars?


To achieve this, we need to create a "_muleExclude" file at the root level of the Mule application.




_muleExclude-mule4.png



You can declare the file names to exclude from the packaged JAR as shown above.
The picture also shows that the extracted JAR does not contain the files listed inside the "_muleExclude" file.
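
For illustration, a _muleExclude file simply lists the entries to drop from the package, one path or pattern per line relative to the project root (the paths below are hypothetical):

src/main/mule/sample-flow.xml
src/main/resources/sample-data.csv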

Note: Please use the above configurations wisely, as excluding the wrong files may lead to application failure.

If you want to know more about minMuleVersion, you can check my article runtime-patching-in-mule-munit.


Happy Learning :)

January 14, 2021

Filter complex array using dw2-Mule4

Hello friends,

In this article I will show you how we can filter results from complex arrays. In this use case we will use the filter selector and the descendants operator from MuleSoft DataWeave.

In this use case we will try to find the details of books that have been rented at least once from the library. We will also see some variations of the DataWeave expressions in the sections below.



Input Payload:

[
  {
    "bookId": "20200114",
    "bookType": "Fiction",
    "title": {
      "en": "Candide"
    },
    "message": {
      "en": ""
    },
    "bookDetails": [
      {
        "label": {
          "en": "Candide"
        },
        "Library": {
          "city": "Pune",
          "rented": {
            "count": "1"
          }
        }
      }
    ]
  },
  {
    "bookId": "20200115",
    "bookType": "Fiction",
    "title": {
      "en": "The Alchemist"
    },
    "message": {
      "en": ""
    },
    "bookDetails": [
      {
        "label": {
          "en": "The Alchemist"
        },
        "Library": {
          "city": "Kolkata",
          "rented": {
            "count": "0"
          }
        }
      }
    ]
  }
]

DW Script:
%dw 2.0
output application/json
---
payload[?($.bookDetails.Library.rented.count[0] >= "1")] default []


Output:
[
  {
    "bookId": "20200114",
    "bookType": "Fiction",
    "title": {
      "en": "Candide"
    },
    "message": {
      "en": ""
    },
    "bookDetails": [
      {
        "label": {
          "en": "Candide"
        },
        "Library": {
          "city": "Pune",
          "rented": {
            "count": "1"
          }
        }
      }
    ]
  }
]


Now let us understand the DataWeave script in some detail:

filter-complex-array-dw-script
DW-Script

We can add multiple conditions inside the brackets (highlighted in red in the image above) based on our requirements. For example, if we want to filter books based on the library location, we can use expressions like the ones below:

%dw 2.0
output application/json
---
payload[?(($.bookDetails.Library.rented.count[0] >= "1") and ($.bookDetails.Library.city[0] == "Pune"))] default []

%dw 2.0
output application/json
---
payload[?(($.bookDetails.Library.rented.count[0] >= "1") and ($.bookDetails.Library.city contains "Pune"))] default []

Here, the important thing to understand is the [0] inside the brackets. As explained in the diagram, the filter selector iterates over and evaluates each record, and the selected fields come back as arrays, so we need either the [0] index or the contains operator to evaluate the condition. Also, when the check involves a comparison such as ">" or "<", the contains operator will not work.
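
For comparison, here is a rough equivalent of the first script written with the filter function instead of the filter selector. The logic is unchanged, and default [] is not needed because filter already returns an empty array when nothing matches:

%dw 2.0
output application/json
---
payload filter ((book) -> book.bookDetails.Library.rented.count[0] >= "1")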


When filtering data with selectors there is one more variation of the expression. Let's see how the descendants operator (..) can be used to evaluate the results:

%dw 2.0
output application/json
---
payload[?(($..count[0] >= "1") and ($..city[0] == "Pune"))] default []

%dw 2.0
output application/json
---
payload[?(($..count[0] >= "1") and ($..city contains "Pune"))] default []


The results remain the same for the expressions written with the descendants and contains operators.

Happy Learning:)

January 10, 2021

How to lookup CSV table using Dataweave2

 Hello Friends, 

In this article we will see how to look up data from a CSV file without converting it to XML/JSON or any other message format.

Let's take an example where we need to look up data from a CSV file and retrieve/filter results from the dataset. Here, we will take a very common dataset with employee details and then filter the data with a set of desired conditions.


CSV table Input

Name         Location    Pincode
Niral        Pune        910002
Nikku        Pune        910005
Shruthi      Bhopal      843001
Manpreet     Chandigarh  890021
Little John  Mumbai      200011
Harry        Delhi       100011
Tom          Goa         500110

DW Script:

%dw 2.0
output application/json
---
(payload filter ((item, index) -> item.Location == "Pune"))


Output:
[
  {
    "Name": "Niral",
    "Location": "Pune",
    "Pincode": "910002"
  },
  {
    "Name": "Nikku",
    "Location": "Pune",
    "Pincode": "910005"
  }
]


In the above example we are filtering data based on location, and we have not converted the CSV data to JSON before using filter. You can also store the results in DataWeave vars and use them based on your use case.

Please find below some variations that produce the same output:
%dw 2.0
output application/json
---
payload filter ($.Location == "Pune")


You can also have multiple conditions. In the expression below, we retrieve data from a CSV file inside the project: readUrl loads the CSV data from the classpath, and filter is then applied based on Location and Pincode.
 
%dw 2.0
output application/json
var filterResult = readUrl("classpath://csvFiles/ProfileDetails.csv", "application/csv") filter ($.Location == "Pune" and $.Pincode == "910005")
---
filterResult

Output:
[
  {
    "Name": "Nikku",
    "Location": "Pune",
    "Pincode": "910005"
  }
]
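
Building on this, here is a minimal lookup-style sketch that enriches an incoming record with its Pincode from the CSV. The payload field Name and the output shape are assumptions for illustration:

%dw 2.0
output application/json
var profiles = readUrl("classpath://csvFiles/ProfileDetails.csv", "application/csv")
---
{
    name: payload.Name,
    // Pick the first matching row from the CSV (null if there is no match)
    pincode: (profiles filter ($.Name == payload.Name))[0].Pincode
}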



Happy Learning :)


January 05, 2021

Validation module indepth - Mule4

 Introduction:

MuleSoft provides a set of validation components that are very useful for message field validation during development. These validations can be general or based on specific requirements. Validators verify that parts of the Mule message/event meet criteria that we specify.

If a message doesn't meet the validation criteria, the validation component raises a validation error. We can customize the error message that is displayed in logs/responses.

Let's see some of the important and frequently used validation components:

mule4-validation-module

Is not null:

This component checks that the field value is not null; otherwise it throws an error with the message "Name is Mandatory field".

<validation:is-not-null value="#[payload.name]" message="Name is Mandatory field"/>

Is email:

This component checks whether the incoming email field is in a valid format; otherwise it throws a validation error with the message "Invalid Email format".

<validation:is-email email="#[payload.email]" message="Invalid Email format"/>
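
When such a validation fails, the error can be caught like any other Mule error. Below is a minimal sketch, assuming the flow name validate-email-flow and the VALIDATION:INVALID_EMAIL error type (both illustrative; confirm the exact error type in the Validation module documentation):

<flow name="validate-email-flow">
    <validation:is-email email="#[payload.email]" message="Invalid Email format"/>
    <error-handler>
        <!-- Error type assumed from the Validation module docs; confirm for your runtime version -->
        <on-error-continue type="VALIDATION:INVALID_EMAIL">
            <set-payload value="#[error.description]"/>
        </on-error-continue>
    </error-handler>
</flow>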

ALL vs ANY

To perform multiple validations at a time in a Mule flow, MuleSoft provides two scopes, ALL and ANY.

ALL: All of the validation components defined in the ALL scope must pass.

<validation:all>
      <validation:is-not-null value="#[payload.name]" message="Name is Mandatory field"/>
      <validation:validate-size value="#[payload.name]" min="5" max="20" message="Name field size should be between 5 and 20"/>
</validation:all>

Here we are checking that the name field is not null and that the name length is between 5 and 20 characters. This is similar to an AND condition in any programming language.

ANY: At least one of the validation components defined in the ANY scope must pass.

<validation:any>
  <validation:is-not-null value="#[payload.name]" message="Name is Mandatory field"/>
  <validation:validate-size value="#[payload.name]" min="5" max="20" message="Name field size should be between 5 and 20"/>
</validation:any>

With the ANY scope, the validation passes if any of the above conditions passes. This is similar to an OR condition in any programming language.

Based on project requirements, we can use these scopes with multiple validation components at a time.


Use of the Validation module in DataWeave:

Now let us see how these Mule validation components can be used as Mule expressions to make decisions without throwing any validation errors.

<choice>
   <when expression="#[Validation::is-not-null(payload.name)]">
	<set-payload value='#[payload.name ++ " is a valid name."]'/>
   </when>
   <otherwise>
        <set-payload value='#[payload.name ++ " is NOT a valid name."]'/>
   </otherwise>
</choice>


Happy learning :)


January 01, 2021

WSO2 Basic Overview

WSO2 Introduction:

WSO2 is an open-source technology provider which offers an enterprise platform for integrating application programming interfaces (APIs), applications, and web services locally and across the Internet.

WSO2 solutions give enterprises the flexibility to deploy applications and services on-premises, on private or public clouds, or in hybrid environments and easily migrate between them as needed.

WSO2 Platform:

The WSO2 platform provides three main products to design and maintain applications:

API Manager, Enterprise Integrator, and Identity Server.

WSO2-products



API MANAGER

API Manager helps to design and consume APIs. The API Gateway manages API requests and applies policies to them. The rate-limiting feature regulates traffic and protects against security attacks. API security provides authentication and authorization for API requests.

API Manager also provides an analytics feature to monitor application behavior.

The API management platform is available for on-premises, cloud, and hybrid architectures.




ENTERPRISE INTEGRATOR
Enterprise Integrator provides a hybrid integration platform. It enables API-centric integration using integration architecture styles such as microservices or a centralized ESB.
 
WSO2 Enterprise Integrator offers two different architectural styles:
  • Micro Integrator: An event-driven, standards-based messaging engine. It supports message routing, transformation, and other messaging use cases. Micro Integrator is used when you need a centralized, API-driven integration architecture; it can also be used where you need decentralized, cloud-native integration such as microservices, or where you need low-code integration and application monitoring.
  • Streaming Integrator: Very useful for performing ETL (Extract, Transform, Load) and other streaming operations. It provides real-time analysis of streamed data with the help of an SQL-like query language. Streaming Integrator is fully compatible with systems like Kafka and NATS to make full use of streaming data, and it has native support for working with WSO2 Micro Integrator to trigger complex integration flows.


Benefits of WSO2 ESB

  • Ensures a loosely coupled solution with easy-to-manage plug-ins.
  • Suited for scalable designs.
  • Provides agile, incremental solutions for development, early/often deployment, and seamless change management.
  • Configuration-driven design, as opposed to code-driven design.

IDENTITY SERVER

WSO2 Identity Server is an open-source, API-driven tool that provides CIAM (Customer Identity and Access Management) solutions.
It is an extensible, highly scalable platform that helps manage identities in both cloud and enterprise environments. With its CIAM solutions, it delivers enhanced, seamless, and secure digital experiences.






Happy Learning :)