Creating a Node.js Application with the IBM Cloud Foundry Platform: For Beginners
IBM Cloud, Cloud, Uncategorized / June 6, 2020

Read about Cloud Foundry here: https://www.cloudfoundry.org/ Read about IBM Cloud Foundry here: https://www.ibm.com/cloud/cloud-foundry Cloud Foundry is the industry-standard open-source cloud application platform for developing and deploying enterprise cloud applications. This article provides a basic understanding of how to work with the IBM Cloud Foundry platform (Cloud Foundry on IBM Cloud), the tools that can be used on IBM Cloud, creating a toolchain with the development tools, and continuous delivery. Cloud Foundry is a platform-as-a-service (PaaS) offered by cloud service providers that enables you to deploy and scale apps without managing servers; it handles aspects like scaling and can be used with a set of tools to develop applications. IBM provides a seamless connection of resources with Cloud Foundry. Cloud Foundry as a platform is available from various providers like IBM and AWS, and it can also be deployed with BOSH. Step 1: Create a Cloud Foundry application. Navigate to Cloud Foundry, click on Create, choose Node.js as the runtime, choose a name, and fill in the other details to create the application. Click on the Create button and wait until the application is created. An app URL will be provided and a basic application will be created. Step 2: Enable continuous delivery…
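The post's Node.js application itself is not shown in this excerpt; as a rough sketch, a Cloud Foundry Node.js app only needs to bind to the port the platform assigns through the PORT environment variable (the handler below is a placeholder):

```javascript
// Minimal Node.js server sketch for Cloud Foundry.
// Cloud Foundry injects the port to bind to via the PORT environment variable.
const http = require('http');

const port = process.env.PORT || 3000; // 3000 is only a local-development fallback

http
  .createServer((req, res) => {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello from IBM Cloud Foundry\n');
  })
  .listen(port, () => console.log(`Listening on ${port}`));
```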

Basic steps for creating an IBM Cloud Function (Node.js runtime) that writes to an IBM Object Storage bucket
IBM Cloud, Cloud, Uncategorized / June 4, 2020

This is a simple tutorial that helps you understand the basics of IBM Cloud Functions and Object Storage. For actual development, aspects such as architecture and security should also be addressed. Step 1: Choose a Functions namespace and create an action. Namespaces contain entities (e.g. actions and triggers) and belong to a resource group; actions are the pieces of code that can be invoked with HTTP requests. Navigate to Functions -> Actions -> Create Action with Node.js as the runtime. Choose the action name and package name. Keep the default code for now; the function main will hold the logic you want to run. Step 2: Create an API. Navigate to APIs and click on Create API, choose the options, and create an API. Choose the package and the action that were created in the previous step, then create an operation with a verb; the verb indicates the HTTP method. Step 3: Create Object Storage and a bucket. Now the bucket needs to be created. Navigate to the resource list, click on Create, and provide the necessary details like a name to create the Object Storage instance. In Object Storage, create a bucket; choose the region as us-east for now. Create service credentials. After creating…
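A minimal sketch of what the action's main function could look like once it writes to the bucket, assuming the action is deployed as a zipped package that bundles the ibm-cos-sdk dependency; the endpoint, credentials, bucket name, and object key below are placeholders:

```javascript
// Sketch: IBM Cloud Functions action (Node.js) writing an object to Cloud Object Storage.
const COS = require('ibm-cos-sdk');

const cos = new COS.S3({
  endpoint: 's3.us-east.cloud-object-storage.appdomain.cloud', // endpoint for the bucket region
  apiKeyId: '<SERVICE_CREDENTIAL_API_KEY>',
  serviceInstanceId: '<OBJECT_STORAGE_INSTANCE_CRN>',
});

async function main(params) {
  const key = `entry-${Date.now()}.json`;
  await cos
    .putObject({
      Bucket: '<your-bucket-name>',
      Key: key,
      Body: JSON.stringify(params), // write the incoming request parameters as the object body
    })
    .promise();
  return { body: { written: key } };
}

exports.main = main;
```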

Understanding the serverless.yml file: properties with simple explanations, variables in serverless.yml – Serverless Framework and AWS
AWS, Cloud, Uncategorized / April 23, 2020

serverless.yml properties:
service – Describes the service the YAML file is for, with properties like name, arn…
app – Name of the app the service is created for. The app should be created in Serverless, and the name can be found in the serverless.com dashboard.
org – Multiple organizations can be created in Serverless; this property indicates the organization for which the service is created.
package – Include/exclude files or folders in the deployment package, for example to exclude the git folder or node_modules.
provider – Properties for the cloud service provider: name (e.g. AWS), runtime (e.g. Node.js), region (e.g. Ohio), and others like timeout, stage (dev, prod…) and memory.
custom – Custom variables for the service, like base, params, role…
functions – The functions in the service. Functions can have a name, handler, environment variables, description…
resources – The resources used by the service, like DynamoDB tables, Lambda…
output – Variables for refactoring; they can be used in other serverless.yml files and combined with a different app/stage/region. Example: ${output:appname:stagename:regionname:my-service.var-key}
Variables in Serverless: variables can be used in the YAML file in different ways (the orange highlighted text indicates the various ways). Variables are added using interpolation, and variable values are referenced using ${}. ${self:provider.stage} –…
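To make the layout concrete, here is an illustrative serverless.yml sketch; every name and value below is a placeholder rather than something taken from the post:

```yaml
# Illustrative serverless.yml sketch; all names and values are placeholders.
org: my-org                    # Serverless Dashboard organization
app: my-app                    # app this service belongs to
service: my-service

provider:
  name: aws
  runtime: nodejs12.x
  region: us-east-2            # Ohio
  stage: dev
  memorySize: 128
  timeout: 10

package:
  exclude:
    - .git/**
    - node_modules/**

custom:
  tableName: employees-${self:provider.stage}   # variable interpolation with ${}

functions:
  hello:
    handler: handler.hello
    description: Example function
    environment:
      TABLE_NAME: ${self:custom.tableName}

resources:
  Resources:
    EmployeesTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: ${self:custom.tableName}
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
        KeySchema:
          - AttributeName: id
            KeyType: HASH
        BillingMode: PAY_PER_REQUEST
```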

AppSync/GraphQL: Pagination on DynamoDB data: Cursor-based pagination
AWS, Cloud, Uncategorized / April 14, 2020

What is AppSync? AWS AppSync helps you create a flexible API to securely access, manipulate, and combine data from one or more data sources. AppSync uses GraphQL and can be used along with Amazon services like Lambda, DynamoDB, etc. What is GraphQL? GraphQL is a language for APIs that enables you to query and manipulate data easily through an intuitive and flexible syntax. GraphQL provides a syntax to describe data requirements and interactions, allowing you to ask for exactly what you need and get back predictable results. This article aims at providing a strategy for querying data in a DynamoDB table with sorting and pagination. The pagination strategy used is called cursor-based pagination, which helps avoid the problems of limit-offset pagination: with limit-offset paging, inserts and deletes shift the data while you paginate, which is not ideal for real-time data. What is cursor-based pagination? Cursor-based pagination works using a pointer to a specific item in the dataset; by using this strategy, the major pitfalls of limit-offset pagination can be avoided. Achieving pagination on DynamoDB data with AppSync/GraphQL: consider a case where we have an entity Employees and each employee has multiple Tasks. It’s a…
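The post's actual AppSync resolver mapping is not shown in this excerpt; the sketch below only illustrates the cursor idea directly against DynamoDB with the Node.js SDK, where the LastEvaluatedKey of one page becomes the nextToken for the next. The table, key, and field names are assumptions:

```javascript
// Sketch: cursor-style pagination against DynamoDB with the AWS SDK (v2).
// Table, key, and field names are assumptions for illustration only.
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

async function getTasksPage(employeeId, limit, cursor) {
  const params = {
    TableName: 'Tasks',
    KeyConditionExpression: 'employeeId = :e',
    ExpressionAttributeValues: { ':e': employeeId },
    ScanIndexForward: false, // newest first, by sort key
    Limit: limit,
  };
  if (cursor) {
    // The cursor is the LastEvaluatedKey from the previous page,
    // handed back by the client as a base64-encoded nextToken.
    params.ExclusiveStartKey = JSON.parse(Buffer.from(cursor, 'base64').toString());
  }

  const result = await docClient.query(params).promise();

  return {
    items: result.Items,
    nextToken: result.LastEvaluatedKey
      ? Buffer.from(JSON.stringify(result.LastEvaluatedKey)).toString('base64')
      : null, // null means there are no further pages
  };
}
```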

AWS: Create a Lambda function that returns a presigned URL to upload an image to an S3 bucket, Node.js
AWS, Node.js, Cloud, Uncategorized / April 7, 2020

This tutorial aims at providing the basic steps required to create a Lambda that returns a presigned URL, which can be used to upload an object to an AWS S3 bucket. The upload is tested here with Postman. What is S3? Amazon S3, or Amazon Simple Storage Service, is a service offered by Amazon Web Services that provides object storage. An object consists of a file and, optionally, any metadata that describes that file. S3 stores objects in buckets, which are more or less like file folders. What is a presigned URL for S3? A presigned URL gives you access to the object identified in the URL, provided that the creator of the presigned URL has permissions to access that object. That is, if you receive a presigned URL to upload an object, you can upload the object only if the creator of the presigned URL has the necessary permissions to upload that object. What is an ARN? Amazon Resource Names (ARNs) are identifiers used to uniquely identify AWS resources. Step 1: Create a bucket to store the image. Go to Services -> S3 -> Create bucket. Give it a name like fortestinglambda. Provide public access to the bucket for now and leave…
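Not the post's exact handler, but a minimal sketch of a Lambda that returns a presigned PUT URL with the AWS SDK for Node.js; the bucket name matches the example above, while the key, expiry, and content type are assumptions:

```javascript
// Sketch of a Lambda handler that returns a presigned PUT URL (AWS SDK v2).
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = async (event) => {
  const url = await s3.getSignedUrlPromise('putObject', {
    Bucket: 'fortestinglambda',        // bucket created in Step 1
    Key: `uploads/${Date.now()}.jpg`,  // object key for the image
    Expires: 300,                      // URL stays valid for 5 minutes
    ContentType: 'image/jpeg',
  });

  return {
    statusCode: 200,
    body: JSON.stringify({ uploadUrl: url }),
  };
};
```

The caller then sends an HTTP PUT of the raw image bytes (with the matching Content-Type header) to the returned uploadUrl, which is the step the post verifies with Postman.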

Create a Lambda and modify and deploy it using the Cloud9 IDE, work with npm and add Node.js packages to Lambda – Beginner’s guide
AWS, Cloud, Uncategorized / April 1, 2020

In this tutorial, we will learn to: create an AWS Lambda function with Node.js as the runtime environment; import the function into AWS Cloud9 (an online IDE); and initialize npm, add the moment.js dependency, and deploy it. Step 1 – Create a Lambda. Go to Services -> Lambda and click Create function. Choose Author from scratch, select Node.js as the runtime, provide a function name, and click Create function.

The above code will be provided by default. The aws-sdk package is available in Lambda functions by default, and you can import it with a require statement like the one in the sketch below.
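A sketch of the default Node.js handler stub together with that import (the response body is just the boilerplate greeting):

```javascript
// Default-style Node.js handler stub plus the aws-sdk import (sketch).
const AWS = require('aws-sdk'); // aws-sdk is bundled with the Lambda Node.js runtime

exports.handler = async (event) => {
  const response = {
    statusCode: 200,
    body: JSON.stringify('Hello from Lambda!'),
  };
  return response;
};
```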

Step 2 – Create an environment and import the Lambda into the Cloud9 IDE. Cloud9 is an online IDE provided by AWS. To access Cloud9, go to Services -> Cloud9. Click on the Create environment button, provide a name and description, choose the default options, and click on the Create environment button. The IDE interface will look like this. To import the Lambda function, go to the AWS Resources panel and import your remote function. Step 3 – Install the Node.js dependency, moment.js. To install a Node.js dependency, terminal access is required. To access the terminal, go to Window -> New Terminal and change into your function folder using

Do npm…
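Once moment has been installed into the function folder from that terminal (for example with npm install moment) and the function redeployed, the handler might use it as in the sketch below; the date format string is just an example:

```javascript
// Sketch: Lambda handler using the locally installed moment dependency.
const moment = require('moment');

exports.handler = async (event) => ({
  statusCode: 200,
  body: JSON.stringify({ now: moment().format('YYYY-MM-DD HH:mm:ss') }),
});
```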

How to read data from AWS DynamoDB tables using AWS Lambda: get, scan and query methods (Node.js).
AWS, Cloud, Uncategorized / March 30, 2020

This tutorial aims at explaining how to read data from an AWS DynamoDB table. The AWS SDK has a class, AWS.DynamoDB.DocumentClient, whose scan, get and query methods are used to read data from a DynamoDB table. Documentation on the various methods provided by the class is available at the following link: AWS.DynamoDB.DocumentClient – Documentation. Table structure: primary partition key – name, sort key – age, table name – testTable. This may not be the ideal table structure, but we use it here just as an example. Step 1: Executing the query to get all data from a table – the scan method. The scan method reads every item in the table and returns all the data in the table; filters can also be applied. First of all, we create an object scanningPrams which has all the params, including the table name, required to scan the DynamoDB table. Simple params to get 10 items from the table "testTable" would look like this.
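The snippet can look roughly like this inside a Node.js Lambda; the Limit of 10 matches the description above, while the handler wrapper is just for illustration:

```javascript
// Sketch of the scan params and call described above (AWS SDK v2 DocumentClient).
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

const scanningPrams = {
  TableName: 'testTable',
  Limit: 10, // return at most 10 items per scan call
};

exports.handler = async (event) => {
  const data = await docClient.scan(scanningPrams).promise();
  return data; // contains Items, Count and ScannedCount
};
```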

The output will have the following structure
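Roughly, with made-up items for the testTable example:

```javascript
// Illustrative shape of the scan result (values are made up).
const exampleScanResult = {
  Items: [
    { name: 'John', age: 30 },
    { name: 'Jane', age: 25 }
  ],
  Count: 2,
  ScannedCount: 2
};
```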

The output will have three properties: Items will have the queried items, ScannedCount will have the number of items evaluated before any scanFilter is…

Steps for creating an EC2 instance and setting up a Git remote repository, with detailed descriptions.
AWS, Cloud, Uncategorized / March 26, 2020

Here are the steps to create an EC2 (Amazon Elastic Compute Cloud) instance, set up your Git remote repository on EC2, and connect to the remote repo from your local machine. Some explanations are provided for better understanding. Amazon Elastic Compute Cloud (EC2) is IaaS (Infrastructure as a Service). PART 1 – Creating and connecting to an AWS EC2 virtual machine instance. STEP 1 – Go to the AWS console. STEP 2 – Go to Services, click on EC2, and create a virtual machine by clicking on the Launch instance button. EC2 is IaaS, much like Compute Engine in Google Cloud. STEP 3 – Amazon provides templates for machines with different OS and processor combinations. Choose one according to your needs and click on Launch. STEP 4 – On clicking the Launch button, Amazon asks you to create or choose a key pair. A key pair consists of a public key that AWS stores and a private key file that you store; you have to provide a name for the key pair. A .pem file containing the private key will be downloaded. Public-key cryptography, or asymmetric cryptography, is a cryptographic system that uses pairs of keys: public keys, which may be disseminated…

What is Cloud Computing and Why? Cloud Part – 1
Cloud / November 18, 2019

Welcome to the Cloud Series. This is the first article in the series and aims at explaining what the cloud is and its benefits to absolute beginners. Business organizations require plenty of computing/IT resources to run their businesses. These resources include servers, storage, networks, various software, services, etc., which are necessary for setting up and maintaining the business services provided by the organization. Examples of IT/computing resources: consider an organization that provides mobile and web applications to serve its customers. To successfully set up and run these service applications, the company needs servers, storage, a network, third-party services, software, and various other IT resources. These IT resources should be managed effectively and reliably so that the services are always available without causing any inconvenience to the customers. The traditional way: traditionally, the way to get these resources is to buy them or to rent them from managed service providers. Buying or renting the resources is expensive, as the cost of setting up the infrastructure, managing the resources, and keeping a large number of professionals, together with the effort required to effectively manage and coordinate the resources, will be high. For example, an organization needs to buy servers for…

Understanding Cloud Deployment Models. Cloud Part – 2
Cloud, Uncategorized / November 21, 2019

For part 1 of this series, visit the following link: What is Cloud Computing and Why? Cloud Part – 1. Cloud deployment models represent the types of cloud environments used by business organizations, depending upon access, ownership, and size. There are primarily 4 types of cloud deployment models: public cloud, private cloud, hybrid cloud, and community cloud. All these models provide various IT resources as services. Public Cloud: a public cloud is a publicly accessible cloud environment. Resources are owned and maintained by a cloud service provider (CSP) for a cost. The services are scalable, flexible, and used by multiple organizations. Major public cloud service providers are Amazon, Google, and Microsoft. Some of the major negatives of public cloud services are: ownership of data rests with the cloud service provider, with no exclusive custody; limited control, as the resources as a whole are handled by the service provider; and concerns about security, since multiple organizations access the services. Private Cloud: a private cloud environment is owned and used exclusively by a single organization. A private cloud enables an organization to provide centralized access to IT resources for different parts/sections/departments of the organization. Private cloud service models can be broadly divided into…
