
DOI: 10.23883/IJRTER.2018.4277.98BLJ

Going Serverless: A Review

Annie Ahuja

Department of Computer Science,Guru Nanak Dev University

Abstract— Computing power has gained momentum over the last few years with the incorporation of supercomputers. Moreover, the increasing use of the Internet and its services has paved the way for the emergence of new computing technologies. Some issues still need to be resolved around storing data with minimal resources and cost, and the time has now come to move beyond clouds with the 'serverless' paradigm. With the advent of cloud computing technology, organizations and individuals no longer face the problem of storing the abundant flow of data. The term 'serverless' in serverless computing does not mean that no servers are involved; it implies that developers no longer need to worry about deploying servers, which are instead managed by the serverless platform providers. This paper discusses serverless computing technology and gives insight into its architecture, a parametric analysis of the commercial platforms provided by different serverless providers, and its pros, cons, and challenges.

Keywords— Cloud computing, serverless computing, architecture, IaaS, SaaS, PaaS, AWS Lambda, Google Cloud Functions, IBM Bluemix OpenWhisk, Microsoft Azure, BaaS, FaaS, greener computing

I. INTRODUCTION TO SERVERLESS COMPUTING

Cloud computing emerged as a beacon for reducing infrastructure and operational costs, which are always key factors in the seamless working of any organization. Serverless computing introduces a new dimension to the existing cloud computing technology. It has come out as a compelling paradigm for deploying cloud applications because it fits well with the recent shift of applications toward containers and microservices. Most companies run applications on cloud technology because paying only for actual usage reduces expenses. Multiple studies across industries and application types have also demonstrated that migrating to a cloud architecture reduces the cost of ownership and improves time to market in the face of rapidly changing demands [8]. With the adoption of cloud computing, purchasing and maintaining hardware is no longer a concern for companies, but a server-based architecture remains an issue because its foremost requirement is to provide scalability and reliability. As applications evolve, companies must handle challenges such as patching and deploying to groups of servers. Server groups should be scaled so that they efficiently handle both the scale-up scenario at peak load and the scale-down scenario when traffic drops, since scale directly affects cost. Scaling must also be performed so that it does not affect the reliability of the architecture for end users, maintaining the integrity of the system. When servers sit idle, that is, are not properly utilized, the arrangement is no longer cost effective: resources are wasted and you pay for capacity you never use. Analysts have estimated that around 85 percent of servers are not utilized to their full potential [8]. These challenges can be effectively addressed by the emerging 'serverless computing' paradigm.


1.1. Defining Serverless
In the context of the cloud, the term 'serverless' means that developers no longer need to be concerned with deploying servers, and that serverless solutions provide much more abstraction: not only are the servers hidden, but the functions that run on them and the scaling process are hidden as well. Searches for the term 'serverless' have gained momentum over the last five years, as reported by Google Trends and shown in Figure 1 [3].

Figure 1. Google Trends [3]

In serverless computing, functions form the core unit for defining services, a model termed Function-as-a-Service (FaaS) and introduced by cloud vendors such as Amazon [26], Google [23], and IBM [27]. For example, Amazon Web Services (AWS) Lambda offers serverless computing services that execute application code in event-triggered function containers [16]. The container holds all the components required to run that source code. The development team can therefore concentrate solely on writing back-end code for the business process, and deployment takes place without any concern for maintaining infrastructure components. Traditionally a server is always on, that is, always running, but with serverless computing the server is 'on' only when a request invokes the function code, so the 'on' state refers to the execution of the application rather than to resource allocation [3][5]. In this respect serverless computing helps reduce cost compared with a traditional system.

The other aspect of FaaS relates to an 'API Gateway'. An API (Application Programming Interface) Gateway is an HTTP (Hypertext Transfer Protocol) server with routing points, or destinations, defined in its configuration, where each routing point is associated with a FaaS function. Whenever a request is made to the API Gateway, it first identifies the matching routing configuration and then calls the specific FaaS function associated with that request, mapping HTTP request parameters to the function's input arguments. The API Gateway then receives the outcome from the FaaS function, transforms it into an HTTP response, and returns it to the requester. For example, Amazon Web Services has its own API Gateway. The use case for API Gateway plus FaaS lies in creating HTTP-fronted microservices in a serverless way, with the scaling and other benefits that come from FaaS functions [9].
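As an illustration of this request/response mapping, the sketch below uses the Node.js handler format that AWS Lambda applies to API Gateway proxy events; the route, the parameter name, and the greeting logic are hypothetical and only meant to show how gateway input becomes function input and function output becomes an HTTP response.

// greet.js - hypothetical function behind an API Gateway route such as GET /greet?name=Annie.
// The gateway maps the HTTP request into the `event` object and maps the returned
// object back into an HTTP response (status code, headers, body).
exports.handler = async (event) => {
  const name = (event.queryStringParameters && event.queryStringParameters.name) || 'World';
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message: `Hello, ${name}!` }),
  };
};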

Like FaaS, the other aspect of serverless architectures is BaaS, or Backend as a Service, which covers applications that rely on third-party services [6]. BaaS is used to provide a backend for applications, mostly mobile applications, and is therefore sometimes referred to as MBaaS. Its purpose is to provide an API and tools for integrating with the application backend. One key differentiator between SaaS and BaaS is that SaaS is targeted at end users whereas BaaS, as of now, is targeted at developers. Additional alluring services provided by BaaS include push notifications, social integration, and so on. The most popular BaaS offerings are Parse and Firebase. Parse, acquired by Facebook, is the most popular; it provides integration with programming languages and the services an application requires, while Firebase, acquired by Google, deals with real-time apps and also provides storage [33].

Experiments are also under way to evaluate whether serverless computing can support neural network inference, which gives a sense of how wide the boundaries of this seemingly boundary-less paradigm may become [17].

An application of serverless architecture is real-time filtering of chat comments [10]. With the server approach, each chat message is received by the app maker's server, parsed, and republished in the chat area. This works well, but only for a small number of users interacting simultaneously. With the serverless approach, function code is written to filter the chat messages and the provider, or vendor, wraps the function in a container, since containers can easily be monitored, cloned, and distributed to any number of servers. The developer routes all chat messages that are to be filtered to the provider, who in turn runs the required number of containers so that the filtering logic is not compromised at any level of scale.
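A minimal sketch of such a filtering function is shown below; the banned-word list and the event shape (a JSON object carrying user and text fields) are assumptions made for illustration, not part of any particular vendor's API.

// filterChat.js - hypothetical FaaS function that filters one chat message per invocation.
const BANNED = ['spamword1', 'spamword2']; // assumed moderation list

exports.handler = async (event) => {
  // event is assumed to carry { user: string, text: string }
  const clean = BANNED.reduce(
    (text, word) => text.replace(new RegExp(word, 'gi'), '***'),
    event.text || ''
  );
  // The platform (or a downstream service) republishes the filtered message to the chat area.
  return { user: event.user, text: clean };
};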

II. CHARACTERISTICS OF SERVERLESS PLATFORMS

The following are the characteristics of the serverless paradigm as provided by different serverless platforms. A critical analysis of these features helps developers choose the platform appropriate to their requirements [3][5][7].

2.1. Pricing model:
The pricing model varies across providers, for example with discounts based on peak or off-peak timing. In serverless computing the key component is the serverless function, and usage is billed only for the time during which serverless functions are actually executing and for the resources consumed in that time span.
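As a worked illustration of this pay-per-execution model, the snippet below computes a monthly bill from invocation count, average duration, and allocated memory; the unit prices are hypothetical placeholders for illustration only, not any vendor's actual rates.

// Hypothetical pay-per-execution bill: you are charged only while functions run.
const PRICE_PER_GB_SECOND = 0.0000166; // assumed rate, for illustration only
const PRICE_PER_REQUEST = 0.0000002;   // assumed rate, for illustration only

function monthlyCost(invocations, avgDurationSeconds, memoryMB) {
  const gbSeconds = invocations * avgDurationSeconds * (memoryMB / 1024);
  return gbSeconds * PRICE_PER_GB_SECOND + invocations * PRICE_PER_REQUEST;
}

// 3 million invocations of a 200 ms, 512 MB function in a month:
console.log(monthlyCost(3000000, 0.2, 512).toFixed(2)); // prints 5.58 (currency units)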

2.2. Programming model:
The typical programming model currently used by serverless platforms involves the execution of a main function. Functions are composed in a manner that supports cloud elasticity. Serverless functions do not maintain state while executing, which is why they are referred to as stateless functions; to retrieve or update any state of a function, the developer has to write code for that as well.

2.3. Programming languages:
Programming languages supported by serverless platforms include C#, Java, JavaScript, Python, Go, and Swift. Most platforms support more than one of these languages.

2.4. Extensibility:
Some platforms also provide extensibility; the requirement is to supply the written code as a Docker image, since Docker has a well-defined API (Application Programming Interface).

2.5. Deployment:
Deployment on serverless platforms should be as simple as possible. For deployment, developers typically provide the function source code in a file; other options include archiving the code into multiple files or packaging it as a Docker image.


2.6. Monitoring of bugs:
Print statements are the most basic way to debug on serverless platforms; their output is saved in the execution log for later reference. From the developer's perspective this is quite useful, as it helps identify blocked states, trace errors, and determine better ways of executing a function. Richer capabilities of this kind may or may not be provided by every serverless platform.
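For instance, a few print statements inside a function end up in the platform's execution log (CloudWatch in the case of AWS Lambda, per Table 1) and form a minimal but workable debugging aid; the function below is a hypothetical sketch.

// Hypothetical function instrumented with basic print-statement debugging.
exports.handler = async (event) => {
  console.log('invocation started, event:', JSON.stringify(event)); // written to the execution log
  try {
    const result = { ok: true }; // stand-in for the real business logic
    console.log('invocation finished successfully');
    return result;
  } catch (err) {
    console.error('invocation failed:', err); // error trace also lands in the log
    throw err;
  }
};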

2.7. Security:
To attain and maintain security, functions must execute in isolation from one another across users, because serverless platforms support multitenancy: multiple client organizations are served by a single server instance, much like a Software-as-a-Service (SaaS) vendor.

2.8. Complexity:
Composing functions plays a significant role in serverless platforms. The platforms offer some mechanism by which one function can be invoked from another, and some provide additional higher-level mechanisms that make it much easier to construct complex applications through the composability of functions.

2.9. Boundaries:
There are constraints that limit the performance of serverless code. These boundaries, or limits, include the number of requests handled concurrently at run time, the amount of memory available, and the CPU resources consumed when invoking a function. With an increasing number of users, these limits may be reached; some boundaries, such as the available memory size, are imposed by the platform itself and fall under the category of inherent, platform-specific drawbacks.

III. EVOLUTION OF SERVERLESS COMPUTING

Cloud computing came into existence with the "as a service" model about a decade ago [4], and serverless computing then emerged, paving the way for a new generation of Platform-as-a-Service (PaaS) [1]. As the scale of cloud computing grew, hyperscale data centers with a high degree of virtualization replaced traditional data centers. The significant increase in the use of virtualization greatly affected two aspects: a reduction in underutilization and an effective increase in manageability [15]. From here the evolution of sharing began, as virtualization enables common hardware to be shared. A server can run many virtual machines, but the concern remains that each virtual machine has to run a full copy of an operating system, since only hardware sharing is possible. OS-level virtualization then arrived with containers, through which resource sharing became possible. A container holds the entire set of components required to run a particular software program along with a minimal subset of the OS [11], which was the key requirement, but the result was still constrained by physical resources. Despite all these improvements in technology, the limitations of the infrastructure components, namely the servers, persisted. This enabled the emergence of the whole scenario that we today call 'serverless computing'.

The concept of 'serverless' already existed but was popularized by Amazon in 2014 with "Getting Started with AWS Lambda" [3]. This was the first serverless platform, and a fully featured one. Then, in 2016, Google Cloud Functions [22], Microsoft Azure Functions [32], and IBM OpenWhisk [28] arrived from their respective vendors.

The concept of serverless emerged step by step, through the adoption of virtual machines and then container technologies, with each advancement providing a further layer of abstraction in computational terms such as resource consumption, incurred cost, and development and deployment speed. Some services provide the ability to execute part of the code on the server side without managing the servers at all; this is made possible by cloud functions. One example of such a service is Facebook's Parse Cloud Code [3], but its use is limited to mobile use cases.

Server-side execution of function code may also be supported by Software-as-a-Service (SaaS), but that support is limited to the application domain. Some SaaS vendors provide the additional capability of hosting arbitrary code, allowing arbitrary function code that resides elsewhere to be invoked through an Application Programming Interface (API) call; this approach is implemented by the Google Apps Marketplace in Google Apps for Work [3]. The more this model evolves, the better the prospects it will showcase in the coming future.

IV. ARCHITECTURE OF SERVERLESS COMPUTING

Architecture describes the style of designing and the methods of depiction used to perceive a layout. In the serverless architecture [3][9], capacity decisions are handled by the platform provider, which scales server capacity up and down with the workload. This makes the computational environment abstract in nature, since the user is not aware of where or how the stateless functions execute. The key aspect of serverless architecture is event processing: an incoming event is associated with user-defined functions. The platform determines which function(s) the event should trigger, creates a new function instance or finds an existing one, sends the event to it over the Hypertext Transfer Protocol (HTTP), waits for a response, collects the execution logs, makes the response available to the user, and finally stops the function if it is no longer needed.

The major challenge is implementing this entire scenario in a serverless platform, as represented in Figure 2 [3], while also considering key factors such as cost, scalability, and fault tolerance to maintain system reliability and integrity. The platform should respond efficiently to the occurrence of events, whether that means starting a function and processing its input, queuing events, scheduling the execution of the appropriate functions according to the state of the queues and the arrival rate of events, or stopping and freeing resources when a function is idle. The platform must also handle scaling and manage failures in a cloud environment [18].

Figure 2. Architecture of a serverless platform
(The original figure depicts master, edge, and worker nodes; cloud event sources and UI (user interface) code; an API Gateway; a dispatcher and event queue; and function code, such as function main() { return { payload: "Hello World" }; }, deployed on the workers.)
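The following sketch simulates the dispatch flow described above in plain Node.js: it reuses or creates a function instance, invokes the code, records a log entry, and reaps idle instances. It is a didactic approximation of Figure 2 under simplifying assumptions, not the implementation of any real platform, which would invoke containerized functions over HTTP rather than in-process calls.

// Simplified, in-process simulation of the event-processing flow in Figure 2.
const instances = new Map(); // functionName -> { run, lastUsed }
const logs = [];

function getOrCreateInstance(name, code) {
  // Creating a missing instance corresponds to a "cold start" on a real platform.
  if (!instances.has(name)) instances.set(name, { run: code, lastUsed: 0 });
  return instances.get(name);
}

async function dispatch(eventSource, functionName, code, payload) {
  const instance = getOrCreateInstance(functionName, code);
  instance.lastUsed = Date.now();
  const response = await instance.run(payload);                     // a real platform sends the event over HTTP
  logs.push({ eventSource, functionName, at: instance.lastUsed });  // collect execution logs
  return response;                                                  // response made available to the caller
}

function reapIdle(maxIdleMs) {
  // Stop (discard) instances that have not been used recently; a real platform runs this periodically.
  for (const [name, inst] of instances) {
    if (Date.now() - inst.lastUsed > maxIdleMs) instances.delete(name);
  }
}

// Example: an API Gateway request triggers the "hello" function from Figure 2.
const hello = async () => ({ payload: 'Hello World' });
dispatch('api-gateway', 'hello', hello, {}).then(console.log);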


V. COMMERCIAL PLATFORMS

Some of the notable existing commercial serverless platforms provided by different vendors are as follows:

5.1. AWS Lambda:
AWS Lambda by Amazon [12] was the first fully featured serverless platform and supports languages such as Node.js, Java, Python, and C#. Its design considers several key factors, including incurred cost, deployment, the programming model, security, resource limits, and monitoring. Lambda functions are easy to use for handling events because of the large ecosystem of scalable AWS services; since serverless functions are stateless, this ecosystem enables the successful deployment of serverless applications, eases integration, and supports the composition of functions.
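For example, a Lambda function can react to events emitted by other AWS services. The sketch below logs the bucket and key from an S3 "object created" event; the processing step is a hypothetical placeholder, while the event field names follow the S3 event structure that Lambda receives.

// Hypothetical Lambda function triggered by S3 "object created" events.
exports.handler = async (event) => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));
    console.log(`new object uploaded: s3://${bucket}/${key}`);
    // ...process the object here (e.g. resize an image, index a document)...
  }
  return { processed: event.Records.length };
};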

5.2. Google Cloud Functions:
Google Cloud Functions [23] is the serverless platform launched by Google. It provides FaaS for running serverless functions that respond whenever an HTTP call or an event from a Google-based cloud service occurs. Serverless functions are coded in Node.js.
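An HTTP-triggered Cloud Function in the Node.js runtime receives Express-style request and response objects; the example below is a hypothetical greeting endpoint, with the parameter name chosen only for illustration.

// index.js - hypothetical HTTP-triggered Google Cloud Function (Node.js runtime).
exports.greet = (req, res) => {
  const name = req.query.name || (req.body && req.body.name) || 'World';
  res.status(200).json({ message: `Hello, ${name}!` });
};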

5.3. Microsoft Azure Functions:
Microsoft Azure Functions [32] is next in the row of serverless platforms; it integrates with Microsoft Azure services to run user-provided functions. Supported languages include C#, F#, Python, PHP, and Node.js. A key differentiator is that the runtime code is open source and available on GitHub under the MIT license. These functions also ease debugging by providing a local development experience for implementing Azure Functions from scratch.

5.4. IBM Bluemix OpenWhisk:
IBM Bluemix OpenWhisk [2] supports event-based programming and also provides composability of functions, which is required for implementing higher-level constructs. It is available on GitHub under the Apache open-source license. It supports Java, Swift, Python, and Node.js, as well as Docker containers holding arbitrary binaries, which is its key differentiator from AWS Lambda and Google Cloud Functions. IBM's cloud functions work with JavaScript on the server side as well as with Apple's currently popular Swift language [30].
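An OpenWhisk JavaScript action is simply a main function that returns a JSON object, as in Figure 2; the greeting parameter below is illustrative. Such an action is typically created and invoked with the wsk CLI (for example, wsk action create hello hello.js, then wsk action invoke hello --result --param name Annie).

// hello.js - OpenWhisk JavaScript action; `params` carries the invocation parameters.
function main(params) {
  const name = params.name || 'World';
  return { payload: `Hello, ${name}!` };
}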

Table 1 presents a parametric analysis of commercial serverless platforms. The comparison is based on the following parameters [2][12-15][19-29][32][34]:

Clouds: the cloud service used for data storage.
Programming languages: the languages supported by the platform.
Scalability: whether traffic load is scaled automatically or requires manual intervention.
Log management: the tool used to monitor the activity of functions.
Number of functions (maximum): the maximum number of functions that can be deployed per project.
Number of concurrent functions (maximum): the maximum number of times a single function can be invoked concurrently.
Duration of function (maximum): the amount of time for which a function can run before being terminated.
Consumed memory (maximum) per function: the maximum memory usage allowed for a function.
Deployment size (maximum): the maximum size of a function deployment.
Authentication: how the right to access a function is validated.

Table 1. Parametric analysis of commercial serverless platforms

| Sr. No. | Parameter | AWS Lambda | Google Cloud Functions | Microsoft Azure Functions | IBM Bluemix OpenWhisk |
|---|---|---|---|---|---|
| 1 | Clouds | AWS (Amazon Web Services) | Google Cloud | Microsoft Azure | IBM Bluemix |
| 2 | Programming languages | Node.js, Java, C#, Python | JavaScript | C#, F#, JavaScript, Node.js, PHP, Python | JavaScript, Swift (currently Apple's programming language) |
| 3 | Scalability | Automatic scaling | Automatic scaling | Manual/automatic scaling | Automatic scaling |
| 4 | Log management | CloudWatch | Google Stackdriver | Azure App Service | IBM Bluemix OpenWhisk Dashboard |
| 5 | Number of functions (maximum) | 1,500 | 1,000 | Not known | Not known |
| 6 | Concurrent functions (maximum) | 10,002 | 10,002 | Not known | 1,000 |
| 7 | Duration of function (maximum) | 300 seconds | 540 seconds | 600 seconds (300 seconds by default; can be increased to 600 seconds via the functionTimeout property) | 0.1-300 seconds |
| 8 | Consumed memory (maximum) per function | 1.5 GB | 2 GB | 1.5 GB | 512 MB |
| 9 | Deployment size (maximum) | 50 MB (compressed) per Lambda function; 250 MB (uncompressed) of code zipped into a deployment package; 75 GB total for all deployment packages uploaded per region | 100 MB (compressed) for sources; 500 MB (uncompressed) for sources and modules | Not known | 48 MB |
| 10 | Authentication | AWS account root user, AWS IAM (Identity and Access Management) users, and IAM roles | IAM (Identity and Access Management) roles | Azure App Service | API Gateway |

VI. BENEFITS OF SERVERLESS PARADIGM

Serverless computing has emerged with wide potential for managing a variety of information. The following are the major benefits of this emerging paradigm [3][9][31]:

6.1. Development Effort:
Traditionally, on IaaS platforms the development effort was spread across managing cost, latency, scalability, and elasticity. Serverless platforms are based on the concept of microservices, that is, fine-grained services, so when building a modular application the developer's effort can be concentrated on the business logic of their own code, which may also involve composing functions to implement specific application behaviour, rather than on those other factors.

6.2. Consumers:
For consumers such as developers it proves advantageous because they need not worry about deploying and managing the servers, virtual machines, or containers that form the key components of distributed services, something that was not possible with the existing paradigm.

6.3. Providers:
For providers, the stateless programming model used in serverless paradigms gives more control over the software stack and enables transparent delivery of security patches, and hence optimization of the platform. The provider performs scaling on every request or event, so in a FaaS architecture there is no need to worry about the number of concurrent requests or the memory size, unlike non-FaaS architectures, where auto-scaling still needs proper set-up and maintenance.


6.4. Reduced cost:
The serverless approach frees you from managing servers; you manage only your business logic and pay for defined services, much as you would pay someone to run your business operations for you. Initially it may seem that expenses should increase, but in reality the incurred cost is lower because the same provider is also managing resources for other people and organizations, so an economy-of-scale effect applies. The cost involves two aspects: infrastructure costs and operational/development costs. Infrastructure cost is reduced because the infrastructure, such as hardware resources, is shared with other users. Similarly, operational cost is reduced because you have outsourced the serverless system and therefore spend less time on it than on a system you develop and manage yourself. Both benefits can already be gained from Infrastructure as a Service (IaaS) or Platform as a Service (PaaS), but they are extended in two ways: Backend as a Service (BaaS) and Function as a Service (FaaS).

In BaaS, the components required to make up an entire application can be commoditized, that is, turned into an off-the-shelf service, unlike IaaS and PaaS, where only the servers and operating-system management are commoditized. For example, most applications manage an authentication process in which login succeeds when a valid username, password, and so on are entered; the Auth0 service [15] exists solely to provide ready-to-use authentication that can be integrated into our applications.

In FaaS, the major benefit is paying only for the computation you actually need. For example, when a company or team is initially trying something new, the operational cost involved is quite low, and some FaaS vendors also provide a 'free tier' when the amount of work is small. Performance optimizations made to the code to speed up an application directly reduce operational costs, according to the rules of the vendor's charging scheme. Containers, a currently popular technology, abstract individual applications from OS-level deployment, and hosted container platforms such as Amazon ECS and Google Container Engine [9] free developers from managing their server systems, much like serverless FaaS. Still, container platforms lack the transparent scaling of serverless FaaS [9]; similarly with PaaS, even if you manage to set up auto-scaling, you cannot scale at the level of individual requests [9].

6.5. Time to market:
With rapidly changing market trends, maintaining one's position requires constantly trying something new and, if it succeeds, incorporating it into existing systems. With FaaS, the time from the initial idea to deployment can in some cases be remarkably short, which is most suitable for simple functions and small-scale implementations.

6.6. Greener Computing:
In today's era of information technology the amount of data generated has increased manifold, leading to the creation of more and more data centers worldwide, and energy usage has risen significantly as a result. Large companies such as Apple and Google have emphasized the use of renewable energy sources for hosting data centers and thereby reducing the fossil-fuel impact of such sites. Still, there are many servers that sit idle while consuming power, which is the main concern. "Typical servers in business and enterprise data centers deliver between 5 and 15 percent of their maximum computing output on average over the course of the year." -- Forbes [9]. The issue, therefore, is to adequately manage the capacity of available data centers. Even with advanced IaaS or PaaS solutions we still make capacity decisions about our applications ourselves, but with the serverless approach it is the responsibility of the serverless vendor to make the capacity decisions for our application. By following this approach, environmental impact can be reduced and resources can be used more efficiently.

VII. INHERENT DRAWBACKS OF SERVERLESS COMPUTING

The following obstacles need to be addressed for this emerging computing technology to become a complete success in all aspects [9].

7.1. Vendor:
In a serverless architecture the vendor has considerable control, since you are following an outsourcing strategy; the smarter the vendor's service, the more control it can impose. Vendors support multitenancy to achieve economies of scale, in which different users run multiple instances of software on the same system, and problems occur when a failure in one user's software affects another user, or when one user is able to see another user's data. Serverless vendors also implement their platforms in totally different ways, so if you want to change vendor at some later time you may have to go through many hardships, such as updating operational tools and changing code, design, or even architecture.

7.2. Full BaaS architecture:
In this architecture, the logic is written on the client side rather than on the server side. This is fascinating for the first client platform, but the logic must be re-implemented for each subsequent platform, and if you migrate to a new database, the corresponding code change must be repeated across all of your different clients. There is also a loss of server-side optimization with a 'full BaaS' architecture: in mobile apps, for example, the 'Backend For Frontend' approach abstracts some underlying aspects within the server, which in part allows operations to be performed rapidly and reduces battery usage on the client side. This approach is not available with a 'full BaaS' architecture.

7.3. In-server state for serverless FaaS:
There is no in-server state for serverless FaaS. State can be managed with options such as a database, an external file store such as S3 (Simple Storage Service), or an out-of-process cache such as Redis, but any of these options is slower than in-memory persistence. In-memory caches are another issue to deal with: use an external cache such as Redis or memcached rather than assuming an in-process cache, because even if an in-process cache has been warmed up through use, it may be thrown away when the FaaS instance is torn down.
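A minimal sketch of this external-cache pattern is shown below, using the node-redis v4 client API as an assumption; it looks a value up in Redis before recomputing it, instead of relying on an in-process cache that may vanish when the FaaS instance is torn down. The key name and the stand-in lookup are hypothetical.

// Hypothetical FaaS handler that keeps its cache in Redis rather than in process memory.
const { createClient } = require('redis'); // node-redis v4-style API assumed

const client = createClient({ url: process.env.REDIS_URL }); // connection reused across warm invocations
const ready = client.connect();

exports.handler = async (event) => {
  await ready;
  const key = `profile:${event.userId}`;
  let profile = await client.get(key);
  if (profile === null) {
    profile = JSON.stringify({ userId: event.userId, fetchedAt: Date.now() }); // stand-in for a real lookup
    await client.set(key, profile, { EX: 300 }); // expire after 5 minutes
  }
  return JSON.parse(profile);
};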

VIII. CHALLENGES OF SERVERLESS PARADIGM

The following are some of the foreseeable key challenges faced by serverless platforms today, which may be overcome with the passage of time [3][9][15].

8.1. Gap of management and scaling:
The choice between serverless FaaS and hosted containers in terms of management and scaling narrows down to the style and type of application. FaaS may overpower containers for event-driven styles in which an application component handles only a few event types, while containers suit synchronous, request-driven components with many entry points.

8.2. Scaling to zero:
Scaling to zero refers to the situation in which no serverless code is executing, so users do not pay for that time. However, it leads to the 'cold start' problem, in which a penalty is incurred to bring the serverless code back into a ready-to-run state. Finding mechanisms that minimize the 'cold start' problem is quite a challenging task.

8.3. Cost:
Cost remains a major challenge from two perspectives. One is writing a serverless function in such a manner that resource usage is minimized whether or not it is executing. The other is that, in the current scenario, serverless functions prove economical for CPU-bound computations rather than I/O-bound ones; I/O-bound computations may be more economical on technologies such as virtual machines and containers when compared with the existing cloud computing approach.

8.4. Limitations on Resources:
Resource limits are required for the smooth functioning of serverless function code so that the platform can handle variations in load efficiently. These limits cover execution time, available memory size, bandwidth, and CPU usage. Resource limits can also be enforced in aggregate, applying across more than one function or across the entire platform.

8.5. Security Concerns:
Security is always a critical challenge on serverless platforms, where functions from many users are executing; it is essential to provide isolation between users.

8.6. Scalability:
Scaling is a crucial issue, as the platform must efficiently handle peak loads as well as periods without a single request. It becomes even more challenging because in serverless platforms these decisions must be taken with insignificant application-level knowledge.

8.7. Tools:
Traditional server-monitoring tools are no longer applicable, so new tooling approaches are required for serverless platforms; even the applications used for debugging need to change. Moreover, tools are required to create and maintain the integration of functions when coordinating a number of them.

8.8. Shorter time:
Because serverless functions run for only a short span of time, it is difficult to identify issues and bottlenecks in an application. Moreover, some applications require long-running application logic; such logic can be decomposed into smaller units and then tracked back as a single unit, to the extent that the available programming models and tools allow.

8.9. Stateless serverless functions:
It is still unclear how state should be handled for stateless serverless functions on serverless platforms; tools, programming models, and so on are needed to provide abstraction up to the required levels.

8.10. Legacy Code:
Legacy code refers to existing code that represents invaluable effort and countless days of work. One of the biggest challenges is determining to what extent existing legacy code can run in a new serverless environment, either automatically or by decomposing it into small units, so as to benefit from this emerging paradigm.


IX. CONCLUSION

Serverless platforms have emerged because of the immense speed of the data flow handled by data centers all over the world. Developers adopt this emerging technology because it provides the managed services required for day-to-day development: it reduces operational complexity, offers a cost-effective solution, and provides scaling in no time. Platforms such as AWS Lambda, Microsoft Azure, Google Cloud, IBM Bluemix OpenWhisk, Iron.io IronFunctions, Auth0 Webtask, and Galactic Fog Gestalt Laser are taking the technology to a new level at different scales today, and open-source platforms such as OpenLambda are also emerging for creating next-generation web services [11]. Its use cases include web apps, chatbots, webhook-based systems (which rely on third parties), monitoring and debugging, real-time streaming data processing, big data, and many more. It also faces many challenges, such as obtaining the state of a stateless function and whether it is as suitable for coarse-grained services as it is for fine-grained ones; many such questions are still under investigation. These challenges may well be overcome some day with more and more research in this direction, and the 'serverless' term will then reach its true potential.

REFERENCES
I. Gojko Adzic, R. C. (2017). Serverless computing: economic and architectural impact. ESEC/FSE 2017: Proceedings of the 2017 11th Joint Meeting on Foundations of Software Engineering (pp. 884-889). Paderborn: ACM Digital Library.
II. IBM. (n.d.). IBM Cloud Functions. Retrieved April 12, 2018, from www.ibm.com: https://www.ibm.com/cloud/functions
III. Ioana Baldini, P. C. (2017, June 10). Serverless Computing: Current Trends and Open Problems. 1-19. Cornell University Library.
IV. Matt Crane, J. L. (2017). An Exploration of Serverless Architectures for Information Retrieval. ICTIR '17: Proceedings of the ACM SIGIR International Conference on Theory of Information Retrieval (pp. 241-244). Amsterdam: ACM Digital Library.
V. McGrath, G. (2017). Serverless Computing: Applications, Implementation and Performance. 1-41. Notre Dame, Indiana, US.
VI. Nunns, J. (2017, December 21). Everything you need to know about Serverless Computing. Retrieved April 27, 2018, from www.cbronline.com: https://www.cbronline.com/in-depth/serverless-computing
VII. Omar Alqaryouti, N. S. (2018). Serverless Computing and Scheduling Tasks on Cloud: A Review. American Scientific Research Journal for Engineering, Technology, and Sciences (ASRJETS), 40(1), 235-247.
VIII. Optimizing Enterprise Economics with Serverless Architectures. (2017, October). 1-21. Amazon Web Services.
IX. Roberts, M. Serverless Architectures. 1-27. New York.
X. Schott, W. (2016, October 05). Five advantages of serverless architecture for every app. Retrieved April 27, 2018, from www.developer-tech.com: https://www.developer-tech.com/news/2016/oct/05/five-advantages-serverless-architecture-every-app/
XI. Scott Hendrickson, S. S.-D.-D. (2016). Serverless computation with OpenLambda. HotCloud'16: Proceedings of the 8th USENIX Conference on Hot Topics in Cloud Computing (pp. 33-39). Denver: ACM Digital Library.
XII. Amazon Web Services. (n.d.). AWS Lambda. Retrieved April 15, 2018, from docs.aws.amazon.com: https://docs.aws.amazon.com/lambda/latest/dg/welcome.html
XIII. Amazon Web Services. (n.d.). AWS Lambda FAQs. Retrieved April 15, 2018, from aws.amazon.com: https://aws.amazon.com/lambda/faqs/
XIV. Amazon Web Services. (n.d.). AWS Step Functions. Retrieved April 15, 2018, from aws.amazon.com: https://aws.amazon.com/step-functions/
XV. Theo Lynn, P. R. (2017). A Preliminary Review of Enterprise Serverless Cloud Computing (Function-as-a-Service) Platforms. IEEE 9th International Conference on Cloud Computing Technology and Science (CloudCom 2017) (pp. 162-169). Hong Kong: IEEE Xplore Digital Library.
XVI. Tran, T. H. (2017). Developing Web Services with Serverless Architecture. 1-78. Lappeenranta University of Technology, School of Business and Management.
XVII. Vatche Ishakian, V. M. (2018, February 9). Serving deep learning models in a serverless platform.
XVIII. Watts, S. (2018, January 17). What is Serverless Architecture? Serverless Architecture Explained. Retrieved April 27, 2018, from www.bmc.com: https://www.bmc.com/blogs/serverless-architecture/
XIX. Microsoft Azure. (n.d.). Azure Functions Documentation. Retrieved April 15, 2018, from docs.microsoft.com: https://docs.microsoft.com/en-us/azure/azure-functions/
XX. Microsoft Azure. (n.d.). Functions. Retrieved April 18, 2018, from azure.microsoft.com: https://azure.microsoft.com/en-us/services/functions/
XXI. Microsoft Azure. (n.d.). Functions pricing. Retrieved April 15, 2018, from azure.microsoft.com: https://azure.microsoft.com/en-us/pricing/details/functions/
XXII. Google Cloud. (n.d.). Cloud Functions Beta. Retrieved April 15, 2018, from cloud.google.com: https://cloud.google.com/functions/?hl=fr
XXIII. Google Cloud. (n.d.). Google Cloud Functions Beta. Retrieved April 15, 2018, from cloud.google.com: https://cloud.google.com/functions/
XXIV. Google Cloud. (n.d.). Google Cloud Functions Documentation. Retrieved April 15, 2018, from cloud.google.com: https://cloud.google.com/functions/docs/
XXV. Google Cloud. (n.d.). Quotas. Retrieved April 15, 2018, from cloud.google.com: https://cloud.google.com/functions/quotas
XXVI. Amazon.com Company. (2018). AWS Lambda. Retrieved April 15, 2018, from aws.amazon.com: https://aws.amazon.com/lambda/
XXVII. IBM Cloud Docs. (n.d.). Creating and invoking actions. Retrieved April 15, 2018, from console.bluemix.net: https://console.bluemix.net/docs/openwhisk/openwhisk_actions.html#openwhisk_actions
XXVIII. IBM Cloud Docs. (n.d.). System details and limits. Retrieved April 12, 2018, from console.bluemix.net: https://console.bluemix.net/docs/openwhisk/openwhisk_reference.html#openwhisk_reference
XXIX. IBM Cloud Functions. (n.d.). Monitor activity with the Dashboard. Retrieved April 12, 2018, from console.bluemix.net: https://console.bluemix.net/docs/openwhisk/openwhisk_monitoring.html#monitoring-your-openwhiskactivity-with-the-openwhisk-dashboard
XXX. Gandhi, V. (2018, March 03). A quick guide to serverless computing platforms. Retrieved April 10, 2018, from www.nagarro.com: https://www.nagarro.com/en/perspectives/a-quick-guide-to-serverless-computing-platforms
XXXI. Geoffrey C. Fox, V. I. (2017, June 5). Whitepaper. First International Workshop on Serverless Computing (WoSC), 1-22.
XXXII. GitHub. (n.d.). Azure/azure-functions-host. Retrieved April 12, 2018, from github.com: https://github.com/Azure/azure-functions-host
XXXIII. Spoiala, C. (n.d.). Cloud offering: Comparison between IaaS, PaaS, SaaS, BaaS. Retrieved April 12, 2018, from assist-software.net: https://assist-software.net/blog/cloud-offering-comparison-between-iaas-paas-saas-baas
XXXIV. Serverless. (n.d.). OpenWhisk - Credentials. Retrieved April 12, 2018, from serverless.com: https://serverless.com/framework/docs/providers/openwhisk/guide/credentials/