
Serverless 101: How to Get Serverless Started in the Enterprise

Jun 4th, 2018 10:19am
Feature image via Pixabay.
Editor’s Note: This post is the first in a multipart series exploring the basics of serverless. Check back each Monday for additional installments.

In the beginning, there was bare metal, and it was good.

Single-tenant servers were fast, reliable and secure — beholden only to their master. Verily, though, they were also cumbersome to provision and scale. The need for agility and scalability begat VMs, and cloud providers brought unto us infrastructure as a service (IaaS), and lo, self-service in the cloud was born. Upon this fertile land arose Amazon Web Services, orchestration, and infrastructure as code (IaC); then also containerization came to pass, which begat platform as a service (PaaS) architecture. And lo, all was well upon the land … except for developers crying forth in want of language-agnostic endpoints, horizontal scalability and the ability to pay for the real-time consumption of services.

In response to their pleas, at last, a great gift was bestowed upon the world: serverless computing, also often known as function as a service (FaaS) — runtimes which execute application code but do not store data. In other words, a cloud provider like AWS, Google Cloud or Microsoft Azure dynamically manages the assignment and distribution of resources.

Serverless is pay-as-you-go, based on actual consumption rather than pre-purchased services based on guesswork. This is infrastructure as it was meant to be, emerging right before our eyes in 2018.

It’s Not Moonbeams

First: The name is totally misleading. Serverless computing still requires servers. There aren’t, like, secret magical moonbeams powering everything.

The term arose because the server management and capacity planning decisions are completely hidden. Serverless is “serverless” in terms of the user/developer never needing to take care of, or even be aware of, any of the infrastructure — the servers are fully abstracted away. Serverless code can be used alongside code deployed in traditional styles, such as microservices — or, applications can be written to be purely serverless and use no provisioned servers at all.

The true value of serverless is not cost efficiency, but time efficiency.

“I like to think of it as a mini-PaaS for glue-like software,” explained Sebastien Goasguen, currently senior director of cloud technologies at Bitnami. “The real kicker in serverless comes from being able to call the functions — i.e., the glue — from an event that happens in the cloud.” For example, Goasguen described the scenario of putting an image into a storage bucket on AWS and then calling a function to resize that image. A serverless system takes that code and automatically injects it into a runtime environment (server or container), and then exposes it so that the function can be called.
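To make that concrete, here is a minimal sketch of the resize-on-upload pattern Goasguen describes, written as an AWS Lambda handler in Python. The bucket name and thumbnail size are illustrative, and the Pillow library is assumed to be bundled with the function (for example, as a Lambda layer), since it is not part of the default runtime.

    import io

    import boto3
    from PIL import Image  # assumed to be packaged with the function

    s3 = boto3.client("s3")
    THUMBNAIL_BUCKET = "my-thumbnails"  # hypothetical destination bucket
    THUMBNAIL_SIZE = (128, 128)

    def handler(event, context):
        """Triggered by an S3 ObjectCreated event; writes a resized copy."""
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]

            # Read the newly uploaded image from the source bucket.
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

            # Resize it in memory.
            image = Image.open(io.BytesIO(body))
            image.thumbnail(THUMBNAIL_SIZE)
            out = io.BytesIO()
            image.save(out, format=image.format or "PNG")
            out.seek(0)

            # Write the thumbnail; the platform provisions, scales and tears
            # down the runtime around this code automatically.
            s3.put_object(Bucket=THUMBNAIL_BUCKET, Key=key, Body=out)

The nested Records/s3/bucket/object fields match the notification payload S3 delivers to Lambda; everything else about where and how this code runs is the platform's concern.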

Well, What IS Serverless, Then?

The major difference between traditional cloud computing and serverless computing is that you — the customer needing said computing — don’t pay for unused, or even underutilized, resources. Previously, we had to anticipate capacity and resource requirements and pre-provision for them, whether in an in-house data center or in the cloud. In the image-resizing example above, that would mean spinning up a server in AWS to stand by, ready to execute the resizing service at any time. In a serverless setup, by contrast, you’re just spinning up some code execution time when, and only when, the function is called.

The serverless computing service takes your functions as input, performs logic, returns your output, and then shuts down. You are only billed for the resources used during the actual execution of those functions.
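As a toy illustration of that input-to-output lifecycle, the sketch below (Python again, with a hypothetical handler) takes an API Gateway-style HTTP event, does its small piece of logic and returns a response; outside of that invocation, nothing is running and nothing is billed.

    import json

    def handler(event, context):
        """Runs only for the duration of a single invocation."""
        name = (event.get("queryStringParameters") or {}).get("name", "world")
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"message": f"Hello, {name}"}),
        }

Between invocations there is no server to patch, keep idle or pay for.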

Pay-as-you-play, and only for resources actually consumed, is obviously a great thing. However, Goasguen and other cloud-native pros stress that the true value of serverless is not cost efficiency, but time efficiency.

Kind of Like a Time Machine?

Well, yeah, it kind of is. Or maybe more like a portal to the future, in that serverless — like other as-a-service technologies — is one of those tools that allow companies to focus on building apps that make use of bleeding-edge technologies like AI and machine learning, rather than diverting effort into forever building and then re-building the infrastructure layers necessary to keep up.

Serverless’ other time-machine power lies in shortening the time from code development to putting that code in production. It really is “here is my code, now run it” — with almost no infrastructural drag in between.

“The basic idea is a developer just writes code and pushes to a serverless service. That’s it. The rest is handled by the service,” said Chad Arimura, vice president of serverless at Oracle. Better yet, he added, dependencies like database and storage are also services folded in seamlessly under the serverless hood.

“Behind the scenes, specialized teams combined with a lot of automation are operating these systems at scale so that the developer doesn’t have to think about this stuff,” said Arimura. “It does kind of look and feel like magic and moonbeams, which is why the hype cycle is strong with serverless. Because it’s such a better experience.”
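That “write code and push it” workflow is, in practice, little more than a single API call or CLI command. Below is a hedged sketch of one way it can look programmatically, using the AWS SDK for Python (boto3); the function name, handler module, zip file and IAM role are all placeholders, and many teams would use a CLI or a framework instead.

    import boto3

    client = boto3.client("lambda")

    # Upload the zipped function code; the provider handles provisioning,
    # scaling and patching the underlying servers from here on.
    with open("function.zip", "rb") as package:
        client.create_function(
            FunctionName="resize-image",                         # hypothetical name
            Runtime="python3.7",
            Handler="handler.handler",                           # module.function
            Role="arn:aws:iam::123456789012:role/lambda-exec",   # placeholder role
            Code={"ZipFile": package.read()},
        )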

Feel the FaaS Platform Power

While Docker has simplified the packaging and dependency handling for distributed applications, and Kubernetes is helping enterprises run these apps in production, they still are not yet simple or easy to use. “Dockerfile, infrastructure details, Kubernetes manifests — all of these are still too complicated for a developer-minded audience,” said Bitnami’s Goasguen.

At its core, then, serverless computing works as a Function as a Service platform. Essentially, that’s what AWS Lambda and Google Cloud Functions are: they handle resource management, load balancing and multithreading while the devs get to just focus on their code, and enterprise orgs on their mission.

Yaron Haviv, founder and CTO of iguaz.io, a continuous data platform expressly designed to optimize performance in the cloud, walked The New Stack through the FaaS workflow:

  1. Serverless platforms take functional code — the “function” part of FaaS — plus all its dependencies (required libraries, amount of memory, properties, etc.), and build a containerized application package, usually in the form of a Docker image.
  2. When another platform service, such as object storage or a database, wants to trigger the function, or when there is an external HTTP request destined for that function, the serverless platform directs it to one of the available function microservices. If no active microservices are available, it will deploy (“cold start”) such an instance — a simplified routing sketch follows this list.
  3. The serverless platform takes care of recovering micro-services upon failure, auto-scaling to fit demand, logging and monitoring function activity and conducting a live rolling upgrade when code is modified. Someone else manages the platform services so that developers can just focus on the “function” aspect.
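To make the scheduling decision in step 2 concrete, here is a deliberately simplified Python sketch — not any real platform’s code — of how a serverless platform might route a trigger: reuse a warm instance of the function if one is idle, otherwise pay the cold-start cost of creating a new one.

    import time

    class RuntimeInstance:
        """A stand-in for a deployed copy of the user's function."""
        def __init__(self, function_name):
            self.function_name = function_name

        def run(self, event):
            return f"{self.function_name} handled {event}"

    def cold_start(function_name):
        # In a real platform this pulls the container image, boots the
        # runtime and loads the function code; here we only simulate the delay.
        time.sleep(0.5)
        return RuntimeInstance(function_name)

    warm_instances = {}  # function name -> idle, already-booted instances

    def invoke(function_name, event):
        """Route an event to a warm instance, or cold-start one first."""
        pool = warm_instances.setdefault(function_name, [])
        instance = pool.pop() if pool else cold_start(function_name)
        try:
            return instance.run(event)
        finally:
            pool.append(instance)  # keep it warm for the next event

Real platforms layer the auto-scaling, failure recovery and rolling upgrades of step 3 on top of this basic reuse-or-boot decision.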

Are There Serverless Downsides?

For all the benefits, there are some potential FaaS downsides as well. For one, experts say, cloud providers will typically spin down runtime environments that aren’t seeing much use — meaning, paradoxically, they also limit the total amount of resources available to you, introducing latency and problems for high-performance workloads. Monitoring, debugging and security can also be tricky with these cloud providers — as they would be with any cloud computing workflow — due to the fact that it all, well, runs in a public cloud that you don’t have access to or control of.

Amazon Lambda has become synonymous with serverless computing, and that is a model most people can wrap their heads around. Thus, while Lambda has blazed the serverless trail, its drawbacks have come to be perceived as the drawbacks of serverless overall: slow cold starts, slow performance, short-lived functions, and a closed set of triggers. These limitations are now assumed to apply universally to all serverless platforms, when in fact they are implementation choices. It is important to note that there are newer serverless platforms, such as Nuclio, which evolved to serve a broader set of use cases. These platforms have fewer restrictions, provide high performance, and can run in multiple clouds or even on premises.

“The geeks love it, but enterprises are still testing the water — they haven’t even gotten used to Docker/K8s, and now there is serverless,” said Haviv. “Like any new technology, it needs to ‘cross the chasm’ from the early adopters — who are typically more agile, more willing to take risks — to the masses. Who need to build trust, see proof points, address concerns like performance, security…”

Obviously, given that serverless tech is emerging and evolving practically day by day, not all aspects can be comfortably known entities. While not necessarily a downside, this factor definitely makes entire boards of directors squirm in their Aeron chairs. “Enterprises simply have to become more agile and take risks, since the digital ‘transformation’ is taking casualties in the form of incumbents who are disrupted by innovators,” Haviv pointed out, and serverless, like other as-a-service technologies, is part of that shift.

“The interesting part is that by abstracting away much of the Docker/Kubernetes complexity, serverless can be adopted faster and more easily, even though it’s the newer tech.”

How to Know if Serverless Is Right for Your Company?

It’s not about a company being a good or bad fit with serverless, say those leading the way.

“Literally every company and organization that writes any software is a good fit for serverless,” said Oracle’s Arimura. “That said, current culture and distance on the journey to ‘cloud native’ can make it tougher to adopt serverless.” In other words, if a company has zero public cloud usage, and zero experimentation internally with new projects such as Kubernetes and Docker, then serverless is not where they should be starting.

“It’s a new architecture and requires thinking differently. The easiest example is if a monolith becomes 10 microservices becomes 100 functions with independent deploy cycles and complex dependency graphs. This is where mature and robust CI/CD and automation systems come in,” said Arimura.  When paired with serverless, agility and innovation can skyrocket, but one without the other can do more harm than good, he continued.

“Which is why DevOps doesn’t just become ‘NoOps’ — that is entirely the wrong way to think about it,” said Arimura. “In fact, serverless makes DevOps even more important than ever.”

In fact, noted Bitnami’s Goasguen, most of the companies he has observed adopting serverless — AWS Lambda especially — are developer-focused organizations that were already on AWS and used it to link services together. “So chances are that, if you are not on AWS today, you don’t need serverless,” said Goasguen. “However, you should still keep it on your radar, start evaluating, identify event sources in your enterprise and see how they could be used to build complete application pipelines.”

How to Best Dip an Enterprise Toe into Serverless Water?

“There is no need to take a monolith and completely convert it to microservice and/or functions,” advised Arimura. “Nor is there need to take a company’s most important project and use that  to learn a new architecture — especially if culture is still adapting to DevOps.”

Start small, he advised. Perhaps a few automation tasks, or some marketing things, or event-driven use cases.

The New Stack’s serverless series is here to get your company started.
