Getting Started with Google Kubernetes Engine
Alan Leal
2019-05-23

By now, you’re probably aware of what containerization is and how you can use Kubernetes to manage your deployments. You will also have come across several managed Kubernetes offerings, one from each of the major cloud providers: Google, Amazon, and Microsoft.

In this article, we’re going to talk about Google Kubernetes Engine (GKE), Google Cloud’s managed Kubernetes service.

What is Google Kubernetes Engine?

The starting point for understanding GKE is the fact that Google originally built Kubernetes. For over a decade, Google has run its own services in containers, and that internal experience is what shaped Kubernetes.

Why Choose Kubernetes Google Cloud Over Amazon & Microsoft?

Google opened up Kubernetes to the public in 2014. While the software is now open source, Google has the most mature understanding of the system and its benefits. Although the offerings from Amazon (EKS) and Microsoft (AKS) are fundamentally similar, Google Kubernetes Engine (GKE) is often considered the best-optimized of the three.

Does Google Kubernetes Engine Stand out in Terms of Features?

Like Amazon’s EKS and Microsoft’s AKS, GKE is a managed service for Kubernetes instances. The terminology may differ, but with each, you’ll get node pools, automatic upgrades, clusters, the ability to implement microservices, and more. To be fair, you can’t go wrong with any one of them. However, GKE does have a few cutting-edge features that allow it to stand out from the rest.
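
To give a concrete feel for those features, here is a minimal sketch of adding a node pool with automatic upgrades to an existing cluster; the cluster name, zone, and machine type are placeholders rather than recommendations.

```bash
# Sketch: add a dedicated node pool to an existing GKE cluster and keep it on
# automatic upgrades. Cluster name, zone, and machine type are placeholders.
gcloud container node-pools create high-mem-pool \
  --cluster my-cluster \
  --zone us-central1-a \
  --machine-type n1-highmem-4 \
  --num-nodes 2 \
  --enable-autoupgrade
```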

GKE On-Prem

One example is a feature coming to GKE in the near future that will let it stand out from the pack: the ability to operate from a private data center.

To use EKS, AKS, and (today at least) GKE, you must deploy your applications to a public cloud service such as AWS, Azure, or Google Cloud Platform.

However, once GKE On-Prem graduates from beta testing, you will be able to use GKE in any cloud environment, including your on-premises data center.

For industries where moving to the public cloud is considered too risky for data security, such as banking, GKE On-Prem will be a way to leverage the benefits of Kubernetes while keeping your data within your own walls.

Istio

You can also use Istio in GKE. Istio is a service mesh that helps you manage microservices in various ways, at no additional cost.

Istio offers a capability similar to the Netflix framework Hystrix: circuit breaking, which prevents one misbehaving microservice from causing problems for the others.

Let’s say one of your services is under excessive load. Istio can stop that service from affecting the other services, protecting your application from a cascading failure.
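
To make that concrete, here is a minimal sketch of Istio’s circuit-breaking configuration applied with kubectl. The service name “reviews” and the thresholds are hypothetical examples, not values from this article.

```bash
# Sketch: a DestinationRule that caps connections to a hypothetical "reviews"
# service and ejects unhealthy pods, so an overloaded service cannot drag
# down its callers.
kubectl apply -f - <<EOF
apiVersion: networking.istio.io/v1alpha3
kind: DestinationRule
metadata:
  name: reviews-circuit-breaker
spec:
  host: reviews
  trafficPolicy:
    connectionPool:
      tcp:
        maxConnections: 100
      http:
        http1MaxPendingRequests: 10
        maxRequestsPerConnection: 1
    outlierDetection:
      consecutiveErrors: 5
      interval: 30s
      baseEjectionTime: 30s
      maxEjectionPercent: 50
EOF
```

When the “reviews” workload starts failing or piling up pending requests, Istio trips the breaker and sheds traffic instead of letting the backlog spread to upstream callers.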


Getting Started with Google Cloud Kubernetes

If you’ve decided to proceed with GKE, then you should have the following skill sets on your team:

Basic Understanding of Kubernetes

This is tougher than it seems. To build an understanding of Kubernetes, your development team must also have a good grasp of Docker, containers, images, registries, etc.
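
As a rough illustration, the sketch below walks through that workflow: building an image, pushing it to Google Container Registry, and running it on a cluster. The project ID, image name, and ports are placeholders.

```bash
# Sketch: the container workflow a team needs to be fluent in.
# "my-project", "hello-app", and the ports are placeholders.
docker build -t gcr.io/my-project/hello-app:v1 .
gcloud auth configure-docker      # allow Docker to push to Container Registry
docker push gcr.io/my-project/hello-app:v1
kubectl create deployment hello-app --image=gcr.io/my-project/hello-app:v1
kubectl expose deployment hello-app --type=LoadBalancer --port=80 --target-port=8080
```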

Knowledge of Linux

GKE runs on Linux, so your team must be comfortable with Linux, shell scripting, Bash, and related skills.

Infrastructure Automation

Your team must understand infrastructure automation, including infrastructure-as-code tools such as Terraform, a cloud-agnostic automation framework, or Google’s own Cloud Deployment Manager.
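
If your team goes the Terraform route, the day-to-day loop looks roughly like the sketch below. It assumes a main.tf that declares a google_container_cluster resource; that file is an assumption for illustration, not something defined in this article.

```bash
# Sketch: the basic Terraform workflow, assuming main.tf declares a
# google_container_cluster resource for your GKE cluster.
terraform init                  # download the Google provider
terraform plan -out=gke.tfplan  # preview the changes
terraform apply gke.tfplan      # create or update the cluster
```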

Knowledge of GCP

Today, Google Kubernetes Engine runs in Google Cloud Platform (GCP), so you should know how to deploy and manage workloads in GCP.
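
In practice, that means being comfortable with steps like the following sketch, where the project ID, cluster name, and zone are placeholders.

```bash
# Sketch: stand up a GKE cluster and point kubectl at it.
# Project ID, cluster name, and zone are placeholders.
gcloud config set project my-gcp-project
gcloud container clusters create starter-cluster --zone us-central1-a --num-nodes 3
gcloud container clusters get-credentials starter-cluster --zone us-central1-a
kubectl get nodes
```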

Overall, if you have gaps in one or several of these areas, then it will be very difficult to start with GKE. These challenges are compounded if your project is already on a compressed timeline.

These gaps are more common than people think. In some IT environments, you can’t even run Docker and containers locally due to security and other restrictions, and many organizations aren’t flexible enough to experiment with larger cloud deployments.

How to Build Capacity for Google Cloud, Kubernetes & Linux

Internal Training

You can leverage a wide range of tools to build your internal capacity for GKE, GCP, Containers, Docker, Linux, automation frameworks, etc.

You could consult Google’s own cloud training assets, which include tutorials for Kubernetes and GKE. You can also look at Coursera’s GKE course and Linux Academy.

However, the challenge with internal training is that it not only takes time but also requires buy-in from each of your key stakeholders.

For example, let’s say your development group wants to learn Docker, but your infrastructure team — which manages the provision of servers for the development group — isn’t on board. In this case, your Docker project is going to collapse very quickly.

You need a high-level executive to enforce the push, but even then, you could be looking at a turnaround time of at least six months before your team is ready for the work.

Then you will need another period, usually around six months, to implement the project. That’s at least one year from training to delivery.

Work with a Software Development Agency

You can accelerate the development process as well as train and upskill your own team by working with an outside software development agency. If doing it alone takes one year, you could easily halve that — or more — with the help of an outside partner.

At Techolution, we help our clients deliver usable, market-ready software products within weeks of project start. We augment this with workshops and other upskill programs meant to empower your software team to manage your application(s). Let’s start today!
