Step-by-Step Guide to Deploy a Microservice in GKE
Introduction
After setting up your Kubernetes cluster in Google Kubernetes Engine (GKE), the next step is to deploy microservice applications in GKE. This blog will guide you through deploying an Nginx-based microservice in your GKE cluster using Cloud Shell commands.
Step-by-Step Guide to Deploying Microservice Applications in GKE
If you haven’t created a GKE cluster yet, please refer to my previous blog for step-by-step instructions. Now, let’s get started with deploying your microservice.
1. Access Your GKE Cluster
- Navigate to the Google Kubernetes Engine dashboard.
- Locate your cluster (e.g., `my-cluster`) and click the CONNECT button.
- Next, click “RUN IN CLOUD SHELL” to open the Cloud Shell console.
![Access GKE cluster](https://i0.wp.com/techwithhuz.com/wp-content/uploads/2025/01/connect-shell-1.png?resize=1024%2C238&ssl=1)
![Run in cloud shell](https://i0.wp.com/techwithhuz.com/wp-content/uploads/2025/01/connect-shell-2.png?resize=1024%2C561&ssl=1)
2. Connect to Your Cluster
- Copy and paste the connection command from the GKE dashboard into the Cloud Shell.
- After running the command, you will see:

```
kubeconfig entry generated for my-cluster.
```

- The cluster configuration file is stored at `~/.kube/config`.
![](https://i0.wp.com/techwithhuz.com/wp-content/uploads/2025/01/connect-shell-3.png?resize=1024%2C51&ssl=1)
![](https://i0.wp.com/techwithhuz.com/wp-content/uploads/2025/01/connect-shell-4.png?resize=872%2C41&ssl=1)
3. Deploy a Microservice Application in GKE
Now that you are connected to your cluster, you can deploy an Nginx-based microservice. Follow the steps below:
Step 1: Create a Deployment
A deployment ensures that your application runs as expected. Execute the following command to create an Nginx deployment:
```
kubectl create deployment nginx --image=nginx
```
![create deployment](https://i0.wp.com/techwithhuz.com/wp-content/uploads/2025/01/create-deploymnent-1.png?resize=1024%2C47&ssl=1)
To check if the pods are running, execute:
```
kubectl get pods
```
![Deploy Microservice in GKE](https://i0.wp.com/techwithhuz.com/wp-content/uploads/2025/01/create-deploymnent-2.png?resize=730%2C61&ssl=1)
Alternatively, navigate to the Workloads section of the Kubernetes Engine console in GCP to verify the pod status.
![Deploy Microservice in GKE](https://i0.wp.com/techwithhuz.com/wp-content/uploads/2025/01/create-deploymnent-3.png?resize=1024%2C381&ssl=1)
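For more control over the deployment (replica count, labels, resource requests), you can describe it declaratively instead. The following is a minimal sketch of a manifest equivalent to the `kubectl create deployment` command above; the `app: nginx` label matches the one that `kubectl create deployment` applies by default:

```yaml
# nginx-deployment.yaml — declarative equivalent of
# `kubectl create deployment nginx --image=nginx`
apiVersion: apps/v1
kind: Deployment
metadata:
  name: nginx
spec:
  replicas: 1
  selector:
    matchLabels:
      app: nginx
  template:
    metadata:
      labels:
        app: nginx
    spec:
      containers:
        - name: nginx
          image: nginx
```

Apply it with `kubectl apply -f nginx-deployment.yaml`. Keeping manifests like this in version control makes deployments reproducible and reviewable.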
4. Expose the Deployment
To make the microservice accessible from outside the cluster, expose the deployment as a LoadBalancer service:

```
kubectl expose deployment nginx --type=LoadBalancer --port=80
```
![](https://i0.wp.com/techwithhuz.com/wp-content/uploads/2025/01/create-deploymnent-4.png?resize=1024%2C39&ssl=1)
To verify the service status, use:
```
kubectl get service
```
![](https://i0.wp.com/techwithhuz.com/wp-content/uploads/2025/01/service-2.png?resize=837%2C80&ssl=1)
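The `kubectl expose` command above can also be expressed as a manifest. A minimal sketch of the equivalent Service, assuming the default `app: nginx` label from the deployment step:

```yaml
# nginx-service.yaml — declarative equivalent of
# `kubectl expose deployment nginx --type=LoadBalancer --port=80`
apiVersion: v1
kind: Service
metadata:
  name: nginx
spec:
  type: LoadBalancer
  selector:
    app: nginx        # routes traffic to pods carrying this label
  ports:
    - port: 80        # port exposed by the load balancer
      targetPort: 80  # port the nginx container listens on
```

With `type: LoadBalancer`, GKE provisions an external load balancer and assigns the service a public IP, which may take a minute or two to appear in `kubectl get service`.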
5. Test the Microservice
Once the service is running, you can check if the microservice is accessible using the curl command:
```
curl <EXTERNAL-IP>
```
![Deploy Microservice in GKE](https://i0.wp.com/techwithhuz.com/wp-content/uploads/2025/01/curl-1.png?resize=885%2C196&ssl=1)
Replace `<EXTERNAL-IP>` with the external IP address of your service, which you can find in the service details.
6. Set Up Horizontal Autoscaling
To handle increased traffic, enable autoscaling for your deployment. Use the following command:
```
kubectl autoscale deployment nginx --max=2 --cpu-percent=70
```
![autoscale](https://i0.wp.com/techwithhuz.com/wp-content/uploads/2025/01/auto-scale-1.png?resize=1024%2C33&ssl=1)
This creates a HorizontalPodAutoscaler that targets 70% average CPU utilization and scales the deployment up to two pods when needed.
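The same autoscaler can be written as a manifest. A minimal sketch using the `autoscaling/v2` API; note that the autoscaler can only compute CPU utilization if the target pods declare CPU resource requests, which the bare `kubectl create deployment` command above does not set:

```yaml
# nginx-hpa.yaml — declarative equivalent of
# `kubectl autoscale deployment nginx --max=2 --cpu-percent=70`
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: nginx
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: nginx
  minReplicas: 1
  maxReplicas: 2
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```

You can inspect the autoscaler's current state with `kubectl get hpa`.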
Conclusion
In this guide, we successfully deployed an Nginx-based microservice to a GKE cluster. Additionally, we exposed the deployment, verified its accessibility, and configured horizontal autoscaling to handle traffic spikes.
In the next blog, we will explore advanced features and best practices for using Google Kubernetes Engine (GKE). Stay tuned for more insights!