Friday, May 31, 2024

Mastering Go: Part 13 - Messaging with Apache Kafka (Setup)

 Messaging

Effective communication between software components, modules, and applications is essential for building robust and efficient systems. This communication allows data exchange and interaction between various parts. There are several methods and needs for communication, both internally within a system and externally with other systems.

Messaging is a powerful approach that decouples components and systems from each other. This decoupling promotes scalability and maintainability. The concept of messaging itself has evolved from the need for asynchronous communication, where components can exchange data without needing to be available at the same time.

Asynchronous Communication

Asynchronous communication allows applications to exchange data without needing to be aware of each other's availability. This decoupled approach offers several benefits for building scalable and maintainable systems.

There are various methods and techniques for asynchronous communication between applications or services. A popular method involves using topics. In this approach, a publisher sends a message to a topic, and subscribers interested in that topic receive the message. This mechanism ensures that the publisher and subscriber don't necessarily need to be aware of each other's existence.
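The topic pattern described above can be sketched in plain Go using channels. This is a minimal, illustrative in-memory broker (the Broker type and its methods are invented for illustration, not part of any messaging library); the publisher hands a message to a topic and never learns who the subscribers are.

```go
package main

import (
	"fmt"
	"sync"
)

// Broker maps topic names to subscriber channels.
type Broker struct {
	mu   sync.Mutex
	subs map[string][]chan string
}

func NewBroker() *Broker {
	return &Broker{subs: make(map[string][]chan string)}
}

// Subscribe returns a channel that receives every message published
// to the given topic. Buffered so slow readers don't block this sketch.
func (b *Broker) Subscribe(topic string) <-chan string {
	b.mu.Lock()
	defer b.mu.Unlock()
	ch := make(chan string, 8)
	b.subs[topic] = append(b.subs[topic], ch)
	return ch
}

// Publish delivers msg to all subscribers of topic; the publisher never
// knows who, or how many, subscribers exist.
func (b *Broker) Publish(topic, msg string) {
	b.mu.Lock()
	defer b.mu.Unlock()
	for _, ch := range b.subs[topic] {
		ch <- msg
	}
}

func main() {
	b := NewBroker()
	sub := b.Subscribe("person")
	b.Publish("person", `{"firstName":"learn","lastName":"go"}`)
	fmt.Println(<-sub)
}
```

Real messaging systems add durability, ordering, and delivery guarantees on top of this basic shape.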

There are several messaging systems available; in this post, we will cover the basics of Apache Kafka.

Pub/Sub

Pub/Sub is an asynchronous messaging service commonly used to broadcast messages. To use Google Cloud Pub/Sub, we must have access to Google Cloud services. Google provides a Go client package (cloud.google.com/go/pubsub) that offers an easy way to publish and receive Pub/Sub messages. With this package, we can create a topic to publish messages and create a subscriber to receive notifications.

We can achieve similar functionality offered by Pub/Sub using open-source platforms like Apache Kafka and/or RabbitMQ. For more details on Pub/Sub, please refer to the documentation provided by Google Cloud. Since we are not using Google Cloud services in this learning series, we will not go deeply into Pub/Sub in this post.

Apache Kafka


Apache Kafka is a messaging and distributed streaming platform that facilitates publish/subscribe communication between applications and services. It is one of the most popular open-source alternatives for messaging platforms.

Kafka operates as a distributed messaging system, utilizing servers and clients to communicate between applications using the TCP protocol. Here are the key components and concepts of Kafka:

Kafka Concepts

  • Server: Kafka runs as a cluster on multiple servers. The Kafka cluster is scalable and fault-tolerant, meaning if one server fails, the other servers will take over to ensure continuous operation.
  • Client: Kafka clients allow applications to read messages from Kafka topics/servers and process them. Kafka clients are available in several programming languages including Go.
  • Topics: A topic is a named message category where messages/events are published, subscribed to, and organized.
  • Producer: The client application that publishes events/messages to Kafka topics.
  • Consumer: The client that subscribes to and consumes the messages/events from Kafka topics.

Setting Up Kafka

To use Kafka, you need to set up the Kafka environment, which can be quite complex. You need a cluster management system or platform (coordination service) to run Kafka clusters. The options include:

  • ZooKeeper: Used for managing Kafka clusters in earlier versions. It handles tasks like broker election, configuration management, and partition assignment.
  • KRaft: A newer, built-in consensus protocol that eliminates the need for a separate ZooKeeper service. KRaft replicates cluster metadata across the Kafka brokers themselves for fault tolerance.
  • Docker: Docker simplifies deployment and management by containerizing Kafka and its dependencies. Docker Compose is a helpful tool for defining and running multi-container applications, including Kafka clusters.
  • Kubernetes: Kubernetes is a container orchestration tool that can be used to manage and scale the Kafka cluster.

Exercise: Setting Up Apache Kafka

This exercise will guide you through setting up Apache Kafka and creating a message using Docker. 

Step 1: Install Docker

Option 1: Using Homebrew (macOS)

$brew install docker

Note: the Homebrew docker formula installs only the Docker CLI; you still need a container runtime such as Docker Desktop (Option 2).

Option 2: Using Docker Desktop

  1. Download the Docker Desktop installer from https://www.docker.com/products/docker-desktop/.
  2. Run the installer and follow the on-screen instructions.

Step 2: Verify Docker Installation

Open a terminal and run:

$docker -v

This command should print the installed Docker version, confirming Docker is installed.

Step 3: Docker Compose

Docker Compose is a tool for defining and running multi-container applications. It's often bundled with Docker installations. Docker Compose allows you to specify the applications and their configurations in a single file (usually named docker-compose.yml).

Step 4: Start Docker

Verify Docker Status

$docker info

This command will list detailed information about the Docker environment.

List Docker Images

$docker image ls

This command will display a list of Docker images currently available on your machine.

Step 5: Install Kafka

We'll use Docker containers to set up Kafka. Apache Kafka versions 3.7 and later offer an official Docker image, which simplifies the local setup process.
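Although this exercise uses a ZooKeeper-based setup below, the official image can run a single-node KRaft broker with no extra configuration. A minimal sketch (the image tag is an assumption; the image's built-in defaults are used), not the compose file used in this exercise:

```yaml
services:
  kafka:
    # Official Apache Kafka image (3.7+); runs a single-node KRaft broker
    # with its built-in default configuration.
    image: apache/kafka:3.7.0
    ports:
      - "9092:9092"
```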

Step 6: Create a Local Kafka Environment

To create a local Kafka environment, we'll utilize a Docker Compose configuration file (docker-compose.yml). This file defines the containers needed for the Kafka cluster to run on your local machine. I'm adding my Compose file to the project (learngo) used throughout this series (see Part 12 - Program Debugging, Profiling and Performance Evaluation).

docker-compose.yml

version: '3.8'

services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"

  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"
    environment:
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    depends_on:
      - zookeeper



Step 7: Start the Kafka Cluster

Run the following command to download the Kafka image defined in your docker-compose.yml file and start the Kafka container in the background:

$docker-compose up -d

S*******s-MacBook-Pro:learngo s**********i$ docker compose up -d

WARN[0000] /Users/s*************i/Documents/learngo/docker-compose.yml: `version` is obsolete 

[+] Running 8/8

  kafka Pulled                                                                                                                 25.6s 

    42c077c10790 Pull complete                                                                                                 11.5s 

    44b062e78fd7 Pull complete                                                                                                 11.9s 

    b3ba9647f279 Pull complete                                                                                                 11.9s 

    10c9a58bd495 Pull complete                                                                                                 16.1s 

    ed9bd501c190 Pull complete                                                                                                 16.2s 

    03346d650161 Pull complete                                                                                                 24.8s 

    539ec416bc55 Pull complete                                                                                                 24.8s 

[+] Running 2/2

  Network learngo_default    Created                                                                                            0.1s 

  Container learngo-kafka-1  Started     

Complete Commands Summary:

  • Stop Existing Containers:
    $docker-compose down
    
  • Start Kafka Service:
    $docker-compose up -d
    
  • Check Container Status:
    $docker-compose ps
    
  • View Container Logs:
    $docker-compose logs
    
  • List Running Containers:
    $docker ps
    

Step 8: Create Topics

We'll use Kafka commands to create topics where messages will be published.

a. List Running Containers:

$docker ps
S*****s-MacBook-Pro:learngo s*********i$docker ps
CONTAINER ID   IMAGE                    COMMAND                  CREATED        STATUS        PORTS                                                NAMES
385270269b49   wurstmeister/kafka       "start-kafka.sh"         24 hours ago   Up 24 hours   0.0.0.0:9092->9092/tcp                               learngo-kafka-1
752d1dcbe150   wurstmeister/zookeeper   "/bin/sh -c '/usr/sb…"   24 hours ago   Up 24 hours   22/tcp, 2888/tcp, 3888/tcp, 0.0.0.0:2181->2181/tcp   learngo-zookeeper-1

b. Access the Kafka Container:

$docker exec -it <container-id> /bin/bash
S*****s-MacBook-Pro:learngo s******i$ docker exec -it 385270269b49  /bin/bash
root@385270269b49:/# 

Here, the 385270269b49 is the container ID of your Kafka container from the previous command.

c. Create a Topic:

Run the following command to create a topic:
$kafka-topics.sh --create --topic learngo-person --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

 root@385270269b49:/# kafka-topics.sh --create --topic learngo-person --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

Created topic learngo-person.

root@385270269b49:/# 

This command creates a topic named learngo-person with one partition and a replication factor of 1 (suitable for a local development environment).

Step 9: View Topic Details

Once you've created a topic, you can view its details using the command:

$kafka-topics.sh --describe --topic learngo-person --bootstrap-server localhost:9092

This command will display information about the newly created topic, including the number of partitions, replication factor, and configuration settings.


root@385270269b49:/# kafka-topics.sh --describe --topic learngo-person --bootstrap-server localhost:9092

Topic: learngo-person   TopicId: 3gZMZ-tmTlesfCfmO9Dwng PartitionCount: 1       ReplicationFactor: 1    Configs: segment.bytes=1073741824

        Topic: learngo-person   Partition: 0    Leader: 1001    Replicas: 1001  Isr: 1001

root@385270269b49:/# 

Step 10: Produce Messages

You can write a producer application to send messages to the Kafka topic (detailed in Part 14 of this series). Alternatively, you can use the kafka-console-producer.sh tool:

$kafka-console-producer.sh --topic learngo-person --bootstrap-server localhost:9092

At the prompt, type your message and press Enter. For example:

>{"person":{"firstName":"learn","lastName":"go"}}

Example:

root@385270269b49:/# kafka-console-producer.sh --topic learngo-person --bootstrap-server localhost:9092

>{"person":{"firstName":"learn","lastName":"go"}}

>

Step 11: Consume Messages


Write a consumer application to read messages from the Kafka topic (detailed in Part 14 of this series). Again, you can use the kafka-console-consumer.sh tool. First, access the Kafka container:

$docker exec -it <container_id> /bin/bash

Replace <container_id> with the actual ID of your Kafka container.

Example:
S*****s-MacBook-Pro:learngo s******i$ docker exec -it 385270269b49  /bin/bash
root@385270269b49:/# 

Inside the container, run:

$kafka-console-consumer.sh --topic learngo-person --from-beginning --bootstrap-server localhost:9092

This command will start consuming messages from the learngo-person topic, printing them to the console.

Example:

root@385270269b49:/# kafka-console-consumer.sh --topic learngo-person --from-beginning --bootstrap-server localhost:9092

{"person":{"firstName":"learn","lastName":"go"}}

Testing and Verification

You can publish more messages from the producer terminal and observe them being consumed in the consumer terminal. Press Ctrl+C in each terminal to stop the producer and consumer.

Docker Desktop Verification

Open Docker Desktop and you should see two running containers: one for ZooKeeper and one for Kafka.



This guide demonstrated how to set up a local Kafka environment using Docker Compose, create topics, and use console tools to produce and consume messages. In the next part, we'll explore how to interact with Kafka using Go programs.

Reference


https://pkg.go.dev/github.com/segmentio/kafka-go

https://kafka.apache.org/intro

https://kafka.apache.org/quickstart

https://www.conduktor.io/kafka/how-to-install-apache-kafka-on-mac-with-homebrew/

https://www.conduktor.io/kafka/how-to-start-kafka-using-docker/

https://pkg.go.dev/cloud.google.com/go/pubsub

https://cloud.google.com/pubsub/docs/overview



Sunday, May 26, 2024

Mastering Go: Part 12 - Program debugging , Profiling and Performance Evaluation

Go has several APIs and tools available to evaluate the performance and logic of a program. These diagnostic tools fall into different categories, and their aim is to identify performance and/or logical issues in a Go program.

Profiling

Profiling is the process of identifying the expensive code blocks in a project. The pprof tool is used to visualize the data generated by profiling. Go has a standard library package, runtime/pprof, that can be used to produce profiling data; the profiler generates the data in the specific format expected by the pprof tool. Profiling data can be collected during testing via go test or from endpoints made available by the net/http/pprof package.

The pprof package contains several APIs to provide data in a specified/defined structure.

There are a few built-in profiles included in the pprof package:

  • cpu: CPU profile reports
  • heap: heap (memory allocation) profile reports
  • threadcreate: profile reports for the parts of the program that create new OS threads
  • goroutine: goroutine profile reports
  • block: profile reports of where the program blocks waiting on synchronization primitives
  • mutex: lock contention reports

Install profiling Tools

To generate and visualize the profile report we need a few tools installed. Let's first make sure all the tools required are installed.
First, install the pprof tool using the below command :

$ go install github.com/google/pprof@latest

Run the below command to make sure pprof is working:
$ pprof --version

Now, install the data visualization tool Graphviz. We can install it using Homebrew (for macOS users) or follow the instructions on this Graphviz download.

Bash command:
$ brew install graphviz

Note: If Homebrew is not already installed on your machine, either use the macOS .dmg file or install Homebrew first.

Generate profile reports

There are several ways to generate a program profile. We can generate profiles while running unit tests, by writing code to generate profiles, or by generating live profiles while running the application/services.

Examples: Generate a profile with a unit test. We are using the unit test written as part of this learning series: Part 11: Testing Go Projects.

To generate the profile using a unit test run, please use the command below:

$ go test -cpuprofile cpu.prof -memprofile mem.prof -bench .
 
After the completion of the test, two files, cpu.prof and mem.prof, will be generated. To open and diagnose these files, we need to use the pprof tool.

For any issues/errors installing pprof, refer to the appendix section of this blog.

Visualize the profile report

In Terminal:
To View the report in the terminal use the below command:

$pprof mem.prof

S*****s-MacBook-Pro:string_formater s********$ pprof mem.prof
File: string_formater.test
Type: alloc_space
Time: May 24, 2024 at 6:37am (EDT)
Entering interactive mode (type "help" for commands, "o" for options)
(pprof)

S******s-MacBook-Pro:string_formater s********i$ pprof cpu.prof
File: string_formater.test
Type: cpu
Time: May 24, 2024 at 6:37am (EDT)
Duration: 201.83ms, Total samples = 0
No samples were found with the default sample value type.
Try "sample_index" command to analyze different sample values.
Entering interactive mode (type "help" for commands, "o" for options)
(pprof)

After executing the above command, we can run several other commands to get insights from the profile. Some of these commands are top, web, etc. Additionally, pprof has several functions to generate specific profiles based on our specific needs.

In the web interface:

We can use the web interface to visualize and read the .prof file.
Command:
$ pprof -http=:8090 cpu.prof

Now, you can visualize the result in the browser using Graphviz.



Note: If you encounter any errors, please refer to the common errors section to fix them.

Writing profiling code in the program:

To write profiling code in your Go main package, you'll need to use the runtime/pprof and os packages to create CPU and memory profiles. Here's an example of how to add profiling code to your main.go file:
We can use a build tag to execute the profiling code only in debug mode.
Example
Continue from the previous exercise. Modify the main.go file and add the code below to generate CPU and memory profiles. The profiles will be generated after each execution of main. To avoid extra resource consumption in production, we can move the profiling code into a separate file and only execute it in debug mode.

package main

import (
	"encoding/json"
	"errors"
	"flag"
	"fmt"
	"io"
	dateformater "learngo/date_formater"     // import date formater package
	stringformater "learngo/string_formater" // import string formater package
	"log"
	"net/http"
	"os"
	"runtime"
	"runtime/pprof"
)

var cpuprofile = flag.String("cpuprofile", "cpu.prof", "write cpu profile to `file`")
var memprofile = flag.String("memprofile", "mem.prof", "write memory profile to `file`")

func main() {
	flag.Parse()

	// CPU profile
	if *cpuprofile != "" {
		f, err := os.Create(*cpuprofile)
		if err != nil {
			log.Fatal("could not create CPU profile: ", err)
		}
		defer f.Close() // error handling omitted for example
		if err := pprof.StartCPUProfile(f); err != nil {
			log.Fatal("could not start CPU profile: ", err)
		}
		defer pprof.StopCPUProfile()
	}

	.................. other code
	......................

	// memory profile
	if *memprofile != "" {
		f, err := os.Create(*memprofile)
		if err != nil {
			log.Fatal("could not create memory profile: ", err)
		}
		defer f.Close() // error handling omitted for example
		runtime.GC()    // get up-to-date statistics
		if err := pprof.WriteHeapProfile(f); err != nil {
			log.Fatal("could not write memory profile: ", err)
		}
	}
}

Tracing

Go provides the runtime/trace package to capture a trace of program execution. It can capture traces of individual blocks/goroutines of the code. Using the trace visualization tools, we can visualize and analyze the trace to pinpoint issues. The trace captures various execution events, e.g. GC events, heap sizes, blocking/unblocking events, and goroutine details.
Command to generate a trace for a test run.
$go test -trace=trace.out

To visualize the trace, we can use Go's built-in trace tool:
$go tool trace trace.out

After executing the trace visualize tool, we can see the trace in the browser:


Similar to writing profile reports in the program, we can start writing tracing in the program and capture all the events while executing the program for debugging purposes.
We can use different annotations to trace different kinds of data:
log - captures execution log messages in the trace
region - captures the time interval of a code region within a goroutine
task - captures traces of logical operations such as RPC and HTTP requests

Debugging

To debug a Go program, we need to set up the development environment with a Go code debugger. One of the common and popular Go debuggers is Delve. We can easily install Delve on Visual Studio Code as an extension.

Command to install Delve:
$ go install github.com/go-delve/delve/cmd/dlv@latest

We can use the terminal to start and drive Delve debugging; some commands are:

$ dlv debug - compile and start debugging the current package
(dlv) break main.go:35 - add a breakpoint at line 35 of main.go
(dlv) continue - continue execution
(dlv) step - step through the code line by line
(dlv) next - go to the next line
(dlv) print <variable name> - print the value of a variable
(dlv) clear main.go:35 - remove the breakpoint at line 35

After installing Delve in VS Code, we can execute the Go program in debug mode by selecting "Run" and then "Start Debugging". The program will start in debug mode, and we can navigate through the lines using the debugging mode panels.


Reference

https://go.dev/blog/pprof
https://go.dev/doc/diagnostics
https://pkg.go.dev/runtime/trace
https://go.dev/blog/execution-traces-2024
https://github.com/google/pprof/blob/main/doc/README.md
https://www.practical-go-lessons.com/chap-36-program-profiling

Monday, May 20, 2024

Mastering Go: Part 11 - Testing Go Projects

All programmers should consider writing unit tests while writing code. If the code is not unit test friendly, it will be tricky to write proper unit tests. For example, writing a unit test for the function below is difficult because the function doesn't use dependency injection and creates an instance of a repository inside it.

func GetPersonNames() ([]Person, error) {
	fmt.Println("Inside Get Person function")
	dbOperation := databaseLayer.NewDatabaseOperation()
	// repository
	repo := databaseLayer.NewRepository(dbOperation)
	// Query to select data from the database table
	query := `SELECT "PersonId", "FirstName", "LastName", "CreatedDate", "UserId"
FROM learngo."Person"`

	rows, err := repo.ExecuteSelect(query)
	....
	...
}

Now, modify the function to use the dependency injection pattern. Injecting the repository into the function makes the code more unit-test-friendly. This way, we can easily mock the repository and inject it into the function, allowing us to write unit tests without involving a real database connection.

func GetPersonNames(repo *databaseLayer.Repository) ([]Person, error) {
	query := `SELECT "PersonId", "FirstName", "LastName", "CreatedDate", "UserId"
FROM learngo."Person"`

	rows, err := repo.ExecuteSelect(query)
	....
	...
}


Observe the difference between the two methods above. In the first method, we can't use a mocked repository. In the second method, however, we can simply inject the mocked repository to write the unit test.
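The pattern can be boiled down to a tiny runnable sketch (the PersonRepository interface, fakeRepo type, and GreetAll function are hypothetical names invented for illustration): because the function depends only on an interface, a test can hand it a canned in-memory implementation instead of a real database connection.

```go
package main

import "fmt"

// PersonRepository abstracts data access so it can be mocked in tests.
type PersonRepository interface {
	GetNames() ([]string, error)
}

// fakeRepo is a hand-rolled mock returning canned data.
type fakeRepo struct{ names []string }

func (f fakeRepo) GetNames() ([]string, error) { return f.names, nil }

// GreetAll depends only on the interface, not on a concrete database type.
func GreetAll(repo PersonRepository) ([]string, error) {
	names, err := repo.GetNames()
	if err != nil {
		return nil, err
	}
	out := make([]string, 0, len(names))
	for _, n := range names {
		out = append(out, "Hello, "+n)
	}
	return out, nil
}

func main() {
	greetings, _ := GreetAll(fakeRepo{names: []string{"learn", "go"}})
	fmt.Println(greetings) // prints: [Hello, learn Hello, go]
}
```

A test exercises GreetAll with fakeRepo and never touches a database; libraries like gomock generate this kind of fake automatically.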

Go Frameworks and libraries for Testing

There are several libraries available in Go to help with writing unit tests and creating mocks. We will use and discuss some of them here.

gomock

gomock is a mocking framework for Go, used to generate mock implementations of Go interfaces for testing purposes.

To add the gomock library/package, please run the following command:

$ go get github.com/golang/mock/gomock

mockgen: Mock generator

mockgen is a tool to generate mock files for Go testing. It generates mock files based on the interface definitions in your Go code. To install the mockgen tool, use the following command:

$ go install github.com/golang/mock/mockgen@v1.6.0

After installing the mockgen library, you will be able to generate mock files. In this exercise, we will generate a mock file for db_repository, where we are connecting to the database and executing queries. For testing, we will use a mocked connection and result instead of the actual database.

To verify that mockgen is installed and available for use, run the below command

$ mockgen --version

If you encounter the error "mockgen: command not found," add the Go binary path to your system's PATH. Generally, the Go binaries are located in $HOME/go/bin.

Use the below steps  to update the go binary PATH:

Step 1. Open the .bashrc file in edit mode

$nano ~/.bashrc

Step 2. Add the following line to the file

export PATH=$PATH:$HOME/go/bin

Step 3:  Save the file and reload

$source ~/.bashrc

Now, mockgen is ready to generate the mock file. We will see how to generate the mock file later in the exercise.

sqlmock

sqlmock is a library built to mock SQL interactions in code. It mocks the database driver, simulates interaction with the database, and returns mocked data as defined (e.g. when executing queries, stored procedures, etc.) without connecting to an actual database, allowing the code to be tested in isolation.

We can easily integrate the sqlmock with other testing frameworks in Go like testify, gomock, testing, etc.

Command to install go-sqlmock library/package:

$go get github.com/DATA-DOG/go-sqlmock

 
testify

Testify is another popular testing framework widely used in Go testing. It provides several features and libraries to write unit tests in Go. Testify integrates with mock libraries like gomock or mockgen, enhancing its capabilities for writing reliable and maintainable tests.
Testify has several features to support writing reliable and maintainable tests:
Assertion: Testify facilitates adding assertions in unit tests to verify the actual output against the expected output.
Test Suite: It enables the grouping of tests, allowing for better organization and structuring of test code.
Mock Support: Testify works alongside popular mock tools like gomock and mockgen, providing flexibility in mocking dependencies for testing.

To install testify, use the below command
$go get github.com/stretchr/testify

After installing Testify, you'll have access to several important packages for testing:
github.com/stretchr/testify/assert: provides assertion functions for writing test assertions.
github.com/stretchr/testify/require: similar to assert, but stops test execution immediately upon failure.
github.com/stretchr/testify/mock: support for mocking dependencies in tests.
github.com/stretchr/testify/suite: enables the creation of test suites for organizing related tests.
 

testing

The Go standard library package testing provides support for automated testing of Go programs. The package works with the command $ go test, which executes the unit tests written in a package.

Code Coverage

High code coverage is essential for building software intended to run for several years. A higher code coverage ensures that the code is properly tested and covers all possible execution paths. While ideally, the coverage should be 100%, there may be some exceptions where certain parts of the code cannot be mocked or reproduced in the test environment.

Go provides built-in functionality to evaluate code coverage after running test cases. You can use the command $go test -cover to execute tests and analyze code coverage. After running this command, you will see the test execution results along with the coverage report.

This command helps developers assess the effectiveness of their tests and identify areas of the code that need more testing. By achieving higher code coverage, developers can increase confidence in the reliability and stability of their software over time.

Example: 

Continuing from Part 10 of this series (Mastering Go: Part 10 - Writing Web API in Golang), in this example we will write unit tests for the function GetAddress from the stringformater module.

Step 1: If code is not unit-test-friendly, it's hard to write unit tests for it. Therefore, before starting to write unit tests, we need to fix or rewrite such code, as explained at the beginning of this section.
Modify address_formater.go to use dependency injection, injecting the Repository into the GetAddress function.
Modify the address_handler.go file to call the GetAddress function by injecting a repository instance.
After modification, the code will look like this:
address_formater.go
package stringformater

import (
	"fmt"
	databaseLayer "learngo/db_operation" // import database operation package
)

// Address struct
type Address struct {
	HouseNumber string
	StreetName  string
	City        string
	State       string
	ZipCode     string
}

func (a *Address) Format() string {
	return a.HouseNumber + " " + a.StreetName + ", " + a.City + ", " + a.State + " " + a.ZipCode
}

// This function calls the DB operation and returns the collection of addresses
func GetAddress(repo *databaseLayer.Repository) ([]Address, error) {
	// Query to select data from the database table
	query := `SELECT "HouseNumber", "StreetName", "City", "State", "ZipCode"
FROM learngo."Address"`

	// Call the ExecuteSelect DB operation
	rows, err := repo.ExecuteSelect(query)
	if err != nil {
		return nil, err
	}

	// Initialize a slice to store Address structs
	var addresses []Address

	// Iterate over the rows
	for rows.Next() {
		var address Address
		// Scan the values into variables
		err := rows.Scan(&address.HouseNumber, &address.StreetName, &address.City, &address.State, &address.ZipCode)
		if err != nil {
			return nil, fmt.Errorf("error scanning row: %v", err)
		}

		// Append the scanned address to the slice
		addresses = append(addresses, address)
	}

	// Check for errors during iteration
	if err := rows.Err(); err != nil {
		return nil, fmt.Errorf("error iterating over rows: %v", err)
	}

	// Return the list of address structs
	return addresses, nil
}


address_handler.go
package handlers

import (
	"fmt"
	databaseLayer "learngo/db_operation"     // import database operation package
	stringformater "learngo/string_formater" // import string formater package
	"net/http"

	"github.com/gin-gonic/gin"
)

// Get the addresses from the database and respond with the list of all addresses as JSON.
func GetAddress(c *gin.Context) {
	// initialize DB operation
	dbOperation := databaseLayer.NewDatabaseOperation()

	// initialize the repository, injecting the DatabaseOperations interface
	repo := databaseLayer.NewRepository(dbOperation)

	// get addresses
	addresses, err := stringformater.GetAddress(repo)
	if err != nil {
		fmt.Println("Exception occurred while getting addresses")
		panic(err)
	}
	c.IndentedJSON(http.StatusOK, addresses)
}



Step 2:  Generate mock db operation
Note that we are using a DB connection to retrieve the address from the database. Since using the real database connection is not recommended for writing unit tests, we need to mock the DB activity.

Now, run the following command to generate a mock file for the db_operation.go file. If you have not completed the required library installation as described above, please do so before executing this command.

$mockgen -source=db_operation.go -destination=mocks/mock_db_operation.go -package=mocks

The autogenerated file content by mockgen looks like this:

// Code generated by MockGen. DO NOT EDIT.
// Source: db_operation.go

// Package mocks is a generated GoMock package.
package mocks

import (
	sql "database/sql"
	reflect "reflect"

	gomock "github.com/golang/mock/gomock"
)

// MockDatabaseOperations is a mock of DatabaseOperations interface.
type MockDatabaseOperations struct {
	ctrl     *gomock.Controller
	recorder *MockDatabaseOperationsMockRecorder
}

// MockDatabaseOperationsMockRecorder is the mock recorder for MockDatabaseOperations.
type MockDatabaseOperationsMockRecorder struct {
	mock *MockDatabaseOperations
}

// NewMockDatabaseOperations creates a new mock instance.
func NewMockDatabaseOperations(ctrl *gomock.Controller) *MockDatabaseOperations {
	mock := &MockDatabaseOperations{ctrl: ctrl}
	mock.recorder = &MockDatabaseOperationsMockRecorder{mock}
	return mock
}

// EXPECT returns an object that allows the caller to indicate expected use.
func (m *MockDatabaseOperations) EXPECT() *MockDatabaseOperationsMockRecorder {
	return m.recorder
}

// ExecuteSelect mocks base method.
func (m *MockDatabaseOperations) ExecuteSelect(query string) (*sql.Rows, error) {
	m.ctrl.T.Helper()
	ret := m.ctrl.Call(m, "ExecuteSelect", query)
	ret0, _ := ret[0].(*sql.Rows)
	ret1, _ := ret[1].(error)
	return ret0, ret1
}

// ExecuteSelect indicates an expected call of ExecuteSelect.
func (mr *MockDatabaseOperationsMockRecorder) ExecuteSelect(query interface{}) *gomock.Call {
	mr.mock.ctrl.T.Helper()
	return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "ExecuteSelect", reflect.TypeOf((*MockDatabaseOperations)(nil).ExecuteSelect), query)
}

 

After generating the mock file, the project structure should look like this:

/Users/s********i/Documents/learngo/
├── db_operation/
│   ├── db_operations.go           // Contains the DatabaseOperations interface
│   ├── db_repository.go           // Contains the Repository struct and methods
│   └── mocks/
│       └── mock_db_operations.go  // Destination for the generated mock
└── go.mod


Step 3: Add test file and test case

Now add a test file named address_formater_test.go in the string_formater folder with the following code:

package stringformater

import (
	"database/sql"
	db_repository "learngo/db_operation"
	"learngo/db_operation/mocks"
	"reflect"
	"testing"

	"github.com/DATA-DOG/go-sqlmock"
	"github.com/golang/mock/gomock"
)

// Provide correct address data for address formatting
func Test_Format(t *testing.T) {
	tests := []struct {
		name     string
		address  Address
		expected string
	}{
		{
			name: "standard address",
			address: Address{
				HouseNumber: "123",
				StreetName:  "Main St",
				City:        "Springfield",
				State:       "IL",
				ZipCode:     "62704",
			},
			expected: "123 Main St, Springfield, IL 62704",
		},
		{
			name: "address with no state",
			address: Address{
				HouseNumber: "789",
				StreetName:  "Pine St",
				City:        "Atlanta",
				State:       "",
				ZipCode:     "30303",
			},
			expected: "789 Pine St, Atlanta, 30303",
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			if got := tt.address.Format(); got != tt.expected {
				t.Errorf("Address.Format() = %v, want %v", got, tt.expected)
			}
		})
	}
}

// Unit test for the GetAddress function
func Test_GetAddress(t *testing.T) {
	ctrl := gomock.NewController(t)
	defer ctrl.Finish()

	// Get an instance of the mocked DB operations
	mockDB := mocks.NewMockDatabaseOperations(ctrl)

	// Query to execute
	query := `SELECT "HouseNumber", "StreetName", "City", "State", "ZipCode"
FROM learngo."Address"`

	// Create an instance of sqlmock to fake-execute the SQL query
	db, mock, err := sqlmock.New()
	if err != nil {
		t.Fatalf("Error occurred on creating mocked sql instance: '%s'", err)
	}
	defer db.Close()

	// Prepare data for the test
	rows := sqlmock.NewRows([]string{"HouseNumber", "StreetName", "City", "State", "ZipCode"}).
		AddRow("3800", "Market St", "Frederick", "MD", "21701").
		AddRow("4500", "Walnut St", "Chevy Chase", "MD", "21901")

	// Set the expectation for the query
	mock.ExpectQuery(query).WillReturnRows(rows)

	// Set the mock expectation for ExecuteSelect
	mockDB.EXPECT().ExecuteSelect(query).DoAndReturn(func(query string) (*sql.Rows, error) {
		return db.Query(query)
	})

	// Get a repository backed by the MOCKED DB
	repo := db_repository.NewRepository(mockDB)

	// Execute the GetAddress function with the repo instance created with the mocked DB
	addresses, err := GetAddress(repo)
	if err != nil {
		t.Fatalf("unexpected error: %v", err)
	}

	// Set the expected data for comparison
	expected := []Address{
		{HouseNumber: "3800", StreetName: "Market St", City: "Frederick", State: "MD", ZipCode: "21701"},
		{HouseNumber: "4500", StreetName: "Walnut St", City: "Chevy Chase", State: "MD", ZipCode: "21901"},
	}

	// Assert
	if !reflect.DeepEqual(addresses, expected) {
		t.Errorf("Test actual result got %v, Expected result: %v", addresses, expected)
	}
}


Take some time to understand the above code. In particular:
  • Observe how the test cases are defined in a slice of structs and looped through to run the tests for the address format function (the table-driven test pattern).
  • Observe how the gomock controller is used to create the mocked db_operations instance.
  • Observe how the mocked db_operation is injected into the db_repository to fake the database interaction inside the repository.
  • Notice that we are using the Go standard library reflect package to compare the actual and expected values in the test.
Step 4: Run test case
Executing test cases in Go is fairly simple; just run this command:
S*****-MacBook-Pro:string_formater s********i$ go test
PASS
ok      learngo/string_formater 0.251s

Step 5: Run test with code coverage

S*****-MacBook-Pro:string_formater s*******i$ go test -cover
PASS
coverage: 34.3% of statements
ok      learngo/string_formater 0.246s

Step 6: Use assertion
If you prefer assertions to comparing results via reflection, you can use the assert package provided by the testify framework. Now, modify the assertion parts of the unit test Test_GetAddress. After the modification, the code looks like this:

..........
// Execute the GetAddress function with repo instance created with mocked DB
addresses, err := GetAddress(repo)
assert.NoError(t, err, "unexpected error")
.............
.............

// Assert
assert.Equal(t, expected, addresses, "address slice mismatch")


Observe the code change above: we are now using the assert package instead of reflect to validate the test outputs. The assert package makes the tests cleaner and easier to read.

Exercise Code Link:  https://github.com/learnwithsharad/learngo/tree/sharad-AddUnitTests

Reference

https://pkg.go.dev/testing
https://pkg.go.dev/github.com/stretchr/testify@v1.9.0/assert
https://go.dev/doc/tutorial/add-a-test

