
How to set environment variables on Docker containers via docker-compose file

Posted on: August 18th, 2022 by Olu

Hi folks,

In this post I briefly go over how to set environment variables in a Docker container that is managed by a docker-compose.yaml file.

You can use the env_file option. Suppose you have defined environment variables in a file named .env; you can then load them into a service called service_name as follows:

services:
    service_name:
        container_name: service_name
        ...
        env_file:
          - .env
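For reference, the .env file itself is just plain KEY=VALUE pairs, one per line. The variables below are made up for illustration:

```
# .env -- example values only
DATABASE_URL=postgres://db:5432/app
LOG_LEVEL=info
```

Alternatively, for a small number of variables you can set them inline under the service with the environment key instead of using a separate file.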

That’s all for now. Till next time, happy software development.

How to install Python 3.6 on MacOS 12 using pyenv

Posted on: August 17th, 2022 by Olu

Hi folks,

Here’s a quick note on how to install Python 3.6 on MacOS 12 using pyenv.

I needed to do this while working on a legacy app. Note that Python 3.6 has reached its end of life, so as a developer you should normally be using a more recent version of the language.

Here is the error I was getting:

python-build: use readline from homebrew
Installing Python-3.6.0...
python-build: use tcl-tk from homebrew
python-build: use readline from homebrew
python-build: use zlib from xcode sdk

BUILD FAILED (OS X 12.0.1 using python-build 20180424)

Inspect or clean up the working tree at /var/folders/q9/s5s1hzrd6m1_0by3tx5j6sh40000gn/T/python-build.20220815235540.47954
Results logged to /var/folders/q9/s5s1hzrd6m1_0by3tx5j6sh40000gn/T/python-build.20220815235540.47954.log

Last 10 log lines:
clang -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include   -I/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include   -std=c99 -Wextra -Wno-unused-result -Wno-unused-parameter -Wno-missing-field-initializers   -I. -I./Include -I/usr/local/opt/readline/include -I/usr/local/opt/readline/include -I/Users/olu/.pyenv/versions/3.6.0/openssl/include -I/Users/olu/.pyenv/versions/3.6.0/include  -I/usr/local/opt/readline/include -I/usr/local/opt/readline/include -I/Users/olu/.pyenv/versions/3.6.0/openssl/include -I/Users/olu/.pyenv/versions/3.6.0/include   -DPy_BUILD_CORE  -c ./Modules/pwdmodule.c -o Modules/pwdmodule.o
./Modules/posixmodule.c:8146:15: error: implicit declaration of function 'sendfile' is invalid in C99 [-Werror,-Wimplicit-function-declaration]
        ret = sendfile(in, out, offset, &sbytes, &sf, flags);
              ^
./Modules/posixmodule.c:10340:5: warning: code will never be executed [-Wunreachable-code]
    Py_FatalError("abort() called from Python code didn't abort!");
    ^~~~~~~~~~~~~
1 warning and 1 error generated.
make: *** [Modules/posixmodule.o] Error 1
make: *** Waiting for unfinished jobs....

After a lot of research, I found a solution on Stack Overflow [1]:

CFLAGS="-I$(brew --prefix openssl)/include -I$(brew --prefix bzip2)/include -I$(brew --prefix readline)/include -I$(xcrun --show-sdk-path)/usr/include" \
LDFLAGS="-L$(brew --prefix openssl)/lib -L$(brew --prefix readline)/lib -L$(brew --prefix zlib)/lib -L$(brew --prefix bzip2)/lib" \
pyenv install --patch 3.6.13 < <(curl -sSL https://github.com/python/cpython/commit/8ea6353.patch\?full_index\=1)

Hopefully this helps others. Till next time, happy software development.

 

References

1. Problems installing Python 3.6 with pyenv on Mac OS Big Sur. Stack Overflow. https://stackoverflow.com/a/68227540.

SonarQube – a tool for monitoring code quality

Posted on: August 6th, 2022 by Olu

Hi folks,

In this post I will talk about an interesting tool you can use to monitor the quality of your code base when doing software development. The tool is called SonarQube. SonarQube is an open-source platform created by SonarSource for continuous inspection of code quality. It performs automatic reviews using static analysis to detect bugs, code smells and vulnerabilities, and it supports 17 programming languages.

I really like SonarQube because it can help you write more secure and robust code. Its web user interface is also quite intuitive to use. It highlights issues and gives suggestions on how to fix them, which is pretty helpful.

You can integrate SonarQube checks into your continuous integration pipeline so that code quality is checked automatically when code is pushed or merged into certain branches of your repository. You can also set thresholds, called quality gates, for metrics such as the number of bugs or vulnerabilities allowed in a build before the build fails.
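For example, the scanner that performs these checks is typically configured through a sonar-project.properties file at the project root; the values below are placeholders, not a definitive setup:

```
# sonar-project.properties -- placeholder values
sonar.projectKey=my-project
sonar.sources=src
sonar.host.url=http://localhost:9000
sonar.login=<your-token>
```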

There is a community edition of the software, which is free. There are also other paid versions e.g. Developer, Enterprise and Data Center editions.

Thus, if your team really cares about monitoring code quality, I highly recommend SonarQube. That’s all for now. Till next time, happy software development.

References

1. SonarQube. Wikipedia. https://en.wikipedia.org/wiki/SonarQube.

2. Downloads | SonarQube. https://www.sonarqube.org/downloads/.

Yinkos Hymns Manager is launched

Posted on: July 22nd, 2022 by Olu

Hi folks,

We’re pleased to announce the launch of our latest web application, Yinkos Hymns Manager.

Yinkos Hymns Manager helps you manage your church hymns. It provides a number of very useful features, including

  • A central repository for storing audio files, e.g. recorded hymns, anthems, or other music.
  • Shared access, so a team of users can reach all hymns created by the group.
  • Records of hymns sung on specific dates.
  • Suggestions for hymns to use at future events.

You can use Yinkos Hymns Manager on any device that has a web browser, be it on a desktop, laptop, mobile phone or tablet.

To access the app, visit the Yinkos Hymns Manager website.

Find out how it works by watching the Intro Video below.

As always, we would like to know what you think about the app. Feel free to contact us.

Enjoy.

pretty-quick: A Great Code Formatting library

Posted on: July 22nd, 2022 by Olu

Hi folks,

In this post I will talk about a nice code-formatting tool for your web applications, useful if you work with JavaScript, HTML, etc. It’s called pretty-quick.

pretty-quick runs Prettier on your changed files. That is, if you use a version control system like Git and you change some files, pretty-quick will run Prettier on the changed files so that they have a nice format. Information about this library can be found here.

What excites me the most about this library is using it as a pre-commit hook. This can be done easily with Husky. A guide for this can be found in the link above.
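As a sketch, with Husky installed you can register pretty-quick as a pre-commit hook roughly like this (the --staged flag limits formatting to staged files; check the pretty-quick documentation for your Husky version's exact syntax):

```
npx husky add .husky/pre-commit "npx pretty-quick --staged"
```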

So, if you want nice consistent code formatting for your front-end web application source code, I recommend pretty-quick.

That’s all for now. Till next time, happy software development.

How to mock Python functions with Pytest

Posted on: June 8th, 2022 by Olu

Hi folks,

In this post I talk about a nice way to mock functions when developing Python applications and using Pytest.

There are many ways of doing mocking in Python. But in this post I will cover how to do it using a nice library called pytest-mock.

To use pytest-mock, first install it with pip:

 

pip install pytest-mock

 

Say you want to mock a function called is_eligible inside a module called application. You just need to use the mocker fixture provided by pytest-mock.

Your test function can then look as follows.

 

def test_some_function(mocker):
    # Replace application.is_eligible with a stub that always returns True.
    mocker.patch('application.is_eligible', return_value=True)
    assert some_func_using_is_eligible() is True
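Under the hood, mocker.patch delegates to the standard library's unittest.mock, so the same idea can be shown in a self-contained script; is_eligible and approve below are hypothetical stand-ins for the application module and the code that calls it:

```python
from unittest import mock

# Hypothetical stand-in for application.is_eligible.
def is_eligible(user_id):
    # Imagine an expensive lookup here (database, external API, ...).
    raise RuntimeError("real implementation should not run in tests")

# Hypothetical function under test: its result depends on is_eligible.
def approve(user_id):
    return "approved" if is_eligible(user_id) else "rejected"

# Patch is_eligible in this module's namespace, as mocker.patch would.
with mock.patch(f"{__name__}.is_eligible", return_value=True):
    result = approve(42)

print(result)  # approved
```

mocker.patch adds the convenience that the patch is undone automatically when the test finishes, so you don't need the with-block bookkeeping yourself.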

 

That’s all for now. Till next time, happy software development.

 

References

Mocking functions in Python with Pytest Part I. https://www.freblogg.com/pytest-functions-mocking-1

Handy SQL query for backups

Posted on: April 13th, 2022 by Olu

Hi folks,

In this post I share a quick and handy technique for backing up tables within a database. It’s the CREATE TABLE AS statement.

You can use it to create a table by copying data from an existing table e.g.

CREATE TABLE new_table
  AS (SELECT * FROM old_table);

 

You can even back up specific columns or rows that meet certain conditions e.g.

CREATE TABLE suppliers
  AS (SELECT companies.id, companies.address, categories.cat_type
      FROM companies, categories
      WHERE companies.id = categories.id
      AND companies.id > 1000);
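If you ever need to restore, an INSERT ... SELECT copies the rows back; the table names here match the first example above, and the two tables' schemas must match:

```sql
INSERT INTO old_table
  SELECT * FROM new_table;
```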

 

Note: this is a lightweight backup and shouldn’t be used as a substitute for a full backup of the entire database.

That’s all for now. Till next time, happy software development.

 

Reference

SQL: CREATE TABLE AS Statement. https://www.techonthenet.com/sql/tables/create_table2.php

How to use Kubernetes jobs to run short one-off tasks

Posted on: February 25th, 2022 by Olu

Hi folks,

In this post I discuss how to use jobs to run short one-off tasks in applications that use Kubernetes for container orchestration.

In your application, you may have certain one-off tasks, e.g. pre-populating your database with certain information. How do you go about doing this in an automated fashion?

You can do this using a Kubernetes Job.

The idea is to create an idempotent script that can populate the database as needed. Then create a Kubernetes job to run that script. This way, whenever you deploy your application, a job will be created to run the script to populate your database.
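A minimal Job manifest for this pattern could look as follows; the names, image, and command are hypothetical:

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: seed-database
spec:
  backoffLimit: 3           # retry the pod up to 3 times if the script fails
  template:
    spec:
      restartPolicy: Never  # Jobs require Never or OnFailure
      containers:
        - name: seed
          image: my-app:1.0
          command: ["python", "seed_db.py"]
```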

Why should the script be idempotent? Well, by making it idempotent, you can safely extend the job to handle new data, e.g. add more seed data for your app database. The next time the job runs, only the new data will be processed.

An advantage of performing short tasks in a separate job is that if a task fails, it’s easy to tell by looking at the status of the job, as opposed to bundling the short tasks into the main application pod. This applies if your main application is something like a web service, which is usually long-running.

In summary, you can run short one-off tasks using Kubernetes jobs. By making your tasks idempotent, you can run them safely every time you deploy your app. That’s all for now. Till next time, happy software development.

How to build and deploy application container images with Skaffold and Kaniko

Posted on: January 19th, 2022 by Olu

Hi folks,

In this post I talk about an interesting way to build your application’s Docker images and push them to your artifact repository, e.g. Artifactory. These tools are useful if you use Kubernetes to orchestrate your container deployments.

You can perform builds using a tool called Skaffold. Skaffold can be used in multiple ways: you can run a one-off build, or you can have Skaffold watch your project and automatically rebuild your Docker images when the code changes. Read more about Skaffold here.

Kaniko is a tool for building container images from a Dockerfile, inside a container or Kubernetes cluster. Kaniko doesn’t depend on a Docker daemon and executes each command in a Dockerfile completely in userspace. This allows us to easily build container images in environments where it’s not convenient to run a Docker daemon, e.g. a standard Kubernetes cluster. Read more about Kaniko here.

Using these two tools together you can have your CI server build and deploy app containers to your artifact repo when certain events occur e.g. when code is merged into your development branch.
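As a sketch, a skaffold.yaml that builds with Kaniko inside the cluster might look roughly like this; the image name and registry are placeholders, and the exact schema varies across Skaffold versions:

```yaml
apiVersion: skaffold/v2beta29
kind: Config
build:
  cluster: {}               # run the build inside the Kubernetes cluster
  artifacts:
    - image: registry.example.com/my-app
      kaniko:
        dockerfile: Dockerfile
```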

That’s all for now. Till next time. Happy software dev.

References

1. Skaffold. https://skaffold.dev/
2. Kaniko. https://github.com/GoogleContainerTools/kaniko

OpenShift

Posted on: January 15th, 2022 by Olu

Hi folks,

In this post I talk about a very interesting tool I have only come across recently. It’s called OpenShift Container Platform. It’s an on-premises platform-as-a-service built around Linux containers, orchestrated by Kubernetes, on a foundation of Red Hat Enterprise Linux. OpenShift is developed by Red Hat.

It’s an amazing tool that makes it easy to manage your applications if you use Kubernetes to orchestrate your app containers.

It provides a nice web interface that allows you to view the various Kubernetes resources for your app, like services, pods, stateful sets, etc. It also allows you to view logs, terminals, events, metrics, etc. for your various Kubernetes pods. This makes it a lot easier to administer your application and means you don’t always have to resort to CLI tools like kubectl to monitor your application resources.

OpenShift also provides a really nice CLI tool called oc that allows you to check on your Kubernetes resources from a terminal. oc provides a superset of the functionality of kubectl.
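For instance, familiar kubectl commands carry over directly; the resource names below are illustrative:

```
oc get pods              # same as: kubectl get pods
oc logs my-pod           # same as: kubectl logs my-pod
oc project my-namespace  # switch project (OpenShift's layer over namespaces)
```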

The main difference between OpenShift and vanilla Kubernetes is the concept of build-related artifacts. In OpenShift, such artifacts are first-class Kubernetes resources upon which standard Kubernetes operations can apply.

So, if you plan to use Kubernetes to manage app deployment, I recommend giving OpenShift a shot. That’s all for now. Till next time, happy software development.

References

1. OpenShift. https://en.wikipedia.org/wiki/OpenShift