I thought macOS had it all.
I was wrong – it is starting to resemble an HPC machine

OK, I thought I could survive purely on what macOS provides me with. I was so wrong. My MPICH installations are starting to look, oh so, HPC-like.

Now, let me introduce you to my most recent MPICH installation ;)


OpenCoarrays using homebrew – I guess you have to be lucky to get it working :(

Yet another way of using OpenCoarrays is to work with homebrew. However, there is one small issue: OpenCoarrays may not work with OpenMPI. You can find a description of the issue here: https://github.com/sourceryinstitute/OpenCoarrays/issues/625, and here is the original post on StackOverflow: “Why does coarray with allocatable component creates segmentation faults when accessed from different image?”.

If you want to run it via homebrew, you have to follow these steps:

> cd ~/opt
> mkdir homebrew
> curl -L https://github.com/Homebrew/brew/tarball/master | tar xz --strip 1 -C homebrew
> cd homebrew
> ./bin/brew cleanup
> ./bin/brew install gcc
> ./bin/brew install opencoarrays

Once you have it, you can try it with a sample code (make sure to put it inside ~/tmp/hello.f90).

program hello_world
implicit none
  write(*,*) 'Hello world from ', &
   this_image() , 'of', num_images()
end program hello_world

and … it doesn’t work as expected :(

> cd ~/tmp
> ~/opt/homebrew/bin/caf -o hello hello.f90
ld: warning: directory not found for option '-L/Users/.../opt/homebrew/Cellar/open-mpi/4.0.1_2/lib'

The issue here is that, for some reason (I don’t know why, yet), the caf wrapper tries to use an incorrect location for the MPI libraries. A look at the symlinks inside ~/opt/homebrew/lib shows which version is actually installed:

|-- cmake
|   `-- opencoarrays -> ../../Cellar/opencoarrays/2.8.0/lib/cmake/opencoarrays
|-- gcc -> ../Cellar/gcc/9.2.0_3/lib/gcc
|-- openmpi -> ../Cellar/open-mpi/4.0.2/lib/openmpi
|-- pkgconfig
`-- pmix -> ../Cellar/open-mpi/4.0.2/lib/pmix

If you play with it a little bit, you can get it working. Just make sure to make caf happy when it comes to the location of the libraries provided by OpenMPI – add a symlink that points the version caf expects at the one actually installed.

> cd ~/opt/homebrew/Cellar/open-mpi/
> ln -s 4.0.2 4.0.1_2

And now, let’s try it again

> ~/opt/homebrew/bin/caf -o hello hello.f90
> ~/opt/homebrew/bin/cafrun -np 2 ./hello
 Hello world from            1 of           2
 Hello world from            2 of           2

Success! But hey! Not so fast! Let’s try the initial sample code from the issue submitted on StackOverflow.

program Main
    implicit none

    type :: Array_Type
        double precision, dimension(:), allocatable :: values
    end type
    type(Array_Type), codimension[*] :: array

    sync all

    print *, this_image(), array[1]%values(:)
    sync all
end program

Save it inside ~/tmp/coarrays_error.f90.

> ~/opt/homebrew/bin/caf -o coarrays_error coarrays_error.f90
> ~/opt/homebrew/bin/cafrun -np 2 ./coarrays_error
           1  -1.4916681462400413E-154  -1.4916681462400413E-154
           2  -1.4916681462400413E-154  -1.4916681462400413E-154
[pi:54179] *** An error occurred in MPI_Win_detach
[pi:54179] *** reported by process [2740256769,1]
[pi:54179] *** on win rdma window 5
[pi:54179] *** MPI_ERR_OTHER: known error not in list
[pi:54179] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[pi:54179] ***    and potentially your MPI job)
Error: Command:
   `~/opt/homebrew/bin/mpiexec -n 2 ./coarrays_error`
failed to run.

Oops! Note that this error doesn’t occur when you use MPICH.

I tried to apply the approach presented here: https://github.com/sourceryinstitute/OpenCoarrays/issues/625, but I failed :( At some point I get a compilation error.

./bin/brew install --build-from-source --cc=gcc-9 opencoarrays
Warning: You passed `--cc=gcc-9`.
You will encounter build failures with some formulae.
Please create pull requests instead of asking for help on Homebrew's GitHub,
Discourse, Twitter or IRC. You are responsible for resolving any issues you
experience while you are running this unsupported configuration.
                 from /Applications/Xcode.app/.../Headers/OSServices.h:29,
                 from /Applications/Xcode.app/.../Headers/IconsCore.h:23,
                 from /Applications/Xcode.app/.../Headers/LaunchServices.h:23,
                 from /Applications/Xcode.app/.../Headers/CoreServices.h:39,
                 from gutils.c:127:
/Applications/Xcode.app/.../Headers/Authorization.h:193:7: error: variably modified 'bytes' at file scope
  193 |  char bytes[kAuthorizationExternalFormLength];
      |       ^~~~~
make[6]: *** [libglib_2_0_la-gutils.lo] Error 1
make[5]: *** [all-recursive] Error 1
make[4]: *** [all] Error 2
make[3]: *** [all-recursive] Error 1
make[2]: *** [all] Error 2
make[1]: *** [all-recursive] Error 1
make: *** [all] Error 2

Do not report this issue to Homebrew/brew or Homebrew/core!

I guess the reason is that I am trying to use an incorrect version of gcc.

> ./bin/gcc-9 --version
gcc-9 (Homebrew GCC 9.2.0_3) 9.2.0

Unfortunately, it’s not possible to install mpich with gcc@8, because it requires gcc@9 – and the loop goes on :(

> ./bin/brew install gcc@8
> ./bin/brew install mpich
==> Installing dependencies for mpich: gcc
==> Installing mpich dependency: gcc
==> Downloading https://ftp.gnu.org/gnu/gcc/gcc-9.2.0/gcc-9.2.0.tar.xz

Please note that I am not quite a fan of homebrew. Personally, I still prefer to build things myself: Building OpenCoarrays on macOS – everything from the sources. However, I really appreciate that the tool exists. It is a huge help for all the people who don’t like to play with sources.


Learning coarrays on macOS using a Docker based installation

Installing the whole toolchain that lets you play with coarrays might be challenging. It requires lots of small steps, and at least some proficiency with source based installations. Take a look here if you are brave enough to install everything from sources: Building OpenCoarrays on macOS – everything from the sources.

If you are not brave enough, or if you don’t have a few hours to spare, you can use a Docker based installation. Note that it’s not a solution for running code natively on macOS. It’s rather a Fortran sandbox for fooling around with coarray based coding.

First of all, you need Docker.

You can install it from here: Download Docker. Then, you have to clone the repository https://github.com/mkowsiak/coarrays-docker, build the Docker image, and start a container.

> git clone https://github.com/mkowsiak/coarrays-docker.git
> cd coarrays-docker
> docker build -t coarrays .
> docker run -i -t coarrays

Building OpenCoarrays on macOS – everything from the sources

There are quite a lot of prerequisites for this article, but two are most important. You will need: a huge cup of coffee/tea and something to do between the various compilation steps (a good book would be handy, maybe some movie on Netflix).

There are a few steps you have to follow if you want to have everything built from sources.


1. GCC

This part of the tutorial is heavily based on this article – in fact, I have stolen the whole idea: https://solarianprogrammer.com/2019/10/12/compiling-gcc-macos/. However, I have changed it a little bit to keep everything local – I want things to be installed inside my $HOME/opt. If you prefer having gcc installed system wide, proceed with the steps from the original article.

First of all, make sure to create a place where all the sources and all the compiled code will go. Note that on macOS I am strongly against installing stuff inside /usr. First of all, you have no idea when and what Apple will decide to do with central locations – they can protect them, remove them, rename them, do whatever they want. Second, I prefer to keep everything close to my $HOME so I can easily move things around and pick an arbitrary location for the stuff I build.

> mkdir -p $HOME/opt/usr/local
> mkdir -p $HOME/opt/src

Download gcc.

> cd $HOME/opt/src
> mkdir gcc
> cd gcc
> curl -L -O https://ftpmirror.gnu.org/gcc/gcc-9.2.0/gcc-9.2.0.tar.xz
> tar xf gcc-9.2.0.tar.xz

Get all the dependencies.

> cd gcc-9.2.0
> contrib/download_prerequisites

Now, this is the step I have stolen from the page mentioned above. We have to create an artificial system root. The reason is that Apple removed /usr/include and moved it inside the Xcode based platforms. Note that I prefer to create everything inside my $HOME/opt.

> mkdir -p $HOME/opt/usr/local/gcc_system_root
> cd $HOME/opt/usr/local/gcc_system_root
> ln -s /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/Library .
> ln -s /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/System .
> mkdir usr && cd usr
> ln -s /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/bin .
> ln -s /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib .
> ln -s /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/libexec .
> ln -s /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/share .
> cp -r /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include include

Now, we have to alter the file Availability.h. Apply this patch – save it inside your $HOME/Availability.h.patch.

--- /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include/Availability.h	2019-11-07 07:07:28.000000000 +0100
+++ Availability.h	2020-01-18 12:17:58.000000000 +0100
@@ -307,6 +307,12 @@
     #define __OSX_AVAILABLE_BUT_DEPRECATED_MSG(_osxIntro, _osxDep, _iosIntro, _iosDep, _msg)

+    #define __OSX_AVAILABLE_STARTING(_osx, _ios)
+    #define __OSX_AVAILABLE_BUT_DEPRECATED(_osxIntro, _osxDep, _iosIntro, _iosDep)
+    #define __OSX_AVAILABLE_BUT_DEPRECATED_MSG(_osxIntro, _osxDep, _iosIntro, _iosDep, _msg)

 #if defined(__has_feature)
   #if __has_feature(attribute_availability_with_message)

Apply it the following way:

> patch -b $HOME/opt/usr/local/gcc_system_root/usr/include/Availability.h $HOME/Availability.h.patch

Now, we can configure the build of gcc.

> cd $HOME/opt/src/gcc/gcc-9.2.0
> mkdir build && cd build
> ../configure --prefix=$HOME/opt/usr/local/gcc-9.2.0 \
               --enable-checking=release \
               --enable-languages=c,c++,fortran \
               --disable-multilib \
               --with-sysroot=$HOME/opt/usr/local/gcc_system_root
And, once it’s finished, we can build it and install it

> make
> make install

Eventually, you can test it

> cd $HOME
> printf '#include <stdio.h>\nint main() { printf("Hello from gcc-9.2.0\\n"); return 0; }\n' | \
  $HOME/opt/usr/local/gcc-9.2.0/bin/g++-9.2.0 -x c -o hello -; ./hello

If you want, you can update your PATH variable so that it points to this location:

export PATH=$HOME/opt/usr/local/gcc-9.2.0/bin:${PATH}



2. MPICH

You need MPICH for OpenCoarrays. Note that MPICH serves as the transport layer for the OpenCoarrays project (it is said to work with OpenMPI as well).

> mkdir -p $HOME/opt/src/mpich
> cd $HOME/opt/src/mpich
> curl -L -O https://www.mpich.org/static/downloads/3.3.1/mpich-3.3.1.tar.gz
> tar zxf mpich-3.3.1.tar.gz
> cd mpich-3.3.1
> export CC=$HOME/opt/usr/local/gcc-9.2.0/bin/gcc-9.2.0
> export CXX=$HOME/opt/usr/local/gcc-9.2.0/bin/g++-9.2.0
> export FC=$HOME/opt/usr/local/gcc-9.2.0/bin/gfortran-9.2.0
> export F77=$HOME/opt/usr/local/gcc-9.2.0/bin/gfortran-9.2.0
> ./configure --prefix=$HOME/opt/usr/local/mpich-3.3.1
> make
> make install

Once you have it installed, you can test it with a simple program.

program main
  use mpi

  integer error, id, p

  call MPI_Init ( error )
  call MPI_Comm_size ( MPI_COMM_WORLD, p, error )
  call MPI_Comm_rank ( MPI_COMM_WORLD, id, error )
  write (*,*) 'Hello: ', id, '/', p
  call MPI_Finalize ( error )
end program main

Save it inside $HOME/tmp/hello.f90 and build it.

> cd $HOME/tmp
> $HOME/opt/usr/local/mpich-3.3.1/bin/mpif90 -o hello hello.f90
> $HOME/opt/usr/local/mpich-3.3.1/bin/mpirun -np 4 ./hello
 Hello:            0 /           4
 Hello:            1 /           4
 Hello:            3 /           4
 Hello:            2 /           4

You are done with MPICH. We can now move to OpenMPI.


3. OpenMPI

OpenCoarrays can also use OpenMPI as the transport layer. Let’s build OpenMPI.

> mkdir -p $HOME/opt/src/openmpi
> cd $HOME/opt/src/openmpi
> curl -L -O https://download.open-mpi.org/release/open-mpi/v4.0/openmpi-4.0.2.tar.gz
> tar zxf openmpi-4.0.2.tar.gz
> cd openmpi-4.0.2
> export CC=$HOME/opt/usr/local/gcc-9.2.0/bin/gcc-9.2.0
> export CXX=$HOME/opt/usr/local/gcc-9.2.0/bin/g++-9.2.0
> export FC=$HOME/opt/usr/local/gcc-9.2.0/bin/gfortran-9.2.0
> export F77=$HOME/opt/usr/local/gcc-9.2.0/bin/gfortran-9.2.0
> ./configure --prefix=$HOME/opt/usr/local/openmpi-4.0.2
> make
> make install

Once you have OpenMPI installed, you can test it with a simple program.

program main
  use mpi

  integer error, id, p

  call MPI_Init ( error )
  call MPI_Comm_size ( MPI_COMM_WORLD, p, error )
  call MPI_Comm_rank ( MPI_COMM_WORLD, id, error )
  write (*,*) 'Hello: ', id, '/', p
  call MPI_Finalize ( error )
end program main

Save it inside $HOME/tmp/hello.f90 and build it.

> cd $HOME/tmp
> $HOME/opt/usr/local/openmpi-4.0.2/bin/mpif90 -o hello hello.f90
> $HOME/opt/usr/local/openmpi-4.0.2/bin/mpirun -np 4 ./hello
 Hello:            0 /           4
 Hello:            1 /           4
 Hello:            3 /           4
 Hello:            2 /           4

You are done with OpenMPI. We can now move to CMake – it is needed by OpenCoarrays.


4. CMake

You need CMake in order to build OpenCoarrays. And, again, I prefer to have it installed my way.

> mkdir -p $HOME/opt/src/cmake
> cd $HOME/opt/src/cmake
> curl -L -O https://github.com/Kitware/CMake/releases/download/v3.16.1/cmake-3.16.1.tar.gz
> tar zxf cmake-3.16.1.tar.gz
> cd cmake-3.16.1
> ./bootstrap --prefix=$HOME/opt/usr/local/cmake-3.16.1
> make
> make install

You can test whether it works the following way:

> $HOME/opt/usr/local/cmake-3.16.1/bin/cmake --version
cmake version 3.16.1

CMake suite maintained and supported by Kitware (kitware.com/cmake).


5. OpenCoarrays – MPICH flavour

Now, let’s install OpenCoarrays with MPICH as the transport layer.

> mkdir -p $HOME/opt/usr/local/OpenCoarrays
> cd $HOME/opt/usr/local/OpenCoarrays
> curl -L -O \
> tar zxf OpenCoarrays-2.8.0.tar.gz
> cd OpenCoarrays-2.8.0
> mkdir opencoarrays-build-mpich
> cd opencoarrays-build-mpich
> $HOME/opt/usr/local/cmake-3.16.1/bin/cmake $HOME/opt/usr/local/OpenCoarrays/OpenCoarrays-2.8.0 \
-DMPI_Fortran_COMPILER=$HOME/opt/usr/local/mpich-3.3.1/bin/mpif90 \
-DMPI_C_COMPILER=$HOME/opt/usr/local/mpich-3.3.1/bin/mpicc \
-DMPIEXEC_EXECUTABLE=$HOME/opt/usr/local/mpich-3.3.1/bin/mpiexec \
-DCMAKE_INSTALL_PREFIX=$HOME/opt/usr/local/OpenCoarrays-2.8.0-mpich-3.3.1
> make
> make test
> make install


6. OpenCoarrays – OpenMPI flavour

Now, let’s install OpenCoarrays with OpenMPI as the transport layer.

> mkdir -p $HOME/opt/usr/local/OpenCoarrays
> cd $HOME/opt/usr/local/OpenCoarrays/OpenCoarrays-2.8.0
> mkdir opencoarrays-build-openmpi
> cd opencoarrays-build-openmpi
> $HOME/opt/usr/local/cmake-3.16.1/bin/cmake $HOME/opt/usr/local/OpenCoarrays/OpenCoarrays-2.8.0 \
-DMPI_Fortran_COMPILER=$HOME/opt/usr/local/openmpi-4.0.2/bin/mpif90 \
-DMPI_C_COMPILER=$HOME/opt/usr/local/openmpi-4.0.2/bin/mpicc \
-DMPIEXEC_EXECUTABLE=$HOME/opt/usr/local/openmpi-4.0.2/bin/mpiexec \
-DCMAKE_INSTALL_PREFIX=$HOME/opt/usr/local/OpenCoarrays-2.8.0-openmpi-4.0.2
> make
> make test
> make install


7. First coarray Fortran code

The time is now :) You can run your very first coarray based code.

program hello_world
implicit none
  write(*,*) 'Hello world from ', &
   this_image() , 'of', num_images()
end program hello_world

Save this file inside $HOME/tmp/hello_world.f90.

Let’s run it with MPICH based OpenCoarrays

> cd $HOME/tmp
> $HOME/opt/usr/local/OpenCoarrays-2.8.0-mpich-3.3.1/bin/caf -o hello_world hello_world.f90
> $HOME/opt/usr/local/OpenCoarrays-2.8.0-mpich-3.3.1/bin/cafrun -np 2 ./hello_world
 Hello world from            1 of           2
 Hello world from            2 of           2

Now, let’s run it with OpenMPI based OpenCoarrays

> cd $HOME/tmp
> $HOME/opt/usr/local/OpenCoarrays-2.8.0-openmpi-4.0.2/bin/caf -o hello_world hello_world.f90
> $HOME/opt/usr/local/OpenCoarrays-2.8.0-openmpi-4.0.2/bin/cafrun -np 2 ./hello_world
 Hello world from            1 of           2
 Hello world from            2 of           2

You did it!! You now have a macOS based environment for coarray based development in Fortran :)

Passing a string as #define and resolving it as a string

Sometimes you need to pass a string as a preprocessor variable, so you can make some choices during the compilation phase. If you try to pass a string as a preprocessor variable and then use it inside a C string literal, it will fail.

#include <stdio.h>

#ifndef VALUE
#error VALUE is not defined
#endif

int main(int argc, char **argv) {
  const char* str = "VALUE";
  printf("%s\n", str);
  return 0;
}


And, of course, I learned that the hard way. It’s simple: string literals are not touched by the preprocessor.

> gcc -DVALUE=hi_there -o main ./main.c
> ./main
VALUE

The only way to solve it is to use Stringification.

What you eventually have to do to make it work is the following:

#include <stdio.h>

#ifndef VALUE
#error VALUE is not defined
#endif

#define string_value_(x) #x
#define string_value(x) string_value_(x)

int main(int argc, char **argv) {
  const char* str = string_value(VALUE);
  printf("%s\n", str);
  return 0;
}


and you get what you wanted

> gcc -DVALUE=hi_there -o main ./main.c
> ./main
hi_there

This one will still fail for values containing `:`. E.g. http://my.address will get truncated, because everything after `//` is treated as a comment by the preprocessor:

> gcc -DVALUE=http://my.address -o main ./main.c
> ./main
http:

In order to fix that, we have to change the macro a little bit, as well as the way we pass the variable.

#include <stdio.h>

#ifndef VALUE
#error VALUE is not defined
#endif

#define string_value_(x) x
#define string_value(x) string_value_(x)

int main(int argc, char **argv) {
  const char* str = string_value(VALUE);
  printf("%s\n", str);
  return 0;
}


and we have to make sure to pass the value surrounded with escaped quotes (\"). In most shells, single quotes work as well: '-DVALUE="http://my.address"'.

> gcc -DVALUE=\"http://my.address\" -o main ./main.c
> ./main
http://my.address

X2Go issue on macOS Catalina – X2Go can’t start XQuartz

If you are running X2Go and you get the following error

It means that something is wrong with your XQuartz installation. First of all, it might be that you haven’t installed XQuartz at all. In that case, take a look here: https://www.xquartz.org/releases/.

Another possibility is that the issue is related to some dangling connections, and the real problem lies somewhere other than a missing XQuartz. If you open the Console application, you may find a crash report for the X11.bin application. Something like this:

Process:               X11.bin [1603]
Path:                  /Applications/Utilities/XQuartz.app/Contents/MacOS/X11.bin
Identifier:            org.macosforge.xquartz.X11
Version:               2.7.11 (2.7.112)
Code Type:             X86-64 (Native)
Parent Process:        x2goclient [1599]
Responsible:           x2goclient [1599]
User ID:               501

Date/Time:             2020-01-16 11:15:33.932 +0100
OS Version:            Mac OS X 10.15.2 (19C57)
Report Version:        12
Bridge OS Version:     4.2 (17P2551)
Anonymous UUID:        3B3D6D71-2262-1293-1753-FEEB93B6A0D3

Time Awake Since Boot: 12000 seconds

System Integrity Protection: enabled

Crashed Thread:        5

Exception Type:        EXC_CRASH (SIGABRT)
Exception Codes:       0x0000000000000000, 0x0000000000000000
Exception Note:        EXC_CORPSE_NOTIFY

Application Specific Information:
X.Org X Server 1.18.4 Build Date: 20161025
Cannot establish any listening sockets - Make sure an X server isn't already running
abort() called

It means that you have to clean up X11’s lock files and sockets. Simply remove them:

> sudo rm -rf /tmp/.X11-unix

Some people advise reinstalling a different version of XQuartz – e.g. XQuartz-2.7.7. Well, it will look like reinstalling the older version solved your problem, but in fact you have simply applied a fix that works by accident. It’s simple: when you install XQuartz, it cleans things up and you get a fresh start.

Embedding JVM inside macOS application bundle using XCode

This time, I am showing you how to embed a JVM inside a macOS application bundle. In this 5 minute long video, I guide you step by step through creating an application that contains a JVM. Please note that various JVM releases (depending on the vendor) may have different requirements and limitations when it comes to embedding. In this video, I am using OpenJDK.

You need no Maven to run JUnit 5 tests

Whenever I see a presentation, article, or tutorial related to JUnit 5 based tests, there is Maven as well. I am not quite a fan of Maven. Anyway, I think it’s worth remembering that you don’t need it to run tests. You don’t have to create a complex pom.xml file. You can resort to just a few lines of code. All you have to do is use make.

Let’s say I have the following directory structure

|-- Makefile
`-- src
    |-- main
    |   `-- sample
    |       `-- Main.java
    `-- test
        `-- sample
            `-- MainTest.java

and my files look like this

package sample;

public class Main {
  public int add(int a, int b) {
    return a + b;
  }
}

I want to run a test that will make sure it’s valid. I can prepare something like this

package sample;

import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;

class MainTest {

  @DisplayName("2 + 2 = 4")
  @Test
  void twoPlusTwoEqualsFour() {
    Main m = new Main();
    assertEquals(4, m.add(2,2), "Something went wrong!");
  }
}


I will also need a very simple Makefile

assert = $(if $2,$(if $1,,$(error $2)))

JUnit.ver  = 1.5.2
JUnit.jar  = junit-platform-console-standalone-$(JUnit.ver).jar
Maven.http = http://central.maven.org/maven2/org/junit/platform/junit-platform-console-standalone
JUnit.mvn  = $(Maven.http)/$(JUnit.ver)/$(JUnit.jar)

all: check-variable test

check-variable:
	$(call assert,$(JAVA_HOME),JAVA_HOME is not defined)

dirs:
	-mkdir -p target
	-mkdir -p lib

compile-java: dirs
	$(JAVA_HOME)/bin/javac -d target/classes src/main/java/sample/*.java

junit-download: dirs
	curl -s -z lib/$(JUnit.jar) \
          -o lib/$(JUnit.jar) \
          $(JUnit.mvn)

compile-test: compile-java junit-download
	$(JAVA_HOME)/bin/javac -d target/test-classes \
          -cp lib/$(JUnit.jar):target/classes \
          src/test/java/sample/*.java

test: compile-test
	$(JAVA_HOME)/bin/java -jar lib/$(JUnit.jar) \
           --class-path target/classes:target/test-classes \
           --scan-class-path

clean:
	-rm -rf lib
	-rm -rf target

and then, I can simply run make

> make
Thanks for using JUnit! Support its development at https://junit.org/sponsoring

├─ JUnit Jupiter ✔
│  └─ MainTest ✔
│     └─ 2 + 2 = 4 ✔
└─ JUnit Vintage ✔

Test run finished after 153 ms
[         3 containers found      ]
[         0 containers skipped    ]
[         3 containers started    ]
[         0 containers aborted    ]
[         3 containers successful ]
[         0 containers failed     ]
[         1 tests found           ]
[         0 tests skipped         ]
[         1 tests started         ]
[         0 tests aborted         ]
[         1 tests successful      ]
[         0 tests failed          ]

You can download the sample code here

Google Search Console, site map file and WordPress

If you want to generate a site map of your WordPress based blog, it’s simple and hard at the same time.

It’s hard, because it’s problematic to find a good plugin that can generate the site map for you.

It’s simple, because you can do it without any plugin at all.

You will need: search enabled on your blog, some time to find the oldest search result, and a small helper script.

First of all, start by pressing Enter inside your Search field. This will give you a page with search results. Something like this.

Note the address in the address bar. It should resemble something like this: http://www.owsiak.org/?s=.

Now, go ahead and click “older posts” as long as there are any. It will take some time, but you can see how the address bar changes its value. It will be something like this: http://www.owsiak.org/page/2/?s, http://www.owsiak.org/page/3/?s, etc. In my case, there are 34 pages in total. So, it means we have to query the blog 34 times for the search results.

Now, let’s automate it

# -- 8< -- cut here -- script.sh -- cut here -- 8< --
#!/bin/sh

for i in `seq 1 34` ; do
  curl http://www.owsiak.org/page/${i}/?s -o page_${i}
done
# -- 8< -- cut here -- cut here -- cut here -- 8< --

and then you can run it. Of course, make sure to adapt the script to your needs.

> chmod +x script.sh
> ./script.sh

You will end up with lots of partial results inside the files page_1, …, page_34. These are the search results for your blog.

Now, let’s extract the content. Please note that this part may depend on your blog template, so make sure to properly extract the info from the files.

> cat page_* | grep "<li>" | cut -f2 -d"=" | cut -f2 -d'"' > site-map.txt
> tail site-map.txt | sed 's/^/- /'
- http://www.owsiak.org/redirecting-http-traffic/
- http://www.owsiak.org/mr-robot-some-thoughts-about-bugs/
- http://www.owsiak.org/coding-apprentice-todos/
- http://www.owsiak.org/if-you-have-makefile-that-has-weird-name-but-you-still-want-syntax-highlighting/
- http://www.owsiak.org/makefile-all-you-wanted-to-know-about-variable-but-were-afraid-to-ask/
- http://www.owsiak.org/makefile-know-your-location/
- http://www.owsiak.org/scratch-programming-playground-by-al-sweigart/
- http://www.owsiak.org/two-line-prompt-with-colors-and-gadgets/
- http://www.owsiak.org/mwm-confing-for-people-who-like-minimalism-during-remote-work/
- http://www.owsiak.org/makefile-that-calls-itself-without-hardcoding-file-name/

Once you have your site-map.txt, you can upload it to your blog page and then add it in Google Search Console.

That’s it.

LEUCHTTURM1917 vs. Moleskine – ultimate, final battle ;)

So, you are thinking about which calendar to pick for the coming year.

In my case, I was always struggling over whether I should pick Leuchtturm1917 or stick to Moleskine. I admit, this review is a little bit biased, as I have been using Moleskine calendars for quite some time already. However, it has some objective elements as well. When it comes to notebooks, I definitely prefer Leuchtturm1917 (especially the one with red dots) over Moleskine (blank).

Anyway, I will try to give you a feeling of what to expect from both calendars. I will try to compare them “section” by “section” – if I can say that.



Leuchtturm1917 (at the bottom) definitely gives you more space to write in compared to Moleskine. Both are quite similar in size, yet Leuchtturm1917 is wider.

Note that Moleskine comes with plain covers as well. The Alice themed one is my arbitrary choice.


About yourself


Well, to be honest, I have no idea whether anybody fills this one in. In my 10 years of experience with Moleskine, I don’t recall a single year when I filled in all the info about myself. The only thing I fill in is probably the e-mail – just in case I lose my calendar during a trip.


This section is not available in Leuchtturm1917


Whole year calendar – pocket style


Moleskine gives you an overview of the current and the next year (in this case, 2020 and 2021).


Leuchtturm1917 covers three years: the past, the current, and the next one (in this case, 2019, 2020, and 2021).

Because Leuchtturm1917 is a little bit wider, it offers (in my opinion) a better layout – there is more air between elements on the page.


Whole month calendar – summary of each month


Here, Moleskine is definitely way better organised. You get a page per month, with each and every day clearly represented as one box.


Leuchtturm1917 takes a slightly different approach. Each day is represented by a single line, and you have three months per page. If you like to put multiple events per day, it might be a struggle to maintain this information with this layout. You also have the days with a full Moon clearly marked. In Moleskine, this info is placed both in the monthly calendar and inside the daily agenda as well.

There is, however, one advantage of Leuchtturm1917 here – it covers the current and the next year.


International holidays


44 countries


58 countries


Project plan


This section is not available in Moleskine


This section is Leuchtturm1917 specific. What you get here is a sort of Gantt chart template where you can easily mark your progress. I am not sure whether there is any chance of managing a project using this kind of chart on paper; still, you can put some milestones here to get a sort of overview of the whole year.


Time zones


This one is quite helpful in case you work with people from multiple time zones. You can have a quick look here and you already know what to expect when it comes to the time at a given location. However, this might be tricky anyway. Taking into account summer time, winter time, and the different periods of applying them over the year depending on the region, I guess it’s safer to check the time on the Internet (e.g. https://www.timeanddate.com/time/map/).


This section is not available in Leuchtturm1917


Measures and conversions


It comes in handy in all those small situations when you have to jump from the imperial system to the metric one. I guess anybody travelling to the U.S. has faced this issue at least once – “What the heck is this gallon, and what does it mean that it’s 77 degrees out there?”.

Just to give you a feeling of the whole idea.


This section is not available in Leuchtturm1917


International sizes


If you are lost in all the various sizes of various types of clothes, this section comes to the rescue :) It’s divided into female/male sections.


This section is not available in Leuchtturm1917


Dialing codes


If you are kind of a spy person, trying to reach some esoteric country via landline from the booth in the hotel lobby, you can use this page to properly say: “Operator, please connect me with …”. It’s also quite useful if you are abroad and have no idea how to prefix a taxi number with the country code.


This section is not available in Leuchtturm1917


Travel planning


To be honest, I don’t use this one. I plan all my trips in the calendar that gives an overview of the whole year instead. But still, if you have plenty of trips, you can use this section to track them.


This section is not available in Leuchtturm1917


My extraordinary moments this year


Ok. Let me be straight here. I have no idea what the purpose of this page is :) Really! I have never, ever filled it in. I guess it’s targeting artistically gifted extraverts who want to somehow collect all those small, precious moments in their lives. Nope, don’t count me in here. Anyway, it’s one more place where you can stash some of your personal experiences.


This section is not available in Leuchtturm1917


My blank space


Yet another place for your creativity. I guess it sounds better than “this page intentionally left blank”.


This section is not available in Leuchtturm1917


My inspiring journeys


This is, basically, a dotted map of the whole world. I guess it’s something nice for people who travel a lot and want to mark each and every place they have visited.


This section is not available in Leuchtturm1917


Ruler and legend


These are two small things that come in handy. The ruler is quite useful when you want to measure something quickly (e.g. the size of an envelope while buying a postage stamp). The legend gives you a brief overview of what to expect when you see a given symbol on the pages (it relates to the daily agenda).


This section is not available in Leuchtturm1917


Daily agenda – good things come to those who wait


Moleskine has a very simple daily agenda layout. The day starts at 8 a.m. and ends at 8 p.m. (you have two lines of text per hour). At the top of the page, you have the list of all the countries with public holidays on that day. It’s really helpful if you work as part of an international team.

The daily agenda also contains the major phases of the moon and provides a place to mark weather conditions and temperature. You can also find the number of the week within the year (at the bottom of the page).


Leuchtturm1917 provides a slightly different layout. The day starts at 7 a.m. and ends at 10 p.m. (you have one line of text per hour). There is a clear indication of the week within the year (at the bottom of the page). Each page is divided into three sections: hours, notes, and a small calendar with the previous, current, and next month. You also have a clear indication of the current week within the small calendar. The element I miss most is the list of countries with public holidays. It forces you to jump back and forth between the daily agenda and the page with the list of holidays. It might be a struggle whenever you try to plan a meeting in a multicultural team.



Both calendars are made of high quality paper. I, personally, prefer the one that is available in Leuchtturm1917. In my subjective opinion it’s a little bit smoother and goes very well with a fountain pen.



I already told you (in the beginning) that this comparison is probably quite biased. I am simply accustomed to Moleskine’s layout and I find it more suitable for my needs. I don’t make too many separate notes per day, thus the extra notes section in Leuchtturm1917 is a waste of space for me. On the other hand, I think that Moleskine is packed with antiquated layouts – like the personal info section and some pages that are obsolete for most users (e.g. traveling experiences, etc.).

Anyway, summarising, it looks like I will not switch to Leuchtturm1917 this year. As for you, please decide for yourself :)

Instrumenting JNI based code using Instruments.app

This time, I will sneak a peek at memory allocations that are leaking from within JNI based code.

Using XCode for JNI development (Objective-C)

There is a new video on my YouTube channel – JNI Cookbook. This time, I am focusing on XCode and JNI based development. I am setting up JNI based code (written in Objective-C) that spawns a new JVM and calls a custom class through it.

Running JNI based code inside XCode

Sometimes you want to run JNI based code inside XCode – e.g. in case your C/Objective-C/C++ code is based on some Java library.

It’s both simple and a little bit confusing to get JNI running in XCode – you have to make sure to set a few things before proceeding.

1. Getting source code

In this article I will use one of my recipes from JNI Cookbook. Note that the source code can be found here: GitHub.

> mkdir -p ~/workspace
> cd ~/workspace
> git clone https://github.com/mkowsiak/jnicookbook

2. Creating new project inside XCode

Start XCode and create a new project: macOS -> Command Line Tool. Once there, make sure to remove the main.m file and replace it with $HOME/workspace/jnicookbook/recipeNo065/c/recipeNo065_main.m. You should have something like this.

3. Adding custom user variable

Make sure to switch to Project -> Build Settings so you can add a new, user-defined, variable. I will use JAVA_HOME. It should point to the location where your Java is installed. You can get this location by calling java_home

> /usr/libexec/java_home -v 11.0.4
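If you want to reuse that value in your shell as well, a small sketch follows (the -v 11.0.4 version matches the one used above; the guard leaves JAVA_HOME untouched on systems without /usr/libexec/java_home):

```shell
# Resolve the JDK 11 home on macOS; leave JAVA_HOME alone elsewhere
if [ -x /usr/libexec/java_home ]; then
  JAVA_HOME=$(/usr/libexec/java_home -v 11.0.4)
  export JAVA_HOME
fi
echo "JAVA_HOME=${JAVA_HOME:-not set}"
```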

4. Setting up Search Paths

You need to set the following search paths: Header Search Paths, Library Search Paths, and Runpath Search Paths. Inside the Header Search Paths section make sure to add: $(JAVA_HOME)/include/darwin and $(JAVA_HOME)/include. Like this

Inside Library Search Paths make sure to set the location of libjvm.dylib ($(JAVA_HOME)/lib/server). You can set it the following way

The last thing to set here is Runpath Search Paths. Make sure to add $(JAVA_HOME)/lib/server there as well

5. Make sure your code is linked with libjvm.dylib

The last thing we have to do (in order to compile the code) is to pass -ljvm to the linker
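For reference, the Xcode settings from steps 3-5 correspond, roughly, to the following compiler and linker flags (a sketch of mine, not taken from the recipe; JAVA_HOME must point at your JDK):

```shell
# Header Search Paths -> -I, Library Search Paths -> -L,
# Runpath Search Paths -> -rpath, plus -ljvm from step 5
JAVA_HOME=${JAVA_HOME:-/path/to/your/jdk}
CFLAGS="-I$JAVA_HOME/include -I$JAVA_HOME/include/darwin"
LDFLAGS="-L$JAVA_HOME/lib/server -ljvm -Wl,-rpath,$JAVA_HOME/lib/server"
echo "clang $CFLAGS $LDFLAGS -o recipeNo065 recipeNo065_main.m"
```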

6. Running the code

Note that the JVM uses SIGSEGV to throw exceptions. Make sure to set a breakpoint early in your code and tell lldb to ignore SIGSEGV

(lldb) process handle --pass true --stop false --notify true SIGSEGV

7. Continue debugging session

After process handle --pass true --stop false --notify true SIGSEGV is executed inside lldb you can continue with the debugger session.

Setting up googletest on macOS

I am making strong assumption that you have XCode and Command Line Tools installed.

First of all, you need CMake. I prefer installation from the sources, so I go this way.

> mkdir -p $HOME/opt/src
> cd $HOME/opt/src
> curl -LO https://github.com/Kitware/CMake/releases/download/v3.16.2/cmake-3.16.2.tar.gz
> tar zxf cmake-3.16.2.tar.gz
> cd cmake-3.16.2
> ./bootstrap --prefix=$HOME/opt/usr/local
> make
> make install

Make sure to have $HOME/opt/usr/local/bin on your PATH.
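One way to do that (assuming the prefix from the bootstrap step above) is to prepend the directory in your shell startup file, e.g. ~/.zshrc:

```shell
# Put the locally installed CMake first on PATH
export PATH="$HOME/opt/usr/local/bin:$PATH"
```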

After you have CMake in place, you can build googletest.

> cd $HOME/opt/src
> git clone https://github.com/google/googletest.git
> cd googletest
> mkdir build
> cd build

# If you don't specify -DBUILD_SHARED_LIBS=ON you will get only static libs (.a)
> cmake -DCMAKE_INSTALL_PREFIX=$HOME/opt/usr/local ..
> make
> make install

And now, you can test the whole thing

// simple.h
int sum(int a, int b);

// simple.cc
int sum(int a, int b) {
  return a + b;
}

// simple_test.cc
#include "simple.h"
#include "gtest/gtest.h"

TEST(SumTest, Equal) {
  EXPECT_EQ(2, sum(1,1));
}

Once everything is in place, we can compile and run the code

> g++ -std=c++11 -o test \
  simple.cc simple_test.cc \
  -I. -I$HOME/opt/usr/local/include \
  $HOME/opt/usr/local/lib/libgtest.a \
  $HOME/opt/usr/local/lib/libgtest_main.a
> ./test
Running main() from $HOME/opt/src/googletest/googletest/src/gtest_main.cc
[==========] Running 1 test from 1 test suite.
[----------] Global test environment set-up.
[----------] 1 test from SumTest
[ RUN      ] SumTest.Equal
[       OK ] SumTest.Equal (0 ms)
[----------] 1 test from SumTest (0 ms total)

[----------] Global test environment tear-down
[==========] 1 test from 1 test suite ran. (0 ms total)
[  PASSED  ] 1 test.

If you want to use libgtest.dylib and libgtest_main.dylib you will have to play a little bit with @rpath.

# Make sure that google test itself has proper settings with rpath
> install_name_tool -add_rpath ${HOME}/opt/usr/local/lib \
    ${HOME}/opt/usr/local/lib/libgtest_main.dylib
> install_name_tool -change libgtest.dylib \
    @rpath/libgtest.dylib ${HOME}/opt/usr/local/lib/libgtest_main.dylib

# Compile your code
> g++ -std=c++11 -o test \
  simple.cc simple_test.cc \
  -I. -I$HOME/opt/usr/local/include \
  -L$HOME/opt/usr/local/lib -rpath $HOME/opt/usr/local/lib -lgtest -lgtest_main

and now, make sure to update @rpath inside your executable

> install_name_tool -change libgtest.dylib @rpath/libgtest.dylib ./test
> install_name_tool -change libgtest_main.dylib @rpath/libgtest_main.dylib ./test

An alternative approach: change the id of the libs

> install_name_tool -id @rpath/libgtest_main.dylib $HOME/opt/usr/local/lib/libgtest_main.dylib
> install_name_tool -id @rpath/libgtest.dylib $HOME/opt/usr/local/lib/libgtest.dylib

in this case, you don’t have to change the install names inside your final executable.
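Either way, you can verify which install names actually ended up inside the binary with otool. A guarded sketch (otool exists only on macOS, and ./test is the executable built above):

```shell
# Print the dynamic library install names recorded in ./test (macOS only)
if command -v otool >/dev/null 2>&1 && [ -f ./test ]; then
  otool -L ./test
else
  echo "otool or ./test not available here - run this on the macOS build machine"
fi
```

Every gtest entry should show up as @rpath/libgtest*.dylib, not as a bare file name.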

And all of this started here, with this tutorial session covering googletest basics: Unit testing in CLion

DT_RPATH (ld) & @rpath (dyld)
Shared Libraries: Understanding Dynamic Loading
Fun with rpath, otool, and install_name_tool
Run-Path Dependent Libraries
Overview of Dynamic Libraries

Installing boost at macOS

Installation of Boost on macOS is quite simple.

# get the sources

> mkdir -p $HOME/opt/src
> cd $HOME/opt/src
> curl -LO https://dl.bintray.com/boostorg/release/1.72.0/source/boost_1_72_0.tar.gz
> tar zxf boost_1_72_0.tar.gz
> cd boost_1_72_0

# configure boost

> ./bootstrap.sh --prefix=$HOME/opt/usr/local/boost
> ./b2
> ./b2 install

# make sure to point to this location while compiling your code

> export BOOST_LIB=$HOME/opt/usr/local/boost/lib
> export BOOST_INC=$HOME/opt/usr/local/boost/include

> g++ -o some_code some_code.cc -L${BOOST_LIB} -lboost_system -I${BOOST_INC}

libjansi.jnilib cannot be opened because the developer cannot be verified – macOS 10.15

If you can see this error: “libjansi.jnilib” cannot be opened because the developer cannot be verified – while working on macOS 10.15 (the screenshot looks like this)

it means you probably have issues with libjansi coming from Apache Maven.

Make sure to upgrade to version apache-maven-3.6.2.

macOS 10.15 – directory size in CLI

There is an interesting article about using NCurses Disk Usage on Arch Linux – you can find it here: Cleaning root partition on Linux. I definitely prefer to stick to macOS, so I have decided to get it running from iTerm2. It’s really simple. All you have to do (as with other samples you can find on my blog) is build it from sources.

> mkdir -p ~/opt/src
> cd ~/opt/src
> curl -O https://dev.yorhel.nl/download/ncdu-1.14.1.tar.gz
> tar zxf ncdu-1.14.1.tar.gz
> cd ncdu-1.14.1
> ./configure
> make

that’s it. Now you can benefit from it by calling: $HOME/opt/src/ncdu-1.14.1/ncdu – you can move this file to your ~/bin if you like. It’s up to you.

ncdu 1.14.1 ~ Use the arrow keys to navigate, press ? for help
--- /Users/some_user/opt/src/ncdu-1.14.1 ---------------------------
    1.0 MiB [##########] /src
  196.0 KiB [#         ]  configure
   88.0 KiB [          ]  ncdu
   52.0 KiB [          ]  aclocal.m4
   36.0 KiB [          ]  config.status
   36.0 KiB [          ]  Makefile.in
   32.0 KiB [          ]  Makefile
   32.0 KiB [          ] /deps
   24.0 KiB [          ]  config.log
   24.0 KiB [          ]  depcomp
   20.0 KiB [          ]  ncdu.1
   16.0 KiB [          ]  install-sh
   16.0 KiB [          ] /doc
    8.0 KiB [          ]  compile
    8.0 KiB [          ]  missing
    8.0 KiB [          ]  ChangeLog
    4.0 KiB [          ]  config.h
    4.0 KiB [          ]  config.h.in
    4.0 KiB [          ]  configure.ac
    4.0 KiB [          ]  README
    4.0 KiB [          ]  COPYING
    4.0 KiB [          ]  Makefile.am
    4.0 KiB [          ]  stamp-h1
 Total disk usage:   1.6 MiB  Apparent size:   1.4 MiB  Items: 85

And here you can take a look at how it works

macOS Catalina and VirtualBox issues

I had a strange issue with VirtualBox. It crashed with

Process:               VirtualBoxVM
Path:                  /Applications/VirtualBox.app/Contents/Resources/VirtualBoxVM.app/Contents/MacOS/VirtualBoxVM
Exception Type:        EXC_BAD_ACCESS (SIGSEGV)
Exception Codes:       KERN_INVALID_ADDRESS at 0x000007fc0a5286d0
Exception Note:        EXC_CORPSE_NOTIFY

Termination Signal:    Segmentation fault: 11
Termination Reason:    Namespace SIGNAL, Code 0xb
Terminating Process:   exc handler

Thread 0:: Dispatch queue: com.apple.main-thread
0   org.qt-project.QtGuiVBox         0x000000010fd06b25 0x10f93e000 + 3967781

The solution that worked for me follows

VBoxManage setextradata global GUI/HidLedsSync 0

source: https://forums.virtualbox.org/viewtopic.php?f=8&t=95041

macOS Mojave – make sure your system is safe
updated for ClamAV release 0.101.4

If you want to make sure that your macOS Mojave is clean (when it comes to malicious software) you can use a free tool (free as in beer and free as in speech – at the same time) called ClamAV.

You can get it in various ways. You can download its commercial version from the AppStore – as a paid release, you can install it using brew, download a binary from some place where you have no idea what’s really inside, you can install macOS Server (ClamAV comes bundled with it), etc.

However, you can also build it by yourself, directly from sources. It’s a pain in the neck, I know, but you can be sure of what you are actually running. And you will learn that zlib’s author is a really brainy person. Go ahead, look for yourself on Wikipedia.

Anyway. Let’s start. Estimated time to complete (depending on your system configuration) – 1h-2h.

I suggest creating some place where you can put all the sources and binaries. I suggest the following approach

mkdir -p $HOME/opt/src
mkdir -p $HOME/opt/usr/local

In each step, we will download the source code of a given tool into

$HOME/opt/src

and then, use

./configure --prefix=$HOME/opt/usr/local/$TOOL_NAME

to install them inside $HOME/opt.
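The whole procedure can be sketched as a small helper function (the name and arguments are mine, not part of any tool; each step below just inlines this pattern with its own URL and prefix):

```shell
# Download, unpack, configure with a private prefix, then build and install
build_tool() {
  url=$1        # where the .tar.gz tarball lives
  tool_name=$2  # also used as the --prefix suffix
  cd "$HOME/opt/src" || return 1
  curl -O "$url"
  tarball=$(basename "$url")
  tar zxf "$tarball"
  cd "${tarball%.tar.gz}" || return 1
  ./configure --prefix="$HOME/opt/usr/local/$tool_name"
  make && make install
}
```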

1. You need PCRE – Perl Compatible Regular Expressions

cd $HOME/opt/src
curl -O https://ftp.pcre.org/pub/pcre/pcre2-10.33.tar.gz
tar zxf pcre2-10.33.tar.gz
cd pcre2-10.33
./configure --prefix=$HOME/opt/usr/local/pcre2
make check
make install

2. You need LibreSSL
(special thanks go to: http://www.gctv.ne.jp/~yokota/clamav/). I was always using OpenSSL, but recently I had more and more issues with it while compiling stuff from sources.

cd $HOME/opt/src
curl -O https://ftp.openbsd.org/pub/OpenBSD/LibreSSL/libressl-3.0.0.tar.gz
tar zxf libressl-3.0.0.tar.gz
cd libressl-3.0.0
export CXXFLAGS="-O3"
export CFLAGS="-O3"
./configure --prefix=$HOME/opt/usr/local/libressl
make check
make install

3. You need zlib

cd $HOME/opt/src
curl -O http://zlib.net/zlib-1.2.11.tar.xz
tar zxf zlib-1.2.11.tar.xz
cd zlib-1.2.11
./configure --prefix=$HOME/opt/usr/local/zlib
make install

4. Build the stuff

cd $HOME/opt/src
git clone git://github.com/vrtadmin/clamav-devel
cd clamav-devel
git checkout tags/clamav-0.101.4
# you can also get stable version from here:
# https://www.clamav.net/downloads
export CFLAGS="-O3 -march=nocona"
export CXXFLAGS="-O3 -march=nocona"
export CPPFLAGS="-I$HOME/opt/usr/local/pcre2/include \
  -I$HOME/opt/usr/local/libressl/include \
  -I$HOME/opt/usr/local/zlib/include"
./configure --prefix=$HOME/opt/usr/local/clamav --build=x86_64-apple-darwin`uname -r` \
  --with-pcre=$HOME/opt/usr/local/pcre2 \
  --with-openssl=$HOME/opt/usr/local/libressl \
  --with-zlib=$HOME/opt/usr/local/zlib \
  --disable-zlib-vcheck
make
make install

5. Prepare minimal config file

mkdir $HOME/opt/usr/local/clamav/clamavdb
mkdir $HOME/opt/usr/local/clamav/log/
touch $HOME/opt/usr/local/clamav/etc/freshclam.conf

Add minimal content of freshclam.conf inside $HOME/opt/usr/local/clamav/etc/freshclam.conf. For details, make sure to read freshclam.conf documentation: freshclam.conf

DatabaseDirectory _PUT_YOUR_HOME_LOCATION_HERE_/opt/usr/local/clamav/clamavdb
UpdateLogFile _PUT_YOUR_HOME_LOCATION_HERE_/opt/usr/local/clamav/log/freshclam.log
DatabaseMirror database.clamav.net

6. Make sure to keep your database up to date

$HOME/opt/usr/local/clamav/bin/freshclam --config-file=$HOME/opt/usr/local/clamav/etc/freshclam.conf

7. Now, you can scan your drive for viruses

cd $HOME
$HOME/opt/usr/local/clamav/bin/clamscan -ir $HOME

# if you want to scan your whole drive you need to run the thing as root
# I also suggest to exclude /Volumes, unless you want to scan your TimeMachine
# and all discs attached
# -i - report only infected files
# -r - recursive
# --log=$FILE - store output inside $FILE
# --exclude=$DIR - don't scan directory $DIR
cd $HOME
sudo $HOME/opt/usr/local/clamav/bin/clamscan --log=`pwd`/scan.log --exclude=/Volumes --exclude=/tmp -ir /

_JAVA_OPTIONS will not always work for your Java code

It’s good to know that _JAVA_OPTIONS will not always work for your Java code. Especially when you elevate privileges using sudo.

Let’s say we have this simple code

public class Simple {
  public static void main(String [] arg) {
    System.out.println("Hello from Simple!");
  }
}

and we run it the following way

> export _JAVA_OPTIONS="-Xms1G"
> java Simple
Picked up _JAVA_OPTIONS: -Xms1G
Hello from Simple!
> sudo java Simple
Password: 🔑
Hello from Simple!

As you can see, in the case of the second execution _JAVA_OPTIONS was not picked up. The reason for not picking it up follows

/* ./hotspot/share/runtime/arguments.cpp */

// Don't check this environment variable if user has special privileges
// (e.g. unix su command).
if (buffer == NULL || os::have_special_privileges()) {
  return JNI_OK;
}
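Note that this check happens inside the JVM itself – the variable is still exported to child processes as usual, which you can verify with a plain shell one-liner (no JVM involved):

```shell
# The environment variable is inherited by children; it is the JVM,
# not the shell, that decides to ignore it for privileged processes
export _JAVA_OPTIONS="-Xms1G"
sh -c 'echo "child sees: ${_JAVA_OPTIONS:-nothing}"'
```

So if you really need the options while elevating privileges, pass them explicitly on the command line, e.g. sudo java -Xms1G Simple.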