Ansible and ‘Error while fetching server API version’

If you are getting this error

'Error while fetching server API version: {0}'.format(e)
DockerException: Error while fetching server API version: 
   ('Connection aborted.', error(2, 'No such file or directory'))

you should know that all you need is a running Docker daemon ;)
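A quick sanity check before running the playbook is to look for the daemon’s socket. The snippet below is a sketch assuming the default Linux socket path (/var/run/docker.sock) – adjust it for Docker Desktop setups.

```shell
#!/bin/bash
# returns success if a Docker daemon socket exists at the given
# (or default) path; a missing socket usually means Docker is not running
docker_socket_present() {
  [ -S "${1:-/var/run/docker.sock}" ]
}

if docker_socket_present; then
  echo "Docker daemon looks reachable"
else
  echo "start Docker first (e.g. 'systemctl start docker' on Linux)"
fi
```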

I love this kind of error :)

How to solve missing javah in Java 10 – ugly way

In Java 10, the javah tool is gone. This means you can no longer easily extract information regarding native interfaces. It’s not that simple to generate header files based on compiled class files.

If you desperately need to process lots of class files, you can always hack your way through using the javap tool. You can create “reduced” source files that will be more than enough for javac to generate headers. It’s not ideal, but for “it should be on my desk in 10 minutes” kinds of tasks, it can be your day saver.

Have fun!


# FIRST_ARG - full class name (with package)
# Note! I have a strong assumption that:
#  - native method is declared inside class that is part of package
#  - native method is not declared inside class that is inner class
# SECOND_ARG - class path

CLASS_NAME=`javap -cp $2 $1 | \
  grep -v "Compiled from" | \
  grep "public class" | \
  cut -f3 -d" " | \
  awk -F"." '{ print $NF }'`

PACKAGE_NAME=`javap -cp $2 $1 | \
  grep -v "Compiled from" | \
  grep "public class" | \
  cut -f3 -d" " | \
  sed s/\.${CLASS_NAME}$//`

DIR_NAME=`echo $PACKAGE_NAME | sed 's|\.|/|g'`
mkdir -p java_jni/${DIR_NAME}


# There are lots of strong assumptions here !!
JAVA_FILE=java_jni/${DIR_NAME}/${CLASS_NAME}.java
echo "package ${PACKAGE_NAME};" > ${JAVA_FILE}
echo "public class ${CLASS_NAME} {" >> ${JAVA_FILE}
javap -cp $2 $1 | grep "native" >> ${JAVA_FILE}
echo "}" >> ${JAVA_FILE}

# Now, we can generate header
mkdir -p c_header
javac -h c_header ${JAVA_FILE}
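The javap parsing above boils down to splitting a fully qualified class name into class, package, and directory parts; you can sanity-check that pipeline on a plain string (the class name below is a made-up example):

```shell
#!/bin/bash
# hypothetical fully qualified class name, standing in for javap output
FQCN="recipeNo001.jni.HelloWorld"

CLASS_NAME=$(echo "$FQCN" | awk -F"." '{ print $NF }')      # last dot-separated field
PACKAGE_NAME=$(echo "$FQCN" | sed "s/\.${CLASS_NAME}\$//")  # strip ".ClassName" suffix
DIR_NAME=$(echo "$PACKAGE_NAME" | sed 's|\.|/|g')           # dots -> directory separators

echo "${CLASS_NAME} ${PACKAGE_NAME} ${DIR_NAME}"
# HelloWorld recipeNo001.jni recipeNo001/jni
```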

Java 10 and JNI Cookbook updates

Eventually, the time for updates has come. I had to update all the code (Makefiles, to be precise) to get the JNI Cookbook samples working with the most recent releases of the toolchain.

Java 10 and first surprise with JNI

Well, you will be surprised by Java 10 and JNI. They did warn us – that’s true – but it escalated quickly ;)

> javah -jni -d c -cp target recipeNo001.HelloWorld
javah: No such file or directory

You have to switch to javac and use its new feature of generating JNI headers directly from Java files instead of class files:

> javac -h c -d target java/recipeNo001/

In fact, the whole process is much simpler now, as you don’t need the intermediate javah step.

There is one more issue here. A new approach does not always mean a better one. It looks like this change may affect people who don’t have access to the source code: Generate JNI header files for class files in JDK 10. You can get it fixed, sort of, by decompiling class files: How to solve missing javah in Java 10 – ugly way.

Formatting Java code

In the past, I was using Eclipse as my tool for formatting Java code. Eclipse allows you to call the formatter from the CLI, which makes it super convenient to format lots of files easily. You can find a description here.

But I think Eclipse’s time has passed. I have found something called Google Java Format. It’s very handy when it comes to quick formatting of code. Take a look here.

/* */
package recipeNo004;

public class PassLong {

        /* This is the native method we want to call */
             public static native void displayLong(long value);

        /* Inside static block we will load shared library */
  static {
                System.loadLibrary("PassLong");
      }

  public static void main(String[] args) {
/* This message will help you determine whether
                                LD_LIBRARY_PATH is correctly set */
      System.out.println("library: "
                        + System.getProperty("java.library.path"));

                long value = 1234;

                /* Call to shared library */
    PassLong.displayLong(         value        );
  }
        }

All you have to do is call it like this:

java -jar google-java-format-1.5-all-deps.jar PassLong.java

and there you go – nicely formatted, beautiful Java code.

/* */
package recipeNo004;

public class PassLong {

  /* This is the native method we want to call */
  public static native void displayLong(long value);

  /* Inside static block we will load shared library */
  static {
    System.loadLibrary("PassLong");
  }

  public static void main(String[] args) {
    /* This message will help you determine whether
       LD_LIBRARY_PATH is correctly set */
    System.out.println("library: " + System.getProperty("java.library.path"));

    long value = 1234;

    /* Call to shared library */
    PassLong.displayLong(value);
  }
}
Screen Flow vs. ffmpeg – converting to animated GIF

From time to time, I need to generate an animated GIF – mostly when I want to share some info on a web page or show some actions to users. Usually, there are a few steps that have to be done to achieve something, like “Click here, click there, and you’re all set.”

Screen Flow is my screen-casting weapon of choice. It does the job exactly as I need. However, there is one drawback: whenever I export movies to GIF, they are huge. Take a look below.

I have this simple, short, mp4 movie

And here is GIF made with Screen Flow

frame rate – 15
rgb – 8bits
length – 8 sec

While here, I have another one, made using ffmpeg

frame rate – 15
rgb – 8bits
length – 8 sec

ffmpeg -i movie.mp4 -pix_fmt rgb8 -r 15 image.gif

Of course, you can generate a Screen Flow-like GIF as well. It’s just a matter of generating a palette before proceeding with the GIF creation:

ffmpeg -y -i movie.mp4 -vf palettegen palette.png
ffmpeg -y -i movie.mp4 -i palette.png -filter_complex paletteuse -r 15 image.gif

Now, if you can sacrifice quality a little bit, I think it’s worth adding ffmpeg to the process and saving a few bytes in size.

If you want to know how to compile ffmpeg for macOS, take a look here: Awesomenauts and strange issues with exporting replays.

– The animation was made based on the new game from Amanita Design: CHUCHEL

Chuchel – new game from Amanita Design

If you are a fan of point-and-click games, you can’t miss this one – CHUCHEL.

This is yet another game from the well-known Amanita Design. These guys are the ones behind Samorost, Machinarium, and Botanicula (just to mention a few). All these games share a few common attributes: a surrealistic sense of humour, great soundtracks (made by DVA and Floex), and really, really nice graphics.

If you were a fan of games like Goblins, Leisure Suit Larry, or King’s Quest, this one is something you should show to your kids. No doubt!

There are a few things about this game that make it really worth buying.

1. You need just one mouse button during gameplay. You can use a mouse like this one:

2. Interaction is super simple – just point and click

3. All messages are presented as pictures – you don’t have to know a foreign language. Well, in fact you don’t have to know !any! language.

4. The graphics are really appealing

We have spent (so far) just one hour playing it, and it was fun.

So, if you like games like Machinarium and Botanicula, make sure you won’t miss this one.

And the soundtrack, even though you can feel it is game-related, still sounds great on its own. Unfortunately, I think it’s not yet available on DVA’s Bandcamp page, but it is still worth paying for. I have already listened to it twice. It’s quite similar to Botanicula’s but, at the same time, different!

You have two options to get the game: Steam and Humble Bundle. I personally prefer buying from Humble Bundle – you get it DRM-free and you get a Steam key anyway.

Now, let me be 100% clear here. If you are a grown-up who likes surrealistic games and nice soundtracks and (in addition to that) you have kids around 8 years old – this one is for you.

On the other hand, if you are a Counter-Strike fan, I guess you will not enjoy this game that much ;)

Anyway, take a look at short gameplay below.

I, personally, like it ;)

Awesomenauts and strange issues
with exporting replays

While exporting gameplay from Awesomenauts, you may experience issues related to the architecture used during ffmpeg compilation. Awesomenauts comes with a precompiled ffmpeg, but it looks like something went wrong. The results are horrible: no exporting on macOS High Sierra. It looks like Awesomenauts is distributed with an incorrectly built tool (in fact, libs).

dyld: Library not loaded: @loader_path/../../Frameworks/libavdevice.dylib
  Referenced from: ../Library/Application Support/.../ffmpeg-lgpl/./ffmpeg_mac
  Reason: no suitable image found.  Did find:
	.../Steam/..../Frameworks/libavdevice.dylib: mach-o, but wrong architecture
	.../Steam/..../Frameworks/libavdevice.dylib: stat() failed with errno=1
Abort trap: 6

However, there is a rescue. You can compile ffmpeg by yourself!

All you have to do is: clone the sources from git, compile the stuff, and put a symbolic link inside the Awesomenauts application. Pretty easy, isn’t it? Just take a look below.

# getting sources
mkdir -p $HOME/opt/src
mkdir -p $HOME/opt/shared
cd $HOME/opt/src
git clone ffmpeg

# configure and make ffmpeg
cd $HOME/opt/src/ffmpeg
./configure --disable-x86asm --prefix=$HOME/opt/shared
make && make install

# create symbolic link
cd "$HOME/Library/Application Support/Steam"

# make sure to make a backup
mv ffmpeg_mac ffmpeg_mac~
ln -s $HOME/opt/shared/bin/ffmpeg ffmpeg_mac

# make sure it works
./ffmpeg_mac -version
ffmpeg version N-90232-g0645698ecc Copyright (c) 2000-2018 the FFmpeg developers
built with Apple LLVM version 9.0.0 (clang-900.0.39.2)

Now, you can export gameplay movies directly from the Awesomenauts application ;) Now, let’s take a look at this sweet triple kill by Ayla ;)

Happy sharing!

NX server shutdown

sudo /Applications/ --shutdown
sudo /Applications/ --startmode manual

LaTeX on macOS High Sierra

If you are looking for a macOS-based setup for LaTeX, here is my advice.

1. Get the LaTeX itself

2. Get the editor

LyX – a WYSIWYG editor for LaTeX-based documents

Download it here

TextMate + LaTeX bundle

Download TextMate here

and LaTeX bundle can be downloaded here.

After installing the bundle, you will be able to use TextMate as a LaTeX editor.

Time Machine and file attributes in macOS

If you want to copy files from a Time Machine directory (directly), you will face issues related to file access. You need to do a few things.

Let’s say you have a directory “backup_from_TM“. After copying it to your $HOME directory, you have to:

# remove Time Machine's attributes
# (xattr -d expects an attribute name; list the attributes
#  first with: xattr -l ./backup_from_TM, then delete each one)

sudo xattr -r -d <attribute-name> ./backup_from_TM

# remove ACLs

chmod -R -N ./backup_from_TM

# you can always double-check what the ACLs are
# by listing them with the ls -le command

ls -le ./backup_from_TM

Vim mini tutorial – bookmark

ma - create bookmark
1G - jump to beginning of first line
`a - jump to bookmark

You can create numerous named bookmarks inside each file. Use letters a–z.

FreeBSD + SVN + new host IP

Recently, I had a “small” issue with my SVN host. I changed the IP address of my NAS server, and that made SVN stop working.

All you have to do is update this entry inside this file:


Make sure you take care of the new host address (in case you specified it in the past):

svnserve_flags="-d --listen-port=3690 --listen-host"

Make sure host address matches your new IP.
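For reference, on FreeBSD this kind of entry typically lives in /etc/rc.conf. The fragment below is a hypothetical example – the path, port, and address are assumptions for illustration; use your own values:

```shell
# /etc/rc.conf – hypothetical svnserve entry after an IP change
svnserve_enable="YES"
svnserve_flags="-d --listen-port=3690 --listen-host="
```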

bash – getting part of variable (e.g. last 4 characters and prefix)

Recently, I had this small issue: how do you split a variable in bash without too much effort?

I wanted to split it into two parts: a prefix and a suffix.

There were a few assumptions about the variable’s value:

– the length of the variable is >= 5
– the suffix has a fixed length (4 characters)
– the prefix length can vary

suffix=${variable: -4}
prefix=${variable:0:${#variable}-4}
echo "variable: $variable prefix: $prefix suffix: $suffix"
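With a hypothetical value plugged in, the whole split looks like this (the prefix uses the same substring expansion, counting from the front):

```shell
#!/bin/bash
variable="report_2024"               # made-up example value, length >= 5
suffix=${variable: -4}               # last 4 characters
prefix=${variable:0:${#variable}-4}  # everything before the suffix
echo "prefix: $prefix suffix: $suffix"
# prefix: report_ suffix: 2024
```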

macOS High Sierra – make sure your system is safe

If you want to make sure that your macOS High Sierra is clean (when it comes to malicious software), you can use a free tool (free as in beer and free as in speech – at the same time) called ClamAV.

You can get it in various ways: you can download its commercial version from the App Store as a paid release, you can install it using brew, you can download a binary from some place where you have no idea what’s really inside, you can install macOS Server (ClamAV comes bundled with it), etc.

However, you can also build it by yourself, directly from the sources. It’s a pain in the neck, I know, but you can be sure of what you are actually running. And you will learn that zlib’s author is a really brainy person. Go ahead, look him up on Wikipedia.

Anyway, let’s start. Estimated time to complete (depending on your system configuration): 1–2 hours.

I suggest creating a place where you can put all the sources and binaries. I suggest the following approach:

mkdir -p $HOME/opt/src
mkdir -p $HOME/opt/usr/local

In each step, we will download the source code of a given tool into $HOME/opt/src and then use

./configure --prefix=$HOME/opt/usr/local/$TOOL_NAME

to install it inside $HOME/opt.

1. You need PCRE – Perl Compatible Regular Expressions

cd $HOME/opt/src
curl -O
tar zxf pcre2-10.30.tar.gz
cd pcre2-10.30
./configure --prefix=$HOME/opt/usr/local/pcre2
make install

# You can also run `make check` before installing PCRE, but you may need to apply a patch
# source:
# To apply it, simply put the patch content inside a file named RunGrepTest.fix
# --- 8< --- CUT HERE --- 8< --- CUT HERE --- 8< --- CUT HERE --- 8< --- 
--- RunGrepTest	2017-07-18 18:47:56.000000000 +0200
+++ RunGrepTest.fix	2018-01-07 20:00:40.000000000 +0100
@@ -681,7 +681,7 @@
 # works.

 printf "%c--------------------------- Test N7 ------------------------------\r\n" - >>testtrygrep
-if [ `uname` != "SunOS" ] ; then
+if [ `uname` != "Darwin" ] ; then
   printf "abc\0def" >testNinputgrep
   $valgrind $vjs $pcre2grep -na --newline=nul "^(abc|def)" testNinputgrep | sed 's/\x00/ZERO/' >>testtrygrep
   echo "" >>testtrygrep
# --- 8< --- CUT HERE --- 8< --- CUT HERE --- 8< --- CUT HERE --- 8< ---
# and run patch tool
patch -b RunGrepTest RunGrepTest.fix

2. You need recent release of clang and llvm

cd $HOME/opt/src
curl -O
cd $HOME/opt/usr/local
tar xvf $HOME/opt/src/clang+llvm-3.6.2-x86_64-apple-darwin.tar.xz

You can’t use a more recent version of Apple’s LLVM :( This means that you may require two separate installations of LLVM. The maximum version of LLVM you can use here is 3.6. By contrast, compiling R with OpenMP support will require version 4.0.1 (take a look here: R 3.4, rJava, macOS and even more mess ;))

3. You need LibreSSL
(Special thanks for the tip!) I was always using OpenSSL, but recently I have had more and more issues with it while compiling stuff from sources.

cd $HOME/opt/src
curl -O
tar zxf libressl-2.6.4.tar.gz
cd libressl-2.6.4
export CXXFLAGS="-O3"
export CFLAGS="-O3"
./configure --prefix=$HOME/opt/usr/local/libressl
make check
make install

4. You need zlib

cd $HOME/opt/src
curl -O
tar zxf zlib-1.2.11.tar.xz
cd zlib-1.2.11
./configure --prefix=$HOME/opt/usr/local/zlib
make install

5. Build the stuff

cd $HOME/opt/src
git clone git://
cd clamav-devel
git checkout tags/clamav-0.100.0 -b rel/0.100
# you can also get stable version from here:
export CFLAGS="-O3 -march=nocona"
export CXXFLAGS="-O3 -march=nocona"
export CPPFLAGS="-I$HOME/opt/usr/local/pcre2/include \
  -I$HOME/opt/usr/local/libressl/include"
./configure --prefix=$HOME/opt/usr/local/clamav --build=x86_64-apple-darwin`uname -r` \
  --with-pcre=$HOME/opt/usr/local/pcre2 \
  --with-openssl=$HOME/opt/usr/local/libressl \
  --with-zlib=$HOME/opt/usr/local/zlib \
  --disable-zlib-vcheck
make install

6. Make sure to keep your database up to date


7. Now, you can scan your drive for viruses

cd $HOME
$HOME/opt/usr/local/clamav/bin/clamscan --log=$HOME/scan.log -ir $HOME

# if you want to scan your whole drive you need to run it as root
# I also suggest excluding /Volumes, unless you want to scan your Time Machine
# backups and all attached discs
# -i - report only infected files
# -r - recursive
# --log=$FILE - store output inside $FILE
# --exclude=$DIR - don't scan directory $DIR
cd $HOME
sudo $HOME/opt/usr/local/clamav/bin/clamscan --log=`pwd`/scan.log --exclude=/Volumes --exclude=/tmp -ir /

To compile ClamAV on macOS High Sierra I have used my old scripts, but many thanks go to:

macOS High Sierra and QuickTime Player issues

Recently (after upgrading to macOS High Sierra), I noticed that playing videos has some flaws. There are small glitches in QuickTime Player (or some libs) that make watching movies really painful.

You can notice these small, short sound breaks. It’s like somebody is pressing the pause button just for the audio, while the video is still rolling.

So far, I have no idea what is causing this one, but (at least) I have a workaround:

With VLC I can play the very same video material without any issues at all.

Reproducible research

Reproducible research is quite an important topic. Once you design, prepare, and run your experiment, you should make sure it will be possible to reproduce it in the future. Ideally, anyone should be able to perform exactly the same type of experiment.

Around 18 years ago, I started to develop G(enetic) A(lgorithm) B(ack) P(ropagation). At that time, the layout of a neural network (layers, biases, and connections between neurons) was usually taken for granted. To solve a problem using a NN, you had to either use a structure described in some scientific paper or design one on your own. I decided to test a slightly different approach: evolving the neural networks themselves.

Each neural network structure evolved inside a small, isolated population maintained by a genetic algorithm. After some period of time, the best-fitted individuals – the ones that could solve the problem most efficiently – had a chance to migrate. This way, the best structure for a given problem grew slowly without any external intervention. Each evolved neural network was supposed to perform two tasks:

– learn to solve the problem – using input patterns from the first set,
– solve the final problems – using input patterns from the second set.

In a sense, the whole process was completely unsupervised. The neural networks were completely random at the beginning, and over time the optimal solution emerged.

Recently, I decided to check whether the whole thing still works. To my surprise, getting from the archive (where all sources and input files were stored) to a running state was a really simple task. Of course, it took some time to get familiar with the documentation – yet again, to my surprise, it was quite good. It took some time to compile things (even though it worked almost out of the box), and it took some time to set the initial parameters for the application. Anyway, what surprised me most was the low cost of getting from zero to a running application after more than seventeen years! I was able to reuse the sample data, I was able to run experiments, and it simply worked as expected! The only difference I noticed was the time needed to evolve the optimal solution – the algorithm ran way faster.

That’s what I call research reproducibility. When somebody asks me:

“-Can I easily reproduce your experiment?”, I can give firm and confident answer,
“-Yes you can!”.

All I did was stick to standards and well-established practices.

A few well-chosen test cases and a few print statements in the code may be enough.

Some programs are not handled well by debuggers: multi-process or multi-thread programs, operating systems, and distributed systems must often be debugged by lower-level approaches. In such situations, you’re on your own, without much help besides print statements and your own experience and ability to reason about code.

— The Practice of Programming – Brian W. Kernighan and Rob Pike

Make sure to look here if you are using R for your research: Reproducible Research. You can also read a little bit about the role of software engineers in research here.

Calling shell process from Groovy script

If you want to run a shell script from Groovy, you can easily achieve that using either the ProcessBuilder class or the execute method.

If you want to access an environment variable (created inside Groovy) from the script, there are two ways:

– create a completely new environment – risky, as you can forget some essential variables
– create a new environment based on the parent’s – preferred, you simply add new variables

In both cases, you need to manipulate a Map<String, String> in order to add variables to the environment.

Let’s say we want to run the following script

8< --- CUT HERE --- CUT HERE --- CUT HERE --- CUT HERE ---


echo "Hello from script"

# variable called "variable" must be defined inside environment
echo $variable

8< --- CUT HERE --- CUT HERE --- CUT HERE --- CUT HERE ---

we can either use ProcessBuilder

8< --- CUT HERE --- CUT HERE --- CUT HERE --- CUT HERE --- 

// In this case I use ProcessBuilder class and inheritIO method
def script = "./"
def pb = new ProcessBuilder(script).inheritIO()
def variable = "Variable value"
Map<String, String> env = pb.environment()
env.put( "variable", variable )
Process p = pb.start()

8< --- CUT HERE --- CUT HERE --- CUT HERE --- CUT HERE ---

or we can use the execute method. It takes two arguments – the environment (a List or String[]) and the working directory.

8< --- CUT HERE --- CUT HERE --- CUT HERE --- CUT HERE ---

def script = "./"
def variable = "Variable value"

// we have to create a modifiable HashMap from the environment here!
// Note the result of System.getenv():
// "Returns an unmodifiable string map view of the current system environment."
myenv = new HashMap(System.getenv())
myenv.put("variable", variable )

// we have to convert to an array before calling execute
String[] envarray = myenv.collect { k, v -> "$k=$v" }

def std_out = new StringBuilder()
def std_err = new StringBuilder()

proc = script.execute( envarray, null )

proc.consumeProcessOutput(std_out, std_err)

println std_out

8< --- CUT HERE --- CUT HERE --- CUT HERE --- CUT HERE ---

Fortran and GNU Make

Building a binary from Fortran code organized in tree-based source directories may be a struggle. Usually, you want to put all objects inside a single directory while, at the same time, keeping the sources divided into logical parts (based on source location and modules). Let’s say you have the following source structure.

|-- Makefile
`-- src
    |-- a
    |   |-- a.f90
    |   `-- aa.F90
    |-- b
    |   |-- b.f90
    |   `-- bb.F90
    `-- main.f90

We have two sub-directories (with some logical elements of the code). In addition to that, there is a Makefile that will handle compiling, linking, and archiving sources into libraries. Code from the two sub-directories, “a” and “b”, will be packed into liba.a and libb.a respectively. We want to do that because we want to be able to reuse parts of the code somewhere else. In this case, liba.a will contain two modules that can be used in some other project. As for b, that’s not as obvious, as it depends on a. Anyway, it’s a good idea to encapsulate parts of the code into logical elements (libraries). This approach enforces proper API design and makes code more portable.

Now, to make things more complicated, source file a.f90 will declare a module called “a_module” and source file aa.F90 will declare a module “aa_module”. These modules will be used inside the source files b.f90 and bb.F90.

Let’s take a look at source codes themselves.

8< - CUT HERE --- CUT HERE -- src/a/a.f90 -- CUT HERE --- CUT HERE --

! Source code of file a.f90
module a_module
  contains
    subroutine a
      write (*,*) 'Hello a'
    end subroutine a
end module a_module

8< - CUT HERE --- CUT HERE -- src/a/aa.F90 -- CUT HERE --- CUT HERE -

! Source code of file aa.F90
module aa_module
  contains
    subroutine aa
      write (*,*) 'Hello aa'
    end subroutine aa
end module aa_module

8< - CUT HERE --- CUT HERE -- src/b/b.f90 -- CUT HERE --- CUT HERE --

! Source code of file b.f90
subroutine b
  use a_module
  write (*,*) 'Hello b'
  call a
end subroutine b

8< - CUT HERE --- CUT HERE -- src/b/bb.F90 -- CUT HERE --- CUT HERE -

! Source code of file bb.F90
subroutine bb
  use aa_module
  write (*,*) 'Hello bb'
  call aa
end subroutine bb

8< - CUT HERE --- CUT HERE -- src/main.f90 -- CUT HERE --- CUT HERE -

! Source code of file main.f90
program main
  write (*,*) 'Hello main'
  call b
  call bb
end program


All these sources will be compiled using the Makefile below. After the compilation is done, you will get the following structure:

|-- Makefile
|-- bin
|   |-- main
|   `-- main_lib
|-- include
|   |-- a_module.mod
|   `-- aa_module.mod
|-- lib
|   |-- liba.a
|   `-- libb.a
|-- obj
|   |-- a.o
|   |-- aa.o
|   |-- b.o
|   |-- bb.o
|   `-- main.o
`-- src
    |-- a
    |   |-- a.f90
    |   `-- aa.F90
    |-- b
    |   |-- b.f90
    |   `-- bb.F90
    `-- main.f90

To build everything, simply call

> make
> ./main
> make clean

And Makefile itself looks like this

8< --- CUT HERE --- CUT HERE -- Makefile -- CUT HERE --- CUT HERE ---

# Some helper variables that will make our life easier
# later on
F90         := gfortran
INCLUDE     := -Iinclude    # I am storing mod files inside "include"
MODULES_OUT := -Jinclude    # directory, but you may prefer "mod"
LIBS        := -Llib -la -lb

# Sources are distributed across different directories
# and src itself has multiple sub-directories
SRC_A           := $(wildcard src/a/*.[fF]90)
SRC_B           := $(wildcard src/b/*.[fF]90)
SRC_MAIN        := $(wildcard src/*.[fF]90)

# As we can have arbitrary source locations, I want to
# make a rule for each source location. Our aim here is
# to put all object files inside the "obj" directory and
# we want to flatten the structure
OBJ_A           := $(patsubst src/a/%, obj/%,\
                     $(patsubst %.F90, %.o,\
                       $(patsubst %.f90, %.o, $(SRC_A))))

OBJ_B           := $(patsubst src/b/%, obj/%,\
                     $(patsubst %.F90, %.o,\
                       $(patsubst %.f90, %.o, $(SRC_B))))

OBJ_MAIN        := $(patsubst src/%, obj/%, \
                     $(patsubst %.f90, %.o, $(SRC_MAIN)))

# this is just a dummy target that creates all the
# directories, in case they are missing
dummy_build_folder := $(shell mkdir -p obj bin include lib)

# There are two ways of building main  file.  We can do it
# by linking all objects, or,  we can  link with libraries
# these two targets will build main slightly different way
all: bin/main bin/main_lib

# This target builds main using object files
bin/main: $(OBJ_MAIN) $(OBJ_A) $(OBJ_B)
	@echo $^
	$(F90) -o $@ $^

# This one, uses libraries built from sources a and b
bin/main_lib: $(OBJ_MAIN) lib/liba.a lib/libb.a
	@echo $^
	$(F90) -o $@ $^ $(LIBS)

# Library "a" contains only codes from sub-tree "a"
lib/liba.a: $(OBJ_A)
	@echo $^
	ar -rs $@ $^

# Library "b" contains only codes from sub-tree "b"
lib/libb.a: $(OBJ_B) lib/liba.a
	@echo $^
	ar -rs $@ $^

# We have to provide information how to build objects
# from the sources
obj/%.o: src/**/%.[fF]90
	$(F90) $(MODULES_OUT) -o $@ -c $< $(INCLUDE)

# main is slightly different as it lays at different
# level
obj/%.o: src/%.[fF]90
	$(F90) $(MODULES_OUT) -o $@ -c $< $(INCLUDE)

# We can do some cleaning afterwards. Clean should leave
# the directory in such a state that only the sources and
# the Makefile are left there
clean:
	- rm -rf obj
	- rm -rf bin
	- rm -rf include
	- rm -rf lib


jshell and command line arguments

If you start your journey with jshell, you will notice that passing command-line arguments to your script may be a struggle. Typically, you would expect something like this

> jshell my_script.jsh arg1 'some other arg' yet_another arg

to work in such a way that the arguments are passed to your script. This is not the case here. The reason is that jshell takes a list of files as arguments and parses them on its own.

However, you can overcome this issue. And you can even make it very flexible thanks to Apache Ant. Make sure to get Ant (e.g. 1.10) and put it somewhere. Also, make sure to set ANT_HOME so it points to your Ant installation.
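Setting that up might look like this (the Ant version and install location below are assumptions – point the variables at wherever you unpacked Ant):

```shell
#!/bin/bash
# hypothetical Ant installation directory
export ANT_HOME="$HOME/opt/apache-ant-1.10.5"
# put Ant's launcher on the PATH, and note where ant.jar lives
export PATH="$ANT_HOME/bin:$PATH"
echo "ant.jar expected at: $ANT_HOME/lib/ant.jar"
```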

Then, you can do following inside script

8< -- CUT HERE --- CUT HERE ---- jshell_script_file ---- CUT HERE -- CUT HERE ---

import org.apache.tools.ant.types.Commandline

class A {
  public void main(String args[]) {
    for(String arg : args) { System.out.println(arg); }
  }
}

new A().main(Commandline.translateCommandline(System.getProperty("args")));


and you can call it like this

# -R passes arguments to the runtime. In this sample we pass -D, which sets the system
# property "args" to: 'Some arg with spaces' $SHELL $TERM some_other_arg
> jshell --class-path $ANT_HOME/lib/ant.jar \
  -R-Dargs="'Some arg with spaces' $SHELL $TERM some_other_arg" \
Some arg with spaces