From panic to zen in a few lines of code

The next story is a morphed version of a real life one.

Here we go:
There is this process, the evolution of which can be tracked with one parameter, call it P.
According to science, P has a normal low value, say Lo, and a normal high value, say Hi.

The process is carefully being tracked, and after a short period of time P breaks through the Hi barrier and looks set for a journey to the stratosphere.
Here’s the diagram.


“How far must this thing go before I should consider running out of here?” is the question the engineer gently put to the process specialists.
And quickly came the answers:

“OMG, this is disaster, we need to stop this.
No, yes, maybe.
I don’t know. OMG. OMG!”

This was the panic stage.

Now for the zen stage.

Take a deep breath, do some research, and find out that there are other limits to P – limits that the “specialists” had obviously forgotten:

  • A1 = 2.5 × Hi: mild problem
  • A2 = 5 × Hi: moderate problem
  • A3 = 20 × Hi: severe problem
  • Anything above A3: start running.
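Thresholds like these take only a few lines to encode. The engineer's actual code was Wolfram Language; here is an equivalent sketch in Python. The band interpretation – each level read as the upper bound of its band, so anything between Hi and A1 is at most a mild problem – is my assumption, not stated in the post.

```python
def problem_band(p, hi):
    """Place a process value P in one of the bands from the text.

    Assumption (mine): each level is the upper bound of its band,
    so anything between Hi and A1 = 2.5*Hi is at most a mild problem.
    """
    if p <= hi:
        return "normal"
    if p <= 2.5 * hi:      # A1
        return "mild problem"
    if p <= 5 * hi:        # A2
        return "moderate problem"
    if p <= 20 * hi:       # A3
        return "severe problem"
    return "start running"

print(problem_band(3.0, 1.0))   # a reading at 3x Hi -> moderate problem
```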

So, with the time series at hand, knowledge of the levels, and a few lines of code – Wolfram Language, by the way – the engineer produced this diagram.


And zen broke out.
Indeed, the forecast – black dotted line – and the 96% confidence band at the end of the planned process run indicated that P would stay well within the “moderate problem” band; there was nothing to fear.
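The post does not include the underlying code or data, but the idea – fit a trend, extrapolate to the end of the run, and wrap the forecast in a confidence band – can be sketched in a few lines. This hypothetical Python version uses synthetic data and a deliberately naive constant-width band; the actual analysis was done in Wolfram Language.

```python
import numpy as np

# Hypothetical stand-in for the tracked series (the real data is not in
# the post): a noisy upward drift in P that has broken through Hi.
rng = np.random.default_rng(0)
t = np.arange(60.0)
p = 1.0 + 0.05 * t + rng.normal(0.0, 0.2, t.size)

# Fit a linear trend and extrapolate to the end of the planned run.
slope, intercept = np.polyfit(t, p, 1)
t_future = np.arange(60.0, 120.0)
forecast = slope * t_future + intercept

# A deliberately naive 96% band: residual std times the two-sided normal
# quantile (z ~ 2.05). A real forecast would widen the band with horizon.
resid = p - (slope * t + intercept)
z = 2.05
lower = forecast - z * resid.std()
upper = forecast + z * resid.std()

print(f"end of run: {forecast[-1]:.2f} "
      f"(96% band {lower[-1]:.2f} .. {upper[-1]:.2f})")
```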

Closing questions
How many times were you and your team in a panic state when zen should have been the case?
More dangerous: how many times have you and your team experienced zen when you should have been, well, at least concerned?

  • A good understanding of the process,
  • A good dose of data science,
  • And a few lines of code – given the right tools:

The best recipe yet available for permanent zen.

Are you also sitting on a heap of untapped knowledge?

… that could eventually save tons of money or even a few lives now and then?

Look at what Dr Romke Bontekoe did with open source data on emergency calls logged by the Amsterdam fire brigade over the past years.
And wonder with me about the savings in money and perhaps even lives that could be achieved by properly exploiting his findings.

Watch the video here.


You can also interact with the reports here.

You wonder how you could get from here to there?
Simple: just ask.
We will get you there in no time, with a little help from the Wolfram Language.

Are we all going to drown soon?

A couple of years ago it was announced, without a shred of doubt, that the town where I live – Nieuwpoort, on the Belgian coast – would be flooded due to the rise of the sea level: 1 m over 100 years.

So I started digging into available open source data on tidal observations, in this case from the BODC – the British Oceanographic Data Centre. They cover 90+ years of observations, at tens of stations, one point per half hour.
Talk about a time series!

I analysed 5 433 000 data points.
I did this 10 times over, under pressure from Dr Romke Bontekoe, who insisted on varying a parameter – the spectrum width – in order to rule out the possibility of generating artefacts.
So in total 54 330 000 data points were handled.
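The artefact check – vary the spectrum width and verify that the peaks stay put – can be illustrated on synthetic data. Here is a hedged Python sketch (the real analysis used the BODC data and Wolfram Language), with the M2 tidal constituent at roughly 12.42 hours standing in for the real signal:

```python
import numpy as np

# Synthetic half-hourly "tide": the M2 constituent (period ~ 12.42 h) plus
# noise, standing in for the BODC observations, which are not reproduced here.
dt = 0.5                                      # hours between samples
t = np.arange(0.0, 24 * 365, dt)              # one year of half-hourly points
rng = np.random.default_rng(1)
h = np.cos(2 * np.pi * t / 12.42) + rng.normal(0.0, 0.5, t.size)

power = np.abs(np.fft.rfft(h)) ** 2
freqs = np.fft.rfftfreq(t.size, d=dt)         # cycles per hour

# The artefact check: smooth the spectrum with several widths and verify
# that the dominant peak stays at the same period for every width.
peaks = []
for width in (3, 11, 31):
    kernel = np.ones(width) / width
    smoothed = np.convolve(power, kernel, mode="same")
    i = np.argmax(smoothed[1:]) + 1           # skip the zero frequency
    peaks.append(1.0 / freqs[i])
    print(f"width {width:2d}: peak at period {peaks[-1]:.2f} h")
```

A peak that drifts or vanishes as the width changes would be a smoothing artefact; a genuine tidal line survives every width.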

The results are shown in the video.



I can sleep quietly: if sea level rises by 1m over 100 years, we will see a strong signal in about 30 years, … and I won’t be around anymore.

With thanks to

Fed up waiting for that one diagram?

Me too.
That’s the reason why I quit asking a long time ago.
In this video I show the results obtained after doing some analysis on a survey containing 16 716 entries and 228 answers per entry.

Not one but thousands of diagrams, made available in an interactive environment.
It took me 4 hours of work, or as we now say, 4 hours of think-coding.

Then I could spend all the time I wanted looking for the hidden secrets in the data.

That is what happens when you use the right tools … and learn to think-code.

And, by the way, the document, a notebook, is ready for distribution.

Prof. Harari(*), I do not agree with you.

No Prof. Harari, we will not all end up being useless.
At least not all of us.
AI is not going to overtake us, neither will robots.

Remember the early years of the 20th century, when Zeppelins were predicted to become part of everything human: travel, leisure, transportation, war. One prediction that came crashing down and burning. Literally.

Remember the early fifties, when nuclear power was predicted to appear everywhere: ships, planes, cars, households.
It turned out to be ‘a bit’ less pervasive.

I had this professor who advised us to have more confidence in natural stupidity than in artificial intelligence. This still holds.
Honestly, what will you do next time you take a plane and are welcomed with these words: “I am captain C3PO, artificially intelligent, and my success rate with landings is 85%.”
By the way, 85% is already quite good in AI.

AI – and robots – are just tools at our service, and will remain so.
But, indeed, given the nature of the new tools – some of them available already now – we do need to make a quantum leap in acquiring the skills to use them to our advantage, without overdoing it and, most importantly, without drowning in a flood of crap.
Maybe our biggest challenge for the near future will be to reinstate our old “enlightened” habit of thinking for ourselves, and to abandon our recently (re-)acquired sheepish and obedient style of thinking (better: non-thinking).
Let us embrace those tools that will help us in that respect.

One such tool has been around for quite a number of years now: the Wolfram Language, of which Mathematica is one of the best known expressions.
WL puts the focus on computing, and does so in almost every field of human activity. In so doing it contracts the time between problem formulation and solution in an impressive and unparalleled way.
I use a new verb to describe this experience: ‘to think-compute’, because you think the problem and compute its solution at the same time.

If, some time in the not-so-far future, there is a differentiator between useful and not-so-useful people, it will be between those who read, write and speak WL, and those who don’t.

So, there is at least one way to avoid becoming useless, which proves my point.


(*) Author of Sapiens and Homo Deus

This is a special one – only for the Dutch-speaking

This is an experiment.
It is still very much beta, so be careful when trying it out: it might explode.

It is a “Gentle Text Analyser”, code-named “BS Buster”.
Why “BS”? Well, that’s why!

It takes a text and puts you at the controls so you can analyse it in depth, looking for keywords, key sentences, context and correlations.
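As a rough illustration of the idea – the actual BS Buster is written in Wolfram Language and is far richer – here is a minimal key-sentence scorer in Python. The stopword list and the scoring rule (average frequency of a sentence's content words) are illustrative assumptions, not the analyser's real method.

```python
from collections import Counter
import re

# Assumed, minimal stopword list; a real analyser would use a proper one.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it",
             "we", "for", "that", "this", "will", "be", "on", "with"}

def key_sentences(text, top=2):
    """Return the `top` sentences whose content words are most frequent."""
    sentences = [s.strip() for s in re.split(r"[.!?]", text) if s.strip()]
    words = [w for w in re.findall(r"[a-z']+", text.lower())
             if w not in STOPWORDS]
    freq = Counter(words)

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        content = [w for w in tokens if w not in STOPWORDS]
        return sum(freq[w] for w in content) / max(len(content), 1)

    return sorted(sentences, key=score, reverse=True)[:top]

sample = ("The government will invest in education. "
          "Education funding will grow. "
          "Cats are nice.")
print(key_sentences(sample, top=1))
```

Keywords, context and correlations would be further layers on top of the same token counts.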

The text used in this experiment is the policy declaration of the current Flemish Government – hence the reference to the Dutch-speaking.
It is a very serious text, and so it deserves to be handled with deference.

This is the link to the start page.
You might want to start with the tutorial.

This experiment has been set up using the Wolfram Language and is deployed in the Wolfram Cloud.

If you would like to try the BS Buster on another text – legal stuff, contracts, reports of all kinds – feel free to contact me at
Same goes for feedback.


FEM for the rest of us

FEM – the Finite Element Method – is a numerical computing method for solving complicated engineering problems.
You don’t need to be a FEM specialist to benefit from it.
As long as you know your trade and can formulate your problem clearly, we can handle the computation.
And you can analyse the results, even modifying some design parameters.
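To give a concrete flavour of what a very simple FEM case can look like, here is a textbook 1D sketch in Python – not the example from the post – solving -u'' = 1 on (0, 1) with both ends fixed, where linear elements happen to reproduce the exact solution at the nodes:

```python
import numpy as np

# Model problem: -u'' = 1 on (0, 1), u(0) = u(1) = 0.
# Exact solution: u(x) = x * (1 - x) / 2.
n = 8                        # number of linear elements
h = 1.0 / n
nodes = np.linspace(0.0, 1.0, n + 1)

# Assemble the global stiffness matrix and load vector element by element.
K = np.zeros((n + 1, n + 1))
f = np.zeros(n + 1)
for e in range(n):
    K[e:e + 2, e:e + 2] += np.array([[1, -1], [-1, 1]]) / h
    f[e:e + 2] += h / 2      # element load for a constant source term of 1

# Apply the boundary conditions and solve for the interior nodes.
u = np.zeros(n + 1)
u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], f[1:-1])

exact = nodes * (1 - nodes) / 2
print("max nodal error:", np.abs(u - exact).max())
```

Real 2D/3D cases follow the same assemble-constrain-solve pattern, just with much bigger matrices.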

Here is a very simple example, just to get a taste of it (*).
Much more complicated cases will come soon.

This experiment has been set up using the Wolfram Language and is deployed in the Wolfram Cloud.

(*) Even if you know not a jot about engineering, you can still have a go!
This is a hint to my many friends who are not engineering-savvy.

Analytics for the rest of us

I partnered with some of the best data scientists of the Low Countries to start a new venture.
Yes, it’s about data analysis.

We only use the Wolfram Language and deploy our apps in the Wolfram Cloud.


  • Unmatched speed of development
  • Unmatched spectrum of functionality

Here is a first example.

These friends contributed to this example:

  • Chief egg head: Dr Romke Bontekoe
  • Helping hand: moi
  • Chief inspector of the works: Michiel van Mens

More will come soon.

For a New Project Culture

The 5 basic principles of DPC – Dynamic Project Control – are gaining traction.

  1. To be able to drive a project, we need feedback information on its behaviour.
  2. To obtain feedback information, we need to monitor – to track – the physical progress of the project.
  3. To be able to track a project, we need a schedule.
  4. Constructing a schedule: the process and the things to avoid.
  5. Monitoring: the process and the best practices.

The DPC method rests on a body of knowledge crafted over 30+ years of experience. On this page you will find a collection of papers and publications that embody this knowledge.

If you need more information, or are considering a training session, do not hesitate to contact me at


SMSHAR White Paper

Progress Velocity

Integrated Project Scheduling and Monitoring with Smartsheet and DPC

The white paper on SMSHAR has been issued. You can find it on this page.

This is an excerpt from the introduction:

This is a backgrounder on integrated project scheduling and monitoring, using Smartsheet to construct the project schedules and the DPC engine for automatic project tracking.

As soon as we discovered Smartsheet and in particular its Gantt chart, we decided to integrate it with our DPC project monitoring engine.
As soon as Smartsheet had its API up and running, we even went one step further: we organized a fully automated project reporting system.
By doing so, we were able to bring the workload for tracking and reporting down to zero – bar the minimum effort of entering the percent-complete values on the tasks in the Smartsheet Gantt.
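To illustrate how that one manual entry can feed an automated rollup, here is a hedged sketch. The weighting rule – a duration-weighted average of the entered percent-complete values – is my illustration of the idea, not the actual DPC formula, and the task list is invented.

```python
# Hypothetical task list as it might appear in a Smartsheet Gantt:
# (name, duration in days, percent complete as entered by the user).
tasks = [
    ("design", 10, 100),
    ("build", 30, 50),
    ("test", 10, 0),
]

# Illustrative rollup: weight each task's percent complete by its duration.
total_duration = sum(d for _, d, _ in tasks)
progress = sum(d * pc for _, d, pc in tasks) / total_duration

print(f"overall physical progress: {progress:.1f}%")
```

Everything downstream of that single entry – per-project reports and the portfolio view – can then be computed without further user input.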

On top of that, we recently deployed a portfolio report, also automated, that tracks virtually any number of projects within a portfolio. This assumes that every project, taken separately, is monitored by the DPC engine.
This is unprecedented.

Being able to monitor every single project of a portfolio, while retaining access to every detail of each individual progress report, is a position that has never been achieved before.
The automation – based on the Mathematica technology developed by Wolfram Research, coupled with the very accessible Smartsheet environment, and, as said before, with almost no workload at all for the user – makes it possible to run the system at a very low cost.

We are happy that we already have a 41-project portfolio in the system, monitored individually and as a set.
We call the system SMSHAR (pronounced “sm-shar”), which stands for “Smartsheet Automated Reporting”.

This paper provides a brief introduction on the essential aspects that together define SMSHAR.
It is assumed that the reader has a basic knowledge about project scheduling and monitoring.