The treasure-hunt for (vessel) data


In my last blog post I mentioned three of the major issues we encountered while looking at the digitization processes of the shipping industry. This post addresses one of them. It’s the story of the hunt for vessel data.

The quest for the holy grail
Soon after we installed the first PBXs on the vessels, we wanted to collect operational data, see how the systems performed and fine-tune them further. Don’t forget that this was part of our R&D effort to build a top-of-its-class, and working, VoIP system over the FleetBroadband TCP/IP layer.
One of the key ingredients of our “secret VoIP sauce” was that we had modified several parameters of the VoIP codec we used, and based on the satellite link conditions and the feedback we were getting from our clients we kept changing parameters, trying to fine-tune it.
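To give a flavour of what that tuning looked like, here is a minimal, purely illustrative sketch of choosing codec settings from measured link conditions. The parameter names and thresholds are hypothetical, not the actual values we used:

```python
# Hypothetical sketch: pick VoIP codec settings from measured link conditions.
# Thresholds and parameter names are illustrative only.

from dataclasses import dataclass

@dataclass
class LinkStats:
    rtt_ms: float        # round-trip time measured over the sat link
    jitter_ms: float     # inter-arrival jitter
    loss_pct: float      # packet loss percentage

def pick_codec_params(stats: LinkStats) -> dict:
    """Return a codec configuration suited to the current link quality."""
    if stats.loss_pct > 5 or stats.jitter_ms > 60:
        # Poor link: low bitrate, longer frames -> fewer packets, less overhead.
        return {"bitrate_bps": 8000, "frame_ms": 60, "fec": True}
    if stats.rtt_ms > 900:
        # High latency but otherwise stable: keep frames long to save bandwidth.
        return {"bitrate_bps": 12000, "frame_ms": 40, "fec": True}
    # Good link: favour voice quality.
    return {"bitrate_bps": 24000, "frame_ms": 20, "fec": False}

print(pick_codec_params(LinkStats(rtt_ms=750, jitter_ms=80, loss_pct=2.0)))
```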

It soon became evident that, due to the varying communication conditions, we needed a system that could “auto-tune” itself and adapt to the ever-changing sat link.
We started logging and then analyzing all the critical parameters of our system. Due to the high bandwidth costs, all the logging and analysis was done on the system itself, on the vessel, which in the end became our first “edge” analytics deployment.
When you try to log and analyze any real-time system (and that includes not just communications but also other high-speed, real-time data), one of the key factors is correct timing. You need to time-stamp every data packet you send and check that time-stamp on the receiving side to identify delay, jitter, out-of-order packets and other critical parameters.
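As an illustration (not our production code), here is a small sketch of the per-packet bookkeeping this implies: given (sequence, send, receive) timestamps, compute one-way delay, inter-arrival jitter and out-of-order packets. It assumes the two clocks are already synchronized, which is exactly the assumption that broke down for us.

```python
# Illustrative sketch: per-packet timing analysis from timestamped packets.

def analyze(packets):
    """packets: list of (seq, send_ts, recv_ts) tuples, times in seconds."""
    delays = [recv - send for _, send, recv in packets]

    # RFC 3550-style jitter: smoothed absolute change in transit time.
    jitter = 0.0
    for prev, curr in zip(delays, delays[1:]):
        jitter += (abs(curr - prev) - jitter) / 16.0

    # Count packets whose sequence number goes backwards on arrival.
    out_of_order = sum(
        1 for (s1, _, _), (s2, _, _) in zip(packets, packets[1:]) if s2 < s1
    )

    return {
        "mean_delay_ms": 1000 * sum(delays) / len(delays),
        "jitter_ms": 1000 * jitter,
        "out_of_order": out_of_order,
    }

sample = [(1, 0.000, 0.310), (2, 0.020, 0.345), (4, 0.060, 0.395), (3, 0.040, 0.400)]
print(analyze(sample))
```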

What we realized early on was that the clocks between our shore datacenter and the machines on the vessels were drifting apart.
The reason: the varying temperatures of the “datacenter” (a euphemism in most cases) on the vessels affected the crystal generating the operating clock frequency. That translated into a very small drift in time synchronization, which over time kept growing until it added (or subtracted) tens of milliseconds of difference between the clocks (a rough sketch of estimating such a drift follows below).
This happens on land too, but on land it’s easy to fix: you set up a GPS-synced time server and you (try to) keep a steady temperature in the datacenter.
Having access to a GPS in our case would also solve another “mythical” problem. It would allow us to have the exact position of the vessel!
Another “holy grail” on the path to the digitization of the shipping industry.
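Here is that sketch: a simplified, hypothetical way to estimate drift as a linear slope (in parts per million) from periodic offset measurements against a reference clock. The numbers are made up for illustration.

```python
# Hypothetical sketch: estimate clock drift from periodic offset measurements
# against a reference clock (least-squares slope of offset vs. elapsed time).

def estimate_drift(samples):
    """samples: list of (elapsed_s, offset_s); returns drift in parts per million."""
    n = len(samples)
    mean_t = sum(t for t, _ in samples) / n
    mean_o = sum(o for _, o in samples) / n
    num = sum((t - mean_t) * (o - mean_o) for t, o in samples)
    den = sum((t - mean_t) ** 2 for t, _ in samples)
    return (num / den) * 1e6  # slope in s/s -> ppm

# One offset reading per hour over a day; a 2 ppm drift adds ~7 ms per hour.
day = [(h * 3600, h * 3600 * 2e-6) for h in range(25)]
print(f"{estimate_drift(day):.1f} ppm")  # ~2.0 ppm, i.e. ~173 ms per day
```

Even a couple of parts per million grows into tens of milliseconds within a few hours, which is plenty to confuse packet-level timing analysis.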


Data is available, but it’s stove-piped in hard-to-get, closed or proprietary systems.


We asked for access to any GPS data onboard the vessel, which we thought would be trivial to get, and which in the end turned out to be anything BUT trivial.
Connecting as a listener to the navigational NMEA bus, which had all the GPS info we wanted (and more), was a definite “NO” from the client, as just the thought of having to explain to the Class society during an inspection what that device was doing there gave them the “chills”.
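For context, the NMEA 0183 bus carries plain-text sentences, and the time and position we were after live in the standard RMC sentence. A minimal, illustrative parser (simplified, with checksum validation omitted) might look like this:

```python
# Minimal, illustrative parser for an NMEA 0183 RMC sentence (checksum not verified).

def parse_rmc(sentence: str):
    fields = sentence.split("*")[0].split(",")
    if not fields[0].endswith("RMC") or fields[2] != "A":   # "A" = valid fix
        return None

    def to_degrees(value: str, hemisphere: str) -> float:
        # NMEA encodes latitude as ddmm.mmmm and longitude as dddmm.mmmm.
        raw = float(value)
        degrees = int(raw // 100)
        minutes = raw - degrees * 100
        decimal = degrees + minutes / 60.0
        return -decimal if hemisphere in ("S", "W") else decimal

    return {
        "utc": fields[1],                      # hhmmss.ss
        "lat": to_degrees(fields[3], fields[4]),
        "lon": to_degrees(fields[5], fields[6]),
        "sog_knots": float(fields[7]),
        "date": fields[9],                     # ddmmyy
    }

print(parse_rmc("$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A"))
```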

Option 1 is dead, let’s go for option 2: install our own GPS onboard.
That did not work either, for a number of reasons. Installing it inside the bridge, close to a spot with a view of the sky, and running cables from the bridge to the “datacenter” was a task no one at our client wanted to handle (multiply this by the number of their vessels and you get the idea why).
Installing an external GPS would cost close to $2,000 (device and installation) and would probably have meant waiting for a dry-docking, which made neither financial nor practical sense.

So we ended up back at square one.
The data was there, but for a number of reasons we could not access it; nor, as we found out later, could any other system or vendor. The result is lots of out-of-sync clocks, time-stamping data with so much difference that it is practically impossible to correlate it later.
We kept trying to find a way to get to that GPS data, but every time we tried, we hit another snag.

Finally we had our “aha” moment.
We remembered that the auto-tracking satellite dish has a GPS in order to find its own position and point in the right direction in the sky. It also feeds this info to the FleetBroadband terminal. So all we needed was a way to get that info from the terminal.
It “looked” easy, but it was not. None of the people we asked knew how to access that information, and the companies making the terminals never replied to our requests. There were some clues on the internet, but nothing we could use on a production system.

We decided to stage a weekend “hackathon” at the office and find a way to get that information. (That hackathon deserves its own blog post, on how secure these satellite modems are #not.)
So finally, after several months of searching and banging our heads on “bulkheads” and walls, we had our “holy grail”: timing and position information!
Now logging could work correctly, and our “auto-tuning” started doing its work.

True to our open source spirit and the pay-it-forward mentality that comes with it, we opened access to this information to other vendors on the vessel, who did not believe they could have it without paying a huge “license fee” or some other “ransom”.
What we asked of them (with varying degrees of success, I admit) was to pay it forward themselves and open their data and services, not just to us but to anyone with a legitimate use case.

Why this story, you might be wondering (if you have read this far)?
This story is meant to show how difficult it is to access ANY data on a vessel, whether because of physical limitations, regulations, complexity, cost, or the mentality of vendors that “closed” data is better and makes more money. It is exactly the opposite.
Data thrives in the presence of other data.

It’s unfortunate, but in shipping we experience a severe data thirst while swimming in an ocean of data.


Let me give you one more example of how “thirsty” for data all the stakeholders in shipping are.

The case of AIS.

The automatic identification system (AIS) is an automatic tracking system used on ships and by vessel traffic services. It’s a situational-awareness tool for a ship’s crew, augmenting its radar. Every ship transmits its position, speed, heading and destination (plus a few other things) on a VHF frequency, so that other ships nearby “understand” its intentions (basically, where it’s going), even if it’s not seen by radar.

It’s a service that was designed for a very specific purpose, to notify nearby vessels and authorities, as its maximum range is limited by its specs to roughly 10-20 nautical miles.
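To put that range in perspective, here is a small illustrative check of whether one AIS position report is within typical VHF range of another, using the standard haversine great-circle distance (the coordinates below are made up):

```python
from math import radians, sin, cos, asin, sqrt

# Illustrative helper: great-circle distance between two positions (haversine).

def distance_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles."""
    earth_radius_nm = 3440.065
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * earth_radius_nm * asin(sqrt(a))

def within_vhf_range(own, other, max_nm=20.0):
    """Rough check against the ~10-20 nm AIS/VHF horizon mentioned above."""
    return distance_nm(*own, *other) <= max_nm

# Two positions roughly 15 nautical miles apart.
print(within_vhf_range((36.10, -5.40), (36.30, -5.20)))
```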

Given the need for information (or the lack of it), answering a simple question like “where is my vessel now?” was not possible until a few years ago, when a large industry started evolving around AIS.
Companies like MarineTraffic have built an extensive network of shore-based AIS listening stations, collecting, cleaning and aggregating AIS data from any place they can.

And it did not stop there: once a vessel gets out of range of the shore stations (crossing the Atlantic, for example), it would seem to enter a “black hole”, disappearing and then reappearing on the other side of the ocean.
It was (and still is) such a need, for what seems to be a simple piece of information, that companies have built and launched AIS tracking satellites to cover the “black holes” and provide global AIS tracking coverage, with investment in these satellite constellations running into the billions.

The thirst for shipping data took a system that was designed for local vessel situational awareness, stretched it technically as far as it could go, and produced a solution that is now used not only by shipping companies to track their vessels, but also by traders and hedge funds to estimate global market conditions, by bunker suppliers to estimate the stock they should keep in an area (based on the expected number of vessels), and in a myriad of other applications.

From the data-lake to the data-swamp

There are other sources of information on vessels that offer a lot of potential for turning simple data points into actionable insights, especially if they get “fused” together. The problem is that vendors, whether by ignorance or by policy, make it very hard to access them. Multiply this by the number of different systems and the number of vessels a company has, and soon your beautiful data-lake has turned into a data-swamp.

If you think about it from a higher level, adopting new technologies like satellite communications, Industrial IoT, analytics and edge computing can transform vessel data into insights and offer a greater degree of visibility into vessel and cargo position and performance. This, in turn, produces tangible metrics that can be shared with the customers of shipowners and managers, as well as with other stakeholders.
But adopting new technologies isn’t the goal in itself; it is important to focus on costs, revenues and risk reduction, identify the areas needing improvement, and then look at the tools that will help achieve that.
These are points I am going to address in a future post.