Surveying is expensive and complex, often costing in excess of 100,000 € per day, and the requirements – and constraints – of every survey are unique. In this time of uncertainty in the offshore industry, it is more important than ever that data is not only fit for purpose, but also accessible and readable by as many different users as possible, so that the output value per Euro invested is maximised.
What questions do you want to answer? What do you expect your data to show, confirm, or assess? Deciding what resolution your data should have is a key part of survey preparation, and there is always a trade-off: do you image a larger area with lower swath overlap and lower point density, or do you maximise the resolution in a smaller area? Each option has its pros and cons, and defining in advance how the quality of your data will be assessed is key to ensuring that the objectives of your survey can be met.
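As a rough illustration of this trade-off, the sketch below estimates sounding density for a generic multibeam setup under a flat-seabed assumption; the beam count, ping rate, swath angle, and vessel speed are hypothetical placeholders, not properties of any particular sensor.

```python
import math

def sounding_density(depth_m, swath_angle_deg, n_beams, ping_rate_hz, speed_ms):
    """Rough soundings per square metre, assuming a flat seabed."""
    # Swath width grows with depth for a fixed opening angle.
    swath_width_m = 2 * depth_m * math.tan(math.radians(swath_angle_deg / 2))
    across_track = n_beams / swath_width_m    # soundings per metre across track
    along_track = ping_rate_hz / speed_ms     # pings per metre along track
    return across_track * along_track

# Same hypothetical sensor: shallower water concentrates the beams in a
# narrower swath, raising density but shrinking the strip covered per line.
print(f"{sounding_density(30, 120, 512, 10, 2.0):.1f} soundings/m^2 at 30 m")
print(f"{sounding_density(100, 120, 512, 10, 2.0):.1f} soundings/m^2 at 100 m")
```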
The required resolution informs the type of sonar device that you need to deploy, as well as frequency ranges, line spacing, and vessel speed over ground. These parameters all need to be defined and communicated clearly prior to data acquisition. Deciding on and communicating weather limits in advance – factors such as maximum wave height or wind speed – also helps operators ensure that the acquired data meets the requirements.
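For a first-order feel for how these parameters interact, the sketch below derives line spacing from depth, swath angle, and the chosen overlap, and turns that into a rough time-on-line estimate; the flat-seabed geometry and the turn overhead factor are simplifying assumptions, and all input values are illustrative.

```python
import math

def line_spacing_m(depth_m, swath_angle_deg, overlap):
    """Line spacing that achieves the requested swath overlap (fraction 0-1)."""
    swath_m = 2 * depth_m * math.tan(math.radians(swath_angle_deg / 2))
    return swath_m * (1 - overlap)

def survey_hours(area_km2, spacing_m, speed_kn, turn_overhead=1.15):
    """Very rough time-on-line: total line kilometres over speed, plus turns."""
    line_km = area_km2 * 1e6 / spacing_m / 1000
    return line_km / (speed_kn * 1.852) * turn_overhead

spacing = line_spacing_m(40, 120, 0.25)   # ~104 m at 40 m depth, 25 % overlap
hours = survey_hours(25, spacing, 4.5)
print(f"spacing {spacing:.0f} m -> about {hours:.0f} h for 25 km^2")
```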
Data transfer via hard drive is inefficient: drives can be incompatible across operating systems, and shipping physical media adds risk and potential delays. This puts the most important steps – data processing and decision-making – on hold. Transferring data via secure SFTP or cloud platforms ensures that it arrives promptly and in a complete state following acquisition, speeding up the whole process.
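A minimal transfer sketch using the third-party Python library paramiko (assumed installed); the host, user, key path, and file names are hypothetical. Recording a checksum alongside the upload lets the receiving side confirm the file really did arrive in a complete state.

```python
import hashlib
import paramiko  # third-party SSH/SFTP library, assumed installed

def sha256(path, chunk_size=1 << 20):
    """Checksum recorded with the upload so completeness can be verified."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

local_file = "survey_lines/line_0042.all"   # hypothetical raw sonar file
print("sha256:", sha256(local_file))        # log this next to the transfer

client = paramiko.SSHClient()
client.load_system_host_keys()              # only connect to known hosts
client.connect("sftp.example.com", username="survey",
               key_filename="/home/survey/.ssh/id_ed25519")
sftp = client.open_sftp()
sftp.put(local_file, "/incoming/line_0042.all")
sftp.close()
client.close()
```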
The use of consistent geodetic metadata is critical for preventing inconsistencies and avoiding potentially costly mistakes. This is especially relevant in regions where multiple Coordinate Reference Systems (CRS) are in use, such as the North Sea, where the ED50 and WGS84 datums are both used in UTM zone 31N. An erroneous choice of CRS during processing can offset a dataset by a hundred metres or more: not such a problem if quickly realised; disastrous if unnoticed. To avoid this, the CRS needs to be defined in advance of the survey and saved correctly and clearly in the (meta)data.
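The offset can be made visible with the third-party pyproj library (assumed installed) by projecting the same geographic position into both UTM 31N variants – WGS84 (EPSG:32631) and ED50 (EPSG:23031); the grid difference is roughly the error introduced by labelling coordinates with the wrong datum. The test position is arbitrary.

```python
import math
from pyproj import Transformer  # third-party geodesy library, assumed installed

lon, lat = 3.0, 54.0  # an arbitrary North Sea position (WGS84 degrees)

# Project the same physical point into both UTM zone 31N variants.
wgs84_utm = Transformer.from_crs("EPSG:4326", "EPSG:32631", always_xy=True)
ed50_utm = Transformer.from_crs("EPSG:4326", "EPSG:23031", always_xy=True)

e1, n1 = wgs84_utm.transform(lon, lat)   # WGS84 / UTM 31N
e2, n2 = ed50_utm.transform(lon, lat)    # ED50 / UTM 31N

# The grid difference is the error made by mislabelling one datum as the other.
offset_m = math.hypot(e1 - e2, n1 - n2)
print(f"datum offset at this position: {offset_m:.0f} m")
```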
Clear and standardised metadata is critical for making the most of the data and ensuring reusability. Moreover, well-documented data and accurate metadata make meeting reporting requirements more straightforward. The first step following receipt of the data should always be to validate the metadata – check the vessel tracks, timestamps, and sensor calibration logs. Make sure you know the CRS and that the data is correctly projected.
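A first pass over such checks is easy to automate. The sketch below assumes the metadata has been exported to a plain dictionary; the field names and schema are hypothetical, standing in for whatever your delivery specification actually requires.

```python
from datetime import datetime, timezone

# Hypothetical required fields for an incoming survey dataset.
REQUIRED = ("vessel", "survey_id", "crs_epsg", "start_utc", "end_utc")

def validate(meta: dict) -> list[str]:
    """Return a list of problems; an empty list means the basics look sane."""
    problems = [f"missing field: {k}" for k in REQUIRED if k not in meta]
    if problems:
        return problems
    if not str(meta["crs_epsg"]).isdigit():
        problems.append(f"CRS is not an EPSG code: {meta['crs_epsg']!r}")
    start = datetime.fromisoformat(meta["start_utc"])
    end = datetime.fromisoformat(meta["end_utc"])
    if end <= start:
        problems.append("end_utc is not after start_utc")
    if end > datetime.now(timezone.utc):
        problems.append("end_utc lies in the future")
    return problems

print(validate({
    "vessel": "RV Example", "survey_id": "S-2025-001", "crs_epsg": 32631,
    "start_utc": "2025-06-01T06:00:00+00:00",
    "end_utc": "2025-06-01T18:30:00+00:00",
}))
```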
With our experienced team of hydrographers, geophysicists, big data experts, and software developers, we guide you from ocean data survey requirements to data management and ocean big data processing. With more than ten years of experience working with ocean, maritime, and geospatial big data, we offer services and technologies that make ocean big data handling easier. Together with our clients, we assess current ocean data needs and challenges and jointly develop solutions to real operational problems.