News from north.io

Best Practices for Seabed Surveys

Written by Admin | Oct 13, 2025

Surveying is expensive and complex, often costing in excess of 100,000 € per day, and the requirements and constraints of every survey are unique. In this time of uncertainty in the offshore industry, it is more important than ever that data is not only fit for purpose but also accessible and readable by as many different users as possible, so that the output value per Euro invested is maximized.

How can you make sure your data is fit for purpose?

Define quality metrics upfront 

What questions do you want to answer? What do you expect your data to show, confirm, or assess? Deciding what resolution your data should have is a key part of survey preparation, and there is always a trade-off: Do you image a larger area with lower swath overlap and lower point density, or do you maximize the resolution in a smaller area? Each option has its pros and cons, and defining how to assess the quality of your data in advance is key to ensuring that the objectives of your survey can be met.
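The coverage side of this trade-off can be put in rough numbers. The sketch below is a simplified estimate, assuming a flat seabed and nominal multibeam geometry; the function name and all parameter values are illustrative, not from any survey standard:

```python
import math

def coverage_rate_km2_per_hour(depth_m: float, swath_angle_deg: float,
                               speed_knots: float, overlap: float) -> float:
    """Rough area-coverage rate for a multibeam survey.

    Assumes a flat seabed, so swath width = 2 * depth * tan(angle / 2).
    `overlap` is the fractional overlap between adjacent lines (0 to 1):
    more overlap means denser data but a lower coverage rate.
    """
    swath_width_m = 2 * depth_m * math.tan(math.radians(swath_angle_deg / 2))
    effective_width_m = swath_width_m * (1 - overlap)
    speed_m_per_hour = speed_knots * 1852  # 1 knot = 1852 m/h
    return (effective_width_m * speed_m_per_hour) / 1e6  # km^2 per hour

# Same vessel and sonar; only the line overlap differs:
fast = coverage_rate_km2_per_hour(depth_m=50, swath_angle_deg=120,
                                  speed_knots=8, overlap=0.2)
dense = coverage_rate_km2_per_hour(depth_m=50, swath_angle_deg=120,
                                   speed_knots=8, overlap=0.6)
```

Doubling the data density (here, raising overlap from 20 % to 60 %) halves the area you can cover per day, which is exactly the kind of cost implication worth settling before the vessel leaves port.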

Specify survey parameters clearly 

The required resolution informs the type of sonar device that you need to deploy, as well as frequency ranges, line spacing, and vessel speed over ground. These parameters all need to be defined and communicated clearly prior to data acquisition. Deciding on and communicating weather condition limits in advance – factors such as maximum wave height or wind speed – also helps operators to make sure that the acquired data meets the requirements. 
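One way to define and communicate these parameters unambiguously is to capture them in a single machine-readable record agreed before mobilisation. The sketch below is only an illustration; the field names are assumptions, not an industry schema:

```python
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class SurveySpec:
    """Acquisition parameters agreed before the survey starts.

    Field names are illustrative; frozen=True so the agreed spec
    cannot be silently modified during the campaign.
    """
    sonar_type: str            # e.g. "multibeam echosounder"
    frequency_khz: tuple       # operating frequency range (low, high)
    line_spacing_m: float      # planned distance between survey lines
    max_speed_knots: float     # vessel speed over ground limit
    max_wave_height_m: float   # weather abort threshold
    max_wind_speed_kn: float   # weather abort threshold

# Example values only -- every survey's numbers will differ:
spec = SurveySpec(
    sonar_type="multibeam echosounder",
    frequency_khz=(200, 400),
    line_spacing_m=75.0,
    max_speed_knots=6.0,
    max_wave_height_m=2.0,
    max_wind_speed_kn=25.0,
)
```

A record like this can be serialised (via `asdict`) and shared with the operator, so the weather limits and line plan travel with the contract rather than living in an email thread.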

Standardize data transfer 

Data transfer via hard drive is inefficient. Drives can be incompatible with different operating systems, and physically shipping them adds risk and potential delays. This puts the most important steps, data processing and decision-making, on hold. Transferring data over secure SFTP or cloud platforms ensures that data arrives promptly and in a complete state following acquisition, speeding up the whole process.
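Whatever the transfer channel, completeness should be verified on arrival. A minimal sketch, assuming the sender publishes SHA-256 checksums alongside the files (the `verify_manifest` helper and manifest layout are hypothetical, not part of any transfer platform):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so multi-gigabyte survey lines
    never need to fit in memory at once."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_manifest(data_dir: Path, manifest: dict) -> list:
    """Compare received files against a {filename: sha256} manifest.

    Returns the names of files that are missing or corrupted;
    an empty list means the transfer is complete and intact.
    """
    problems = []
    for name, expected in manifest.items():
        path = data_dir / name
        if not path.exists() or sha256_of(path) != expected:
            problems.append(name)
    return problems
```

Running a check like this immediately after transfer catches truncated or missing files before processing starts, rather than days later.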

Enforce clear CRS usage 

The use of consistent geodetic metadata is critical for preventing inconsistencies and avoiding potentially costly mistakes. This is especially relevant in regions where multiple Coordinate Reference Systems (CRS) are in use, such as the North Sea, where the ED50 and WGS84 datums are both used in UTM zone 31N. An erroneous choice of CRS during processing can offset a dataset by several hundred metres: Not such a problem if quickly realised; disastrous if unnoticed. To avoid this, the CRS needs to be defined in advance of the survey and recorded correctly and clearly in the (meta)data.
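A simple guard at the start of the processing pipeline can enforce this. In the sketch below the `check_crs` helper and metadata layout are assumptions, but the EPSG codes are real: EPSG:32631 is WGS 84 / UTM zone 31N and EPSG:23031 is ED50 / UTM zone 31N, the two systems that coexist in the North Sea example above:

```python
# The CRS agreed before the survey; every incoming dataset must declare it.
PROJECT_CRS = "EPSG:32631"  # WGS 84 / UTM zone 31N

def check_crs(metadata: dict) -> None:
    """Reject datasets with a missing or mismatched CRS declaration.

    EPSG:23031 (ED50 / UTM zone 31N) is also valid in this zone,
    which is exactly why the two get confused: both produce
    plausible-looking coordinates, offset by a couple of hundred metres.
    """
    declared = metadata.get("crs")
    if declared is None:
        raise ValueError("dataset declares no CRS -- reject before processing")
    if declared != PROJECT_CRS:
        raise ValueError(
            f"declared CRS {declared} != project CRS {PROJECT_CRS}; "
            "re-project or correct the metadata first"
        )
```

Failing loudly here turns the "disastrous if unnoticed" case into the "quickly realised" one.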

Standardize and validate metadata 

Clear and standardized metadata is critical for making the most of the data and ensuring reusability. Moreover, well-documented data and accurate metadata make meeting reporting requirements more straightforward. The first step following receipt of the data should always be to validate the metadata – check the vessel tracks, timestamps, and sensor calibration logs. Make sure you know the CRS, and that the data is correctly projected.
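Parts of this validation step can be automated. A minimal sketch, where the required-field list and `validate_metadata` helper are illustrative (a real project would validate against its agreed metadata standard), flagging missing fields and inconsistent timestamps:

```python
from datetime import datetime

# Illustrative minimum set; a real schema would be far richer.
REQUIRED_FIELDS = {"vessel", "sensor", "crs", "start_time", "end_time"}

def validate_metadata(meta: dict) -> list:
    """Return human-readable problems; an empty list means the record passed."""
    problems = ["missing field: " + f for f in sorted(REQUIRED_FIELDS - meta.keys())]
    try:
        # Timestamps must be ISO 8601 and the line must end after it starts.
        start = datetime.fromisoformat(meta["start_time"])
        end = datetime.fromisoformat(meta["end_time"])
        if end <= start:
            problems.append("end_time is not after start_time")
    except (KeyError, ValueError):
        problems.append("timestamps missing or not ISO 8601")
    return problems
```

Run over every incoming record, a check like this surfaces gaps in the metadata on day one, while the acquisition team can still fill them.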

Where does north.io come into this? 

With our experienced team of hydrographers, geophysicists, big data experts, and software developers, we guide you from ocean data survey requirements to data management and ocean big data processing. With more than ten years of experience working with ocean big data, maritime data, and geospatial data, we offer services and technologies that make ocean big data handling easier. Together with our clients, we assess current ocean data needs and challenges and jointly develop solutions to real operational problems.

Reach out to schedule a personal discovery call.