A complete A-Z of key terms in non-seismic geophysical surveying.


An accelerometer is a tool that measures Proper Acceleration. Proper Acceleration is the acceleration (the rate of change of velocity) of a body in its own instantaneous rest frame; this is different from Coordinate Acceleration, which is acceleration in a fixed coordinate system. For example, an accelerometer at rest on the surface of the Earth will measure an acceleration due to Earth’s gravity, directly upwards (by definition) of g ≈ 9.81 m/s2. By contrast, accelerometers in free fall (falling toward the centre of the Earth at a rate of about 9.81 m/s2) will measure zero.

When two or more accelerometers are coordinated with one another, they can measure differences in gravity over their separation in space: in other words, the gradient of the gravitational field. Gravity Gradiometry is useful because absolute gravity is a weak effect and depends on the local density of the Earth, which can be highly variable.

An entity or property that differs from what is typical or expected, or from that predicted by a theoretical model. Can also refer to the difference between a measured value and the expected value of a physical property.

Anomalies are of great interest in resource exploration because they often indicate mineral, hydrocarbon, geothermal or other prospects and accumulations, as well as geological structures such as folds and faults.

The AGMA is an integral part of the eFTG instrument design and measures Scalar Gravity. The design of the AGMA is such that the gravimeter is located on the same stabilised platform as the gradiometer, meaning that the measurements are all on the same axis.


Band-pass refers to a type of filter or a frequency range that allows certain frequencies to pass through while attenuating or rejecting others. In magnetic and gravity surveys, a band-pass approach may be used to enhance the interpretation of specific geophysical anomalies. By applying a band-pass filter to the data, certain anomalies associated with target geological structures or mineral deposits can be enhanced, allowing geophysicists to focus on relevant features.
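As a toy illustration of the idea (not a production geophysical filter), a crude band-pass response can be built as the difference of two running means: the narrow mean suppresses wavelengths shorter than the band, and subtracting the wide mean removes the long-wavelength regional. The profile, window sizes, and wavelengths below are all invented for the sketch.

```python
import math

def running_mean(data, half_width):
    """Boxcar smoother; windows are truncated at the profile edges."""
    out = []
    for i in range(len(data)):
        lo = max(0, i - half_width)
        hi = min(len(data), i + half_width + 1)
        out.append(sum(data[lo:hi]) / (hi - lo))
    return out

def band_pass(data, narrow, wide):
    """Keep wavelengths between roughly 2*narrow and 2*wide samples."""
    return [a - b for a, b in zip(running_mean(data, narrow),
                                  running_mean(data, wide))]

# Synthetic profile: a linear regional trend plus a short-wavelength
# (40-sample) target anomaly of amplitude 0.3.
profile = [0.002 * x + 0.3 * math.sin(2 * math.pi * x / 40) for x in range(800)]

filtered = band_pass(profile, narrow=5, wide=200)
# Away from the edges the trend is removed while the target anomaly survives.
```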

In geology, the terms basement and crystalline basement are used to define the metamorphic or igneous rocks below a sedimentary basin or cover, or more generally any such rock below sedimentary rocks or sedimentary basins. In the same way, the sediments and/or sedimentary rocks above the basement can be called ‘sedimentary cover’.

This anomaly is named after the French mathematician Pierre Bouguer (1698–1758), who proved that gravitational attraction decreases with altitude. It is the value of gravitational attraction remaining after accounting for the theoretical gravity at the measurement latitude, the free-air correction (which compensates for height above sea level assuming there is only air between the measurement station and sea level), and the Bouguer correction (which accounts for the mass of rock between the station and the reference level).
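As a hedged numeric sketch of how these corrections combine (the station values below are invented for the example), the standard first-order formulas can be applied directly: a free-air gradient of about 0.3086 mGal per metre and an infinite-slab Bouguer factor of about 0.04193 mGal per metre per g/cm³.

```python
def free_air_correction(h_m):
    # ~0.3086 mGal per metre of elevation above the reference level
    return 0.3086 * h_m

def bouguer_correction(h_m, density=2.67):
    # infinite-slab approximation; density in g/cm3, result in mGal
    return 0.04193 * density * h_m

h = 500.0              # station elevation in metres -- hypothetical
g_observed = 979850.0  # observed gravity (mGal) -- hypothetical
g_normal = 980000.0    # latitude-dependent theoretical gravity (mGal) -- hypothetical

free_air_anomaly = g_observed - g_normal + free_air_correction(h)
simple_bouguer_anomaly = free_air_anomaly - bouguer_correction(h)
```

The density of 2.67 g/cm³ is the conventional reduction density for crustal rock; real processing would use a locally appropriate value.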

The Bureau Gravimetrique International (BGI) – WGM Release 1.0 (2012) World Gravity Map is the first release of high-resolution grids and maps of the Earth’s gravity anomalies (Bouguer, isostatic and surface free-air), computed at global scale in spherical geometry.

WGM2012 gravity anomalies are derived from the available Earth global gravity models EGM2008 and DTU10 and include 1′×1′ resolution terrain corrections derived from the ETOPO1 model that consider the contribution of most surface masses (atmosphere, land, oceans, inland seas, lakes, ice caps and ice shelves). These products have been computed by means of a spherical harmonic approach using theoretical developments carried out to achieve accurate computations at global scale.

Boresight adjustment, also known as Boresighting, is a procedure used to align or calibrate the pointing direction of a sensor or instrument with respect to a reference coordinate system. The term “boresight” refers to the line of sight or optical axis of the sensor. Boresighting is commonly performed in remote sensing and optical instrumentation.

The purpose of boresight adjustment is to ensure that the sensor accurately points in a known direction, allowing for precise measurements or observations. Inaccurate boresight alignment can lead to errors in data collection, misinterpretation of results, or misalignment of targeting systems.


In geophysics, a “Calculation Surface” can refer to a specific elevation or depth level within the Earth’s subsurface at which geophysical calculations are conducted. For example, in gravity or magnetic data processing, geophysicists might work with data on different calculation surfaces to interpret and model subsurface geological structures or anomalies.

Carbon capture and storage (CCS) is a process in which a relatively pure stream of carbon dioxide (CO2) from industrial sources is separated, treated, and transported to a long-term storage location. For example, the CO2 stream to be captured can result from burning fossil fuels or biomass. Usually, the CO2 is captured from large sources, such as a chemical plant or biomass plant, and then stored in an underground geological formation. The aim is to reduce greenhouse gas emissions and thus mitigate climate change.

The model, CRUST1.0, serves as the starting model in a more comprehensive effort to compile a global model of Earth’s crust and lithosphere, LITHO1.0. CRUST1.0 is defined on a 1-degree grid and is based on a new database of crustal thickness data from seismic studies as well as from receiver function studies. In areas where such constraints are still missing, for example in Antarctica, crustal thicknesses are estimated using gravity constraints.


The DC-3 aircraft, also known as the Douglas DC-3 or Dakota, was first developed and produced by the Douglas Aircraft Company in the 1930s, and the design has been extensively modified and improved over the decades, resulting in an extremely safe and robust aircraft. It is widely regarded as one of the most significant transport aircraft in aviation history and played a crucial role in revolutionising air travel.

The SI unit kilogram per cubic metre (kg/m3) and the cgs unit gram per cubic centimetre (g/cm3) are the most commonly used units for density. One g/cm3 is equal to 1000 kg/m3. One cubic centimetre (abbreviation cc) is equal to one millilitre.

The de Havilland Canada DHC-6 Twin Otter is a Canadian STOL (Short Takeoff and Landing) utility aircraft developed by de Havilland Canada. The aircraft’s fixed tricycle undercarriage, STOL capabilities, twin turboprop engines, high rate of climb as well as its versatility and manoeuvrability have made it a successful surveying platform, particularly in areas with difficult flying environments.

A DEM (Digital Elevation Model) is a 3D computer-rendered representation of elevation data used to represent the terrain of the Earth’s surface. A “global DEM” refers to a discrete global grid. DEMs are often used in Geographic Information Systems (GIS) and are the most common basis for digitally produced relief maps. A Digital Terrain Model (DTM) specifically represents the bare ground surface, while a Digital Surface Model (DSM) may also represent tree-top canopy or building roofs.

While a DSM may be useful for landscape modelling, city modelling and visualisation applications, a DTM is often required for flood or drainage modelling, land-use studies and geological applications.

Metatek also uses DTMs to provide a terrain correction for gravity surveying.

The term ‘Dipole’ generally refers to a magnetic dipole, which is a simple model used to represent the magnetic field produced by a magnetic source. In geophysics, magnetic dipoles are often used to describe the magnetic field generated by certain geological structures or magnetic materials.

In magnetic surveys, magnetic dipoles are often used as a simplified model to represent certain geological structures, such as ore bodies, magnetic anomalies, or magnetic bodies with well-defined magnetic properties. By understanding and characterising the magnetic field produced by magnetic dipoles, geophysicists can interpret and analyse the magnetic data obtained during surveys to gain insights into the subsurface geology and potential mineral deposits.


Equivalent Source processing, also known as the Equivalent Source Method (ESM) or Equivalent Layer technique, is a geophysical data processing and interpretation method used in potential field geophysics. It is primarily applied to magnetic and gravity data to model and interpret subsurface geological structures and density variations.

The concept behind the Equivalent Source processing is to represent the observed magnetic or gravity anomalies as if they were produced by a set of hypothetical point sources or “equivalent sources” located at a certain depth beneath the Earth’s surface. These equivalent sources mimic the observed anomalies, and by determining their distribution, depth, and strength, geophysicists can infer information about the subsurface geology.


A pre-survey study to assess the suitability of geophysical techniques to achieve the survey objectives. Different platforms, instruments, terrain, line spacing, etc. can be considered. Metatek’s feasibility studies usually take the form of one or more 2D sections developed into a full 3D earth model, to allow a realistic, simulated survey response to be calculated.

The term “Figure of Merit” (FOM) is used in various fields to quantify the performance, quality, or effectiveness of a particular system, process, or measurement. It is a numerical value or metric that provides an objective way to compare different alternatives, evaluate the efficiency of a system, or assess the success of an operation.

In geophysical exploration, FOM is employed to assess the success of surveys or measurements. For instance, in geophysical data processing, the FOM may be used to evaluate the signal-to-noise ratio or the resolution of the data.

In geophysics, the Free-air Gravity Anomaly, often simply called the Free-air Anomaly, is the measured gravity anomaly after a free-air correction is applied to account for the elevation at which a measurement is made. It does so by adjusting the measurements of gravity to what would have been measured at a reference level, which is commonly taken as mean sea level or the Geoid.

In the case of gravity and magnetic airborne surveying, a line plan is a survey design used to minimise terrain clearance throughout the survey area of interest (AOI). It is important in gravity surveying in particular to try to acquire data as close to the ground as is possible after taking into account all safety considerations. Typically, surveys are planned for a 120 m constant flight height, but the actual terrain clearance in hilly or mountainous areas will be higher. To achieve the optimal survey line plan in areas with topography, a 3D Drape Analysis is undertaken, which allows for a more flexible acquisition footprint to achieve the optimal terrain clearance within the constraints of the topography. The main survey transect lines are known as Cross Lines; Tie Lines are also planned and are primarily used to level the magnetics and the long-wavelength gravity data from the AGMA.

Forward modelling in 2D is a widely used and critical method for integrating potential field and seismic data. It is a convenient method to identify sources of anomalies where constraints exist. It also allows any number of simple model scenarios to be tested.

In physics, the Fourier transform (FT) is a transform that converts a function into a form that describes the frequencies present in the original function. The output of the transform is a complex-valued function of frequency. The term Fourier transform refers to both this complex-valued function and the mathematical operation. When a distinction needs to be made, the Fourier transform is sometimes called the frequency domain representation of the original function.

The Fourier analysis of gravity data is a geophysical data processing technique used to analyse and interpret gravity data obtained from gravity surveys. Fourier analysis is a mathematical method that decomposes complex functions or data sets into a series of simple sinusoidal functions, known as sine and cosine functions. In the context of gravity data, Fourier processing involves transforming the gravity measurements from the spatial domain (latitude, longitude, and elevation) to the frequency domain using Fourier transforms. Fourier analysis is particularly useful for detecting anomalies associated with geological features, such as faults, salt domes, or mineral deposits, which exhibit distinct spatial frequency patterns in the gravity field.
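As a minimal sketch of the spatial-to-frequency transformation (using a naive discrete Fourier transform and a synthetic profile invented for the example), two sources of different wavelength separate cleanly into different frequency bins:

```python
import cmath, math

def dft_amplitudes(samples):
    """Naive DFT; returns the amplitude at each frequency index 0..N-1."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n for k in range(n)]

# Hypothetical gravity profile sampled every 1 km: a 32 km wavelength
# regional component (amplitude 2) plus an 8 km wavelength local anomaly
# (amplitude 0.5).
n = 64
profile = [2.0 * math.sin(2 * math.pi * x / 32)
           + 0.5 * math.sin(2 * math.pi * x / 8) for x in range(n)]

amps = dft_amplitudes(profile)
# Frequency index k corresponds to a wavelength of n/k samples, so the
# regional appears at k = 2 and the local anomaly at k = 8.
dominant = max(range(1, n // 2), key=lambda k: amps[k])
```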

To ensure the accuracy and reliability of the data collected during airborne geophysical surveys, flight calibrations are performed before and after each survey flight. Flight calibrations involve various procedures to establish baseline measurements, account for instrument drift, and validate the data quality.

Full Tensor Gradiometers measure the rate of change of the gravity vector in all three perpendicular directions giving rise to a Gravity Gradient Tensor.


Conventional gravity measures ONE component of the gravity field, in the vertical direction (Gz) (left); full tensor gravity gradiometry measures ALL components of the gravity field (right).


A geographic information system (GIS) consists of integrated computer hardware and software that store, manage, analyse, edit, output, and visualise geographical data (usually geospatial in nature: GPS, remote sensing, etc.). The core of any GIS is a spatial database that contains representations of geographical phenomena, modelling their geometry (location and shape) and their properties or attributes. A GIS database may be stored in a variety of forms, such as a collection of separate data files or a single spatially enabled relational database.

Geophysics is a branch of natural science concerned with the physical processes and physical properties of the Earth and the use of quantitative methods for its analysis. Geophysicists, who usually study geophysics, physics, or one of the earth sciences at the graduate level, complete investigations across a wide range of scientific disciplines. The term geophysics classically refers to solid earth applications: Earth’s shape; its gravitational, magnetic, and electromagnetic fields; its internal structure and composition.

Gravity forward modelling (GFM) is the computation of the gravity field of a given mass distribution. Forward modelling of gravity is a technique used in geophysics to predict the gravitational field produced by subsurface structures or distributions of mass. This method is employed to understand the subsurface composition and geological features by simulating how gravity behaves due to various subsurface arrangements.

The unit of gravity gradient is the eotvos (abbreviated as E), which is equivalent to 10−9 s−2 (or 10−4 mGal/m). A person walking past at a distance of 2 metres would produce a gravity gradient signal of approximately one E. Large physical bodies such as mountains can give signals of several hundred eotvos.
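The "person at two metres" figure can be sanity-checked with a point-mass approximation (the body mass of 80 kg is an assumption for the example):

```python
# Back-of-envelope check of the gravity gradient of a nearby person,
# treated as a point mass (a rough approximation, of course).
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
mass = 80.0     # kg, assumed body mass
r = 2.0         # m, distance to the person

# Radial gradient of a point-mass field: |d/dr (G*M/r^2)| = 2*G*M/r^3
gradient_si = 2 * G * mass / r ** 3     # in s^-2
gradient_eotvos = gradient_si / 1e-9    # 1 E = 1e-9 s^-2, so ~1.3 E
```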

An instrument used to measure the acceleration due to gravity, or, more specifically, variations in the gravitational field between two or more points. The change from calling a device an “accelerometer” to calling it a “gravimeter” occurs at approximately the point where it has to make corrections for earth tides.

Gravity measurements are a reflection of the earth’s gravitational attraction, its centripetal force, tidal accelerations due to the sun, moon, and planets, and other applied forces.

Horizontal and Vertical Gradients, and filters based on them such as the analytic signal, tilt angle, theta map and so on, play an important role as edge detectors in the interpretation and analysis of gravity field data. Normalised Derivative methods are used to equalise signals from sources buried at different depths.
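As an illustration of one such filter, here is a minimal tilt-angle sketch over a hypothetical buried point source (the geometry and units are invented for the example; the tilt angle is the arctangent of the vertical derivative over the absolute horizontal derivative):

```python
import math

def gz(x, z, depth=1000.0, gm=1.0):
    """Vertical gravity at horizontal offset x and height z above ground,
    for a point source buried at the given depth (arbitrary units)."""
    dz = depth + z
    return gm * dz / (x * x + dz * dz) ** 1.5

def tilt_angle(x, h=1.0):
    # central finite differences for the two derivatives
    dgdx = (gz(x + h, 0.0) - gz(x - h, 0.0)) / (2 * h)  # horizontal derivative
    dgdz = (gz(x, -h) - gz(x, h)) / (2 * h)             # vertical derivative, positive downward
    return math.atan2(dgdz, abs(dgdx))

# The tilt reaches +90 degrees directly over the source and falls to
# negative values away from it, which is what makes it useful for mapping
# source locations and edges.
tilt_over_source = tilt_angle(0.0)
tilt_far_away = tilt_angle(3000.0)
```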

Gravity Gradiometry is the study and measurement of variations (anomalies) in the Earth’s gravity field. 

Gravity gradiometers are instruments which measure the spatial derivatives of the gravity vector. The most frequently used and intuitive component is the vertical gravity gradient, Gzz, which represents the rate of change of vertical gravity (gz) with height (z). It can be deduced by differencing the value of gravity at two points separated by a small vertical distance, l, and dividing by this distance.  The two gravity measurements are provided by accelerometers which are matched and aligned to an extremely high level of accuracy. Metatek uses the latest generation Lockheed Martin Gravity Gradiometers.
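A minimal numeric sketch of this differencing, using the analytic field of a hypothetical point mass so the finite-difference estimate can be checked against the exact gradient (mass, depth, and separation are all invented for the example):

```python
# Gzz estimated by differencing two vertically separated gravity readings.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2

def gz(r, mass):
    """Vertical gravity (m/s^2) at distance r directly above a point mass."""
    return G * mass / r ** 2

mass = 1.0e12   # kg, hypothetical dense body
depth = 2000.0  # m, distance from the lower accelerometer to the body
l = 1.0         # m, vertical separation of the two accelerometers

# difference the two measurements and divide by the separation
gzz = (gz(depth, mass) - gz(depth + l, mass)) / l   # in s^-2
gzz_eotvos = gzz / 1e-9                             # in eotvos

# For a point mass the exact vertical gradient is 2*G*M/r^3, so the
# finite-difference estimate can be verified directly.
exact = 2 * G * mass / depth ** 3
```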

The Gravity Gradient Tensor is the spatial rate of change of gravitational acceleration; as acceleration is a vector quantity with magnitude and three-dimensional direction, its spatial rate of change is a 3×3 tensor.
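For a point mass the tensor has a simple closed form, G_ij = G M (3 x_i x_j − r² δ_ij) / r⁵, which makes a compact sketch possible (the mass and position below are invented). Note that the tensor is symmetric and, outside the source, traceless (Laplace's equation), so only five of the nine components are independent:

```python
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2

def gravity_gradient_tensor(x, y, z, mass):
    """3x3 gradient tensor of a point mass at the origin, observed at (x, y, z)."""
    r2 = x * x + y * y + z * z
    r5 = r2 ** 2.5
    pos = (x, y, z)
    return [[G * mass * (3 * pos[i] * pos[j] - (r2 if i == j else 0.0)) / r5
             for j in range(3)] for i in range(3)]

# Hypothetical observation point 1000 m east, 2000 m north, 1500 m above
# a 1e12 kg point source.
T = gravity_gradient_tensor(1000.0, 2000.0, 1500.0, 1.0e12)
trace = T[0][0] + T[1][1] + T[2][2]   # ~0 outside the mass (Laplace's equation)
```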

Three-dimensional gravity inversion is an effective way to extract subsurface density distribution from gravity data. Machine-learning-based inversion is a newer data-driven method for mapping the observed data to a 3D model.

The Global Positioning System (GPS) is a satellite-based radio navigation system owned by the United States government and operated by the United States Space Force. It is one of the global navigation satellite systems (GNSS) that provide geolocation and time information to a GPS receiver anywhere on or near the Earth where there is an unobstructed line of sight to four or more GPS satellites.

It does not require the user to transmit any data, and operates independently of any telephonic or Internet reception, though these technologies can enhance the usefulness of the GPS positioning information. It provides critical positioning capabilities to military, civil, and commercial users around the world. Although the United States government created, controls and maintains the GPS system, it is freely accessible to anyone with a GPS receiver.

Ground-penetrating radar (GPR) is a geophysical method that uses radar pulses to image the subsurface. It is a non-intrusive method of surveying the sub-surface to investigate a variety of media, including rock, soil, ice, fresh water, and man-made structures. In the right conditions GPR can be used to detect subsurface objects, changes in material properties, and voids and cracks.

Individual lines of GPR data represent a sectional (profile) view of the subsurface. Multiple lines of data systematically collected over an area may be used to construct three-dimensional or tomographic images. Data may be presented as three-dimensional blocks, or as horizontal or vertical slices. Horizontal slices (known as “depth slices” or “time slices”) are essentially plan-view maps isolating specific depths.


Hyperspectral imaging collects and processes information from across the electromagnetic spectrum. The goal of hyperspectral imaging is to obtain the spectrum for each pixel in the image of a scene, with the purpose of finding objects, identifying materials, or detecting processes.

Hyperspectral sensors look at objects using a vast portion of the electromagnetic spectrum. Certain objects leave unique ‘fingerprints’ in the electromagnetic spectrum. Known as spectral signatures, these ‘fingerprints’ enable identification of the materials that make up a scanned object. For example, a spectral signature for oil helps geologists find new oil accumulations.


The main metric for estimating FTG data noise is the normalised inline sum:

Inline Sum = (I1 + I2 + I3) / 3

In this summation, the tensor signals that make up the inline components of differential curvature (I1, I2, I3) cancel, leaving a quantity that reflects the overall noise of the eFTG across its 3 Gravity Gradient Instruments (GGIs). The normalisation factor of 1 / 3 gives the noise per eFTG channel.

The IOGP is the petroleum industry’s global forum in which members identify and share best practices to achieve improvements in health, safety, the environment, security, social responsibility, engineering and operations.

The IAGSA promotes the safe operation of helicopters and fixed-wing aircraft on airborne geophysical surveys. Member companies conduct low-level survey flights and are committed to safe aircraft operations. The association develops recommended practices, serves as a centre for exchange of safety information and as a repository for operational statistics.

International Traffic in Arms Regulations (ITAR) is a United States regulatory regime to restrict and control the export of defence and military related technologies to safeguard U.S. national security. This essentially means that certain technologies, including the LHM gradiometers, are regulated under the ITAR rules because of military applications and therefore cannot be used to survey in certain countries. The Export Administration Regulations (EAR) are a set of United States export guidelines and prohibitions administered by the Bureau of Industry and Security which are associated with ITAR.

Intersection analysis is a spatial analysis technique used in geographic information systems (GIS) and other fields to examine the relationships between different spatial datasets and identify common features or areas of overlap. The primary goal of intersection analysis is to determine where spatial entities from two or more datasets intersect or overlap, providing valuable insights into their spatial relationships and characteristics.


Laplacian Gridding is a technique used in potential field geophysics, to interpolate irregularly spaced data onto a regular grid. It is commonly applied in gravity and magnetic data processing, where irregularly spaced measurements are collected during airborne or ground-based surveys.

The Laplacian gridding method derives its name from the Laplacian Operator, which is a mathematical operator used to calculate the divergence or curvature of a scalar field. In the context of gridding, the Laplacian operator is applied to the irregularly spaced data to estimate the values at grid nodes.
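A toy sketch of the idea (not a production gridding algorithm): hold the known stations fixed and repeatedly relax every other node toward the mean of its neighbours, which drives the discrete Laplacian toward zero and interpolates smoothly between the data. The station layout and values below are invented.

```python
def laplacian_grid(nrows, ncols, stations, n_iter=2000):
    """stations: {(row, col): value} of irregularly placed measurements."""
    grid = [[0.0] * ncols for _ in range(nrows)]
    for (r, c), v in stations.items():
        grid[r][c] = v
    for _ in range(n_iter):
        for r in range(nrows):
            for c in range(ncols):
                if (r, c) in stations:
                    continue  # honour the measured data exactly
                neighbours = [grid[rr][cc]
                              for rr, cc in ((r - 1, c), (r + 1, c),
                                             (r, c - 1), (r, c + 1))
                              if 0 <= rr < nrows and 0 <= cc < ncols]
                grid[r][c] = sum(neighbours) / len(neighbours)
    return grid

# Hypothetical readings (mGal) at four scattered stations on a 10x10 grid
data = {(1, 1): 10.0, (2, 8): 14.0, (7, 3): 6.0, (8, 8): 12.0}
grid = laplacian_grid(10, 10, data)
```

Because the converged surface is (discretely) harmonic, every interpolated value lies between the minimum and maximum station values, which is one reason Laplacian-style gridding produces no spurious overshoots.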

LiDAR is a method for determining ranges by targeting an object or a surface with a laser and measuring the time for the reflected light to return to the receiver. LiDAR may operate in a fixed direction (e.g., vertical) or it may scan multiple directions, in which case it is known as LiDAR scanning or 3D laser scanning, a special combination of 3-D scanning and laser scanning. LiDAR has terrestrial, airborne, and mobile applications.

Airborne LiDAR is when a laser scanner, while attached to an aircraft during flight, creates a 3-D point cloud model of the landscape. This is currently the most detailed and accurate method of creating Digital Elevation Models.

Lockheed Martin Corporation (LHM) is an American aerospace, arms, defence, information security, and technology corporation operating worldwide. It was formed by the merger of Lockheed Corporation with Martin Marietta in March 1995. It is headquartered in the Washington, D.C. area in the US. It also manufactures the most advanced Gravity Gradiometers in the world.

During the 1970s, as an executive in the US Department of Defense, John Brett initiated the development of the gravity gradiometer to support the Trident 2 system. A committee was commissioned to seek commercial applications for the Full Tensor Gradient (FTG) system that was being deployed on US Navy Ohio-class Trident submarines designed to aid covert navigation. As the Cold War ended, the US Navy released the classified technology and opened the door for full commercialisation of the technology.

There are several types of Lockheed Martin gravity gradiometers currently in operation: the 3D Integrated Full Tensor Gravity Gradiometer (iFTG) and the Enhanced Full Tensor Gravity Gradiometer (eFTG), deployed in either a fixed-wing aircraft or a vessel; and the FALCON gradiometer (a partial tensor system rather than a full tensor system, deployed in a fixed-wing aircraft or a helicopter).

The 3D FTG system contains three Gravity Gradiometry Instruments (GGIs), each consisting of two opposing pairs of accelerometers arranged on a spinning disc with measurement direction in the spin direction. The full gravity gradient tensor is sensed by an Umbrella Configuration of three rotating Gravity Gradiometry Instruments (GGIs).

eFTG Gravity Gradiometer. Image courtesy of Lockheed Martin.
FTG & eFTG GGI Configuration


The tesla (symbol: T) is the unit of magnetic flux density (also called magnetic B-field strength) in the International System of Units (SI). The nanotesla: nT (one nanotesla equals 10−9 tesla), is commonly used in geophysical applications.

A magnetometer is an instrument that measures the magnetic field or magnetic dipole moment. Different types of magnetometers measure the direction, strength, or relative change of a magnetic field at a particular location. Magnetometers are widely used for measuring the Earth’s magnetic field, in geophysical surveys, to detect magnetic anomalies of various types.

Magnetometers assist mineral explorers both directly (i.e., gold mineralisation associated with magnetite, diamonds in kimberlite pipes) and, more commonly, indirectly, such as by mapping geological structures conducive to mineralisation (i.e., shear zones and alteration haloes around granites). In oil and gas exploration, the delineation of faults, shear zones, volcanics and other geological structures is important in de-risking reservoir and seal.

The magnetic analytic signal is a mathematical technique used in magnetic data processing and interpretation. It is employed to enhance the detection and delineation of magnetic anomalies associated with subsurface geological structures or magnetic sources.

Magnetic data collected in airborne or ground-based magnetic surveys often contain complex anomalies caused by various geological features, such as faults, dykes, ore bodies, and other magnetic sources. The magnetic analytic signal is a useful tool for highlighting the locations and edges of these magnetic sources, making them stand out from the background magnetic field.

The magnetic analytic signal effectively locates the derived anomaly over the magnetic source irrespective of the direction of magnetisation. It operates using Cartesian derivatives and is therefore very high-pass in nature and very much dominated by the shallower sources. The magnetic analytic signal has several important properties that make it valuable in geophysical interpretation.

In the context of magnetic surveying or geophysical exploration, a Magnetic Base Station refers to a specific fixed location during a survey where the temporal variations of the Earth’s magnetic field are monitored during magnetic surveys. These variations, caused by electric currents in the upper atmosphere, are superimposed on the magnetic survey data. The magnetic base station data is used to correct the survey data for these variations. When the Magnetic Base Station records excessive temporal disturbances in the magnetic field, surveying is suspended until the disturbances subside again to within acceptable limits.

The magnetic declination and inclination are essential properties of the Earth’s magnetic field. The predominant source of the magnetic field is the dynamo effect inside the planet. The dynamo effect is a naturally occurring phenomenon in which heat from the Earth’s core produces a series of electric currents, which creates a magnetic field. A three-dimensional vector represents the magnetic field of the Earth at any location. An ordinary compass is sufficient to measure the direction of the magnetic field.

Magnetic Declination of the Earth, or magnetic variation, is the angle formed between the magnetic north of the compass and the true geographical north. The value of the declination changes with location and time. It is represented by the letter D or the Greek letter δ.

The value of declination is positive if magnetic north lies to the east of true north and negative if it lies to the west.

Isogonic Lines join points on the Earth’s surface which share the same declination value. Agonic Lines are lines along which the value of declination is zero.

Magnetic Inclination, or Magnetic Dip, is the angle formed between the earth’s surface and the planet’s magnetic lines. The magnetic inclination can be observed when a magnet is trying to align itself with the earth’s magnetic lines.

Since the earth is not flat, the magnetic field lines are not parallel. Hence, the compass needle’s north end will be either upward (southern hemisphere) or downward (northern hemisphere). The degree of inclination varies with the location on the earth.

Magnetic reduction to pole and magnetic reduction to the equator are data processing techniques used in magnetic surveys, to transform the measured magnetic field data to equivalent values that would be observed if the measurements were made at the either magnetic poles or the magnetic equator, respectively. These transformations are applied to simplify the interpretation of the magnetic anomalies by transforming a dipolar anomaly in the absence of any remanent magnetisation, into a monopolar anomaly over the magnetic source (akin to a gravity anomaly). 

In geophysics, a magnetic anomaly is a local variation in the Earth’s magnetic field resulting from variations in the chemistry or magnetism of the rocks. Mapping of these variations over an area is valuable in detecting structures obscured by overlying material.

Magnetic anomalies are generally a small fraction of the total magnetic field. The total field ranges from 25,000 to 65,000 nanoteslas (nT). To measure anomalies, magnetometers need a sensitivity of 10 nT or less. Several types of magnetometer are used to measure magnetic anomalies. Metatek uses proton precession magnetometers, which measure the strength of the field but not its direction, so they do not need to be oriented. Each measurement takes a second or more, so this type is used in most ground surveys except for boreholes and high-resolution magnetic gradiometer surveys. For aeromagnetic surveys, Metatek uses optically pumped magnetometers, which use alkali gases (most commonly rubidium and caesium) and have high sample rates and sensitivities of 0.001 nT or less.

Airborne magnetometers detect the change in the Earth’s magnetic field using sensors attached to the aircraft in the form of a “stinger” or historically by towing a magnetometer on the end of a cable.

Magnetotellurics (MT) is an electromagnetic geophysical method for inferring the earth’s subsurface electrical conductivity from measurements of natural geomagnetic and geoelectric field variation at the Earth’s surface.

Investigation depth ranges from 100 m below ground by recording higher frequencies down to 200 km or deeper with long-period soundings. Proposed in Japan in the 1940s, and in France and the USSR during the early 1950s, MT is now an international academic discipline and is used in exploration surveys around the world.

Commercial uses include hydrocarbon (oil and gas) exploration, geothermal exploration, carbon sequestration, mining exploration, as well as hydrocarbon and groundwater time-lapse monitoring. Research applications include experimentation to further develop the MT technique, sub-glacial water flow mapping, and earthquake precursor research.

MEMS Accelerometers are used in gradiometers to measure the gradient of the gravitational field. They are microscopic devices incorporating both electronic and moving parts, typically constructed of components between 1 and 100 micrometres in size (i.e., 0.001 to 0.1 mm).

MEMS became practical once they could be fabricated using modified semiconductor device fabrication technologies, which include moulding and plating, wet and dry etching, electrical discharge machining and other technologies capable of manufacturing small devices. They merge at the nanoscale into nanoelectromechanical systems (NEMS) and nanotechnology.

MEMS accelerometers are increasingly present in portable electronic devices and video-game controllers, to detect changes in the positions of these devices.


In satellite remote sensing and imaging, PMC stands for “Post Mission Compensation.” PMC is a technique used to correct the geometric distortions in satellite imagery caused by various factors during the image acquisition process. These factors may include spacecraft attitude variations, orbital changes, atmospheric effects, and sensor characteristics.

Satellite imagery is acquired as the satellite orbits the Earth, and during this process, the position and orientation of the satellite can change slightly due to factors like atmospheric drag, gravitational perturbations, and control system errors. These variations can lead to geometric distortions in the acquired images, affecting the accuracy of the spatial information in the imagery.

Post Mission Compensation is a vital step in satellite image processing to ensure that the imagery accurately represents the Earth’s surface and that spatial measurements and analyses based on the imagery are reliable.

Power Spectral Density (PSD) describes how the power of a signal or time series is distributed with frequency; the closely related Energy Spectral Density describes how the signal’s energy is distributed.
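A minimal illustration of the idea, using a naive DFT periodogram on a synthetic 8 Hz tone (real processing would use an FFT library; the sample rate and tone are arbitrary choices for the example):

```python
import cmath
import math

def periodogram(x, fs):
    """Naive one-sided DFT periodogram: a basic PSD estimate (units^2 / Hz)."""
    n = len(x)
    freqs, psd = [], []
    for k in range(n // 2 + 1):
        xk = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        freqs.append(k * fs / n)
        psd.append(abs(xk) ** 2 / (fs * n))
    return freqs, psd

fs = 64.0  # sample rate, Hz
signal = [math.sin(2 * math.pi * 8.0 * t / fs) for t in range(64)]  # 8 Hz tone
freqs, psd = periodogram(signal, fs)
peak_hz = freqs[psd.index(max(psd))]
print(peak_hz)  # 8.0 -- the power is concentrated at the tone's frequency
```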


A quality management system is a collection of business processes focused on consistently meeting customer requirements and enhancing their satisfaction. It is aligned with an organization’s purpose and strategic direction. It is expressed as the organizational goals and aspirations, policies, processes, documented information, and resources needed to implement and maintain it. QMS has tended to converge with sustainability and transparency initiatives, as both investor and customer satisfaction and perceived quality are increasingly tied to these factors.


In potential field geophysics, regional filtering, also known as Low-Pass filtering, is a common data processing technique used to separate the long-wavelength or large-scale variations in gravity or magnetic field data from the short-wavelength or small-scale anomalies. The purpose of regional filtering is to isolate and remove the background or regional field, leaving behind the residual anomalies that are of geological interest.

The regional field represents the overall distribution of mass or magnetisation in the Earth’s subsurface and is characterised by long-wavelength variations. It is caused by the Earth’s lithospheric structure, major geological features, and other large-scale density or magnetic variations.

On the other hand, residual anomalies, or short-wavelength variations, are the shallower localised features caused by subsurface geological structures, mineral deposits, or other smaller-scale magnetic sources.

Remanent Magnetization or Remanence or Residual Magnetism is the magnetization left behind in a ferromagnetic material (such as iron) after an external magnetic field is removed. Colloquially, when a magnet is ‘magnetized’, it has remanence.  The remanence of magnetic materials provides the magnetic memory in rocks and is used as a source of information on the Earth’s past magnetic field in Paleomagnetism. The remanent vector adds to the induced magnetic field vector which can complicate the interpretation of the measured magnetic field.

Residuals are the components of the measured field left after the ‘regional’ component has been removed, leaving an anomaly of interest to the interpreter. The terms residual and regional are relative, and the chosen cut-off wavelength is a function of the geology under investigation.
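As a rough illustration of regional–residual separation, a simple moving average can act as a crude low-pass filter (real workflows typically use spectral filtering or polynomial surface fitting; the gravity profile here is entirely synthetic):

```python
import math

def moving_average(values, window):
    """Centred moving average (edges clamped): a crude low-pass filter."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

# Synthetic profile: long-wavelength linear regional trend plus a
# narrow (short-wavelength) anomaly centred at station 100.
measured = [0.05 * i + 2.0 * math.exp(-((i - 100) ** 2) / 20.0)
            for i in range(200)]

regional = moving_average(measured, 51)                 # long wavelengths
residual = [m - r for m, r in zip(measured, regional)]  # anomaly of interest
```

The residual peaks where the narrow anomaly sits, while the smooth trend has been absorbed into the regional component.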

In the context of aeromagnetic and aerogravity surveying, “run in” and “run out” refer to specific flight segments or manoeuvres that an aircraft performs during the data acquisition process.

Run In: The “run in” is the initial flight segment at the beginning of an aeromagnetic or aerogravity survey line. During the run-in phase, the survey aircraft typically approaches the survey area from outside the survey region. It may involve flying along a designated flight line or track to reach the starting point of the survey grid. The purpose of the run-in is to establish a consistent and controlled flight path before commencing the data acquisition over the actual survey area.

Run Out: The “run out” is the final flight segment at the end of a survey line. After completing the data acquisition over the survey grid, the aircraft flies along specific flight lines or tracks to exit the survey area. The run-out phase allows the aircraft to leave the survey region in a systematic and controlled manner, ensuring that all necessary data is collected and no data gaps are left in the coverage.

During both the run-in and run-out phases, it is essential to maintain a stable and consistent flight pattern to ensure the quality and accuracy of the data. Flight parameters such as altitude, speed, and heading are carefully controlled to meet the survey specifications and ensure that the data collected during these phases aligns seamlessly with the data collected over the survey grid.


A safety management system is designed to manage safety risk in the workplace, occupational safety being defined as the reduction of risk to a level that is as low as is reasonably practicable (ALARP) to prevent people getting hurt.

An SMS provides a systematic way to continuously identify and monitor hazards and control risks while maintaining assurance that these risk controls are effective. SMS can be defined as a businesslike approach to safety. It is a systematic, explicit and comprehensive process for managing safety risks. As with all management systems, a safety management system provides for goal setting, planning, and measuring performance. A safety management system is woven into the fabric of an organization. It becomes part of the culture, the way people undertake their jobs.

In physics, forces (vector quantities) are given as the derivative (gradient) of scalar quantities called potentials. In pre-Einstein classical physics, gravitation was treated in the same way: the (vector) gravitational force is derived from a scalar potential field that depends on the masses of the particles. Newtonian gravity is therefore called a scalar theory. The gravitational force depends on the distance r between the massive objects (more precisely, between their centres of mass); mass is a parameter, and space and time are unchangeable.
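The force-as-gradient-of-a-potential relationship can be checked numerically for Newtonian gravity. This is purely a didactic sketch; G, Earth’s mass and Earth’s mean radius below are standard textbook values:

```python
# Newtonian gravity as the gradient of a scalar potential, phi(r) = -GM/r.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24   # mass of the Earth, kg
R = 6.371e6    # mean radius of the Earth, m

def potential(r):
    """Scalar gravitational potential at distance r from the centre of mass."""
    return -G * M / r

# The radial gravitational acceleration is the derivative of the potential;
# estimate it with a central finite difference and compare to the analytic GM/r^2.
h = 1.0  # finite-difference step, metres
g_numeric = (potential(R + h) - potential(R - h)) / (2 * h)
g_exact = G * M / R ** 2  # ~9.8 m/s^2 at the surface
print(g_numeric, g_exact)
```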

In very simple terms, Scalar Gravity can be considered the vertical effect of gravity as opposed to the measurement of the gravity gradient away from its source.

Seismic methods utilise elastic energy created by natural and artificial sources to create an image of the subsurface. Seismic waves are recorded on geophones (receivers). Seismic methods are split into three categories, reflection, refraction and surface wave, based on the physical property of the waves being considered. The reflection method looks at reflected energy from sharp boundaries to determine contrasts in density and velocity. Controlled-source seismology is commonly used to map subsurface structure (faults, salt domes, anticlines and other geologic traps) in petroleum-bearing rocks.

2D Seismic tends to refer to a group of 2D seismic lines acquired individually, as opposed to the multiple closely spaced lines acquired together that constitute 3D seismic data.

Typical receiver line spacing in 3D seismic can range from 300 m to over 600 m, and the typical distance between shotpoints and receiver groups is 25 m (offshore and internationally) and 34 to 67 m (onshore USA). The resultant data set can be ‘cut’ or ‘sliced’ in any direction but still display a well sampled seismic section. The original seismic lines are called in-lines. Lines displayed perpendicular to in-lines are called crosslines. In a properly migrated 3D seismic data set, events are placed in their proper vertical and horizontal positions, providing more accurate subsurface maps than can be constructed on the basis of more widely spaced 2D seismic lines, between which significant interpolation might be necessary. 3D seismic data provides more detailed information about fault distribution and subsurface structures and nowadays is the primary dataset for determining where to drill for oil and gas accumulations.

The Shuttle Radar Topography Mission (SRTM) is an international research effort that obtained digital elevation models on a near-global scale from 56°S to 60°N, to generate the most complete high-resolution digital topographic database of Earth prior to the release of the ASTER GDEM in 2009.

SRTM consisted of a specially modified radar system that flew on board the Space Shuttle Endeavour during the 11-day STS-99 mission in February 2000.

The SRTM 3 or Version 3 (2013) dataset, also known as SRTM Plus, is void-filled with ASTER GDEM and USGS GMTED2010 and has been available in global 1-arcsecond (30 meter) resolution since 2014.

A Stabilised or Inertial Platform, also known as a Gyroscopic Platform, is a system using gyroscopes to maintain a platform in a fixed orientation in space despite the movement of the vehicle that it is attached to.

To allow an FTG to function optimally in airborne surveys, it is very important to locate the instrument on an Aircraft Interface Platform (a Stabilised or Inertial Platform). The stabilised platform comprises a series of servo-driven nested gimbals that minimise the rotational motion of the host aircraft, which would otherwise introduce ‘noise’ into the data, thereby minimising the influence of aircraft motion on the measurements.


During the survey flight, the aircraft flies along predetermined Tie Lines over known geophysical anomalies or calibration points on the ground. These tie lines provide data that can be compared to known values, helping to validate the accuracy of the survey data and calibrate the instruments accordingly.


In the context of geophysics, a ‘transform’ refers to a set of mathematical techniques or data processing methods used to analyse and interpret magnetic and gravity survey data. These geophysical surveys are conducted to measure and map variations in the Earth’s magnetic and gravitational fields caused by subsurface features, such as rocks with different magnetic properties or densities.


The concept of wavelength is fundamental in understanding the relationship between the horizontal extent (residual) of a geophysical anomaly and its actual depth or subsurface source. In gravity and magnetic surveys, anomalies are variations in the measured field values that can be caused by subsurface geological structures, density variations, or magnetic sources.

Wavelength is a measure of the spatial extent of an anomaly, typically represented as the distance between two consecutive peaks or troughs of the anomaly waveform. In potential field data, the wavelength is inversely proportional to the frequency of the anomaly. Short-wavelength anomalies represent rapid changes in the field values over short distances, while long-wavelength anomalies show gradual changes over larger distances.

This is why a 50 km residual does not necessarily correspond to a 50 km depth in potential field geophysics:

Shallow vs. Deep Sources: The depth of a subsurface source (e.g., a geological structure or magnetic body) is not directly related to the wavelength of the anomaly it causes. In potential field surveys, anomalies of various wavelengths can be caused by sources at different depths: shallow sources can produce both short- and long-wavelength anomalies, whereas deep sources cannot produce short-wavelength anomalies. The terms shallow, deep, long and short in this context are relative.

Effects of Topography: In gravity surveys, the topography of the Earth’s surface generates variations in the gravity field which are superimposed on the acquired data, leading to complex patterns in the data.

Data Resolution: The resolution of the geophysical survey data also affects the observed anomaly wavelength. Low-resolution data may not capture short wavelength features accurately, leading to a loss of detail in the anomalies and anomaly aliasing.
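As a back-of-envelope illustration of aliasing, the spatial Nyquist criterion says that the shortest anomaly wavelength a set of parallel survey lines can resolve is twice the line spacing. The helper names and the 500 m spacing below are illustrative, not a survey specification:

```python
def shortest_resolvable_wavelength(line_spacing_m):
    """Spatial Nyquist criterion: features shorter than 2x the spacing alias."""
    return 2.0 * line_spacing_m

def is_aliased(anomaly_wavelength_m, line_spacing_m):
    """True if the anomaly is too short-wavelength for this line spacing."""
    return anomaly_wavelength_m < shortest_resolvable_wavelength(line_spacing_m)

# A 500 m flight-line spacing resolves wavelengths of 1000 m and longer:
print(shortest_resolvable_wavelength(500.0))  # 1000.0
print(is_aliased(800.0, 500.0))               # True: an 800 m feature aliases
print(is_aliased(1500.0, 500.0))              # False: a 1500 m feature is safe
```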

The Bureau Gravimetrique International (BGI) – WGM Release 1.0 (2012) World Gravity Map is the first release of high-resolution grids and maps of the Earth’s gravity anomalies (Bouguer, isostatic and surface free-air), computed at global scale in spherical geometry.

WGM2012 gravity anomalies are derived from the available Earth global gravity models EGM2008 and DTU10 and include 1′ × 1′ resolution terrain corrections derived from the ETOPO1 model that consider the contribution of most surface masses (atmosphere, land, oceans, inland seas, lakes, ice caps and ice shelves). These products have been computed by means of a spherical harmonic approach using theoretical developments carried out to achieve accurate computations at global scale.
