
WO2025117377A1 - Advanced data acquisition and multimodal error correction for enhanced aerospace safety and navigation - Google Patents


Info

Publication number
WO2025117377A1
WO2025117377A1 (application PCT/US2024/057100)
Authority
WO
WIPO (PCT)
Prior art keywords
data stream
sensor
temporal data
carrier board
processing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/057100
Other languages
French (fr)
Inventor
Sep MAKHSOUS
Gokul NATHAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Washington
Original Assignee
University of Washington
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Washington
Publication of WO2025117377A1
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/03Cooperating elements; Interaction or communication between different cooperating elements or between cooperating elements and receivers
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/40Correcting position, velocity or attitude
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks

Definitions

  • FIG.1 is an exemplary schematic diagram of a sensor pod system of a vehicle according to some aspects of the present disclosure.
  • FIG.2 illustrates an exemplary schematic diagram of a carrier board in the sensor pod system according to some aspects of the present disclosure.
  • FIG.3 is an exemplary schematic diagram of a sensor interface board in the sensor pod system according to some aspects of the present disclosure.
  • FIG.4 depicts an exemplary architecture for an error correction system that may be implemented on the central carrier board according to some aspects of the present disclosure.
  • FIG.5 shows an exemplary GPS sensor system architecture for the sensor pod system according to some aspects of the present disclosure.
  • FIG.6 is an exemplary circuit diagram illustrating a wiring configuration for an interface board according to some aspects of the present disclosure.
  • FIG.7 shows exemplary GPS data that may be processed by a sensor pod system according to some aspects of the present disclosure.
  • FIG.8 depicts an exemplary front face and back face of a carrier board in a sensor pod system according to some aspects of the present disclosure.
  • FIG.9 depicts exemplary interface boards of a sensor pod system according to some aspects of the present disclosure.
  • FIG.10 illustrates an exemplary flow of a process for correcting a temporal data stream of a sensor pod system for a vehicle according to some aspects of the present disclosure.
  • FIG.11 depicts an exemplary environment in which embodiments of the present disclosure can be performed.

DETAILED DESCRIPTION [0015]
  • In the following description, various embodiments will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the embodiments. However, it will also be apparent to one skilled in the art that the embodiments may be practiced without these specific details. Furthermore, well-known features may be omitted or simplified in order not to obscure the embodiment being described.
  • a sensor pod system for a vehicle (e.g., an unmanned aerial vehicle (UAV)) is described herein.
  • the error corrections can get used by a flight computer for the vehicle to control navigation of the vehicle.
  • the sensor pod system may have interface boards that each couple to a sensor, such as global positioning system (GPS) sensors.
  • the interface boards may couple to a carrier board in the sensor pod system.
  • the carrier board can receive a temporal data stream from the interface boards (e.g., from the sensors).
  • the carrier board can determine, based on the temporal data stream, error types associated with errors in the temporal data stream and can generate a corrected temporal data stream that corrects the errors.
  • the flight computer which can also be coupled to the carrier board, can receive the corrected temporal data stream, and can more accurately and precisely navigate the vehicle due to the error corrections.
  • UAVs may encounter flight-related errors while airborne, such as component failures.
  • a UAV may include one or more sensors that can receive data, such as pressure data, GPS position data, pitch data, yaw data, roll data, or any other suitable data that can be used to detect the flight-related errors during flight.
  • a sensor pod system can enable the UAV to detect the errors in real time, process the data, and take corrective actions based on the data.
  • the sensor pod system can include a central carrier board that is a separate board from interface boards that couple to sensors.
  • the interface boards may couple to the central carrier board to provide sensor data (e.g., in a temporal data stream) to the central carrier board.
  • the sensor pod system can be modular. That is, any number or type of sensors may be used by the sensor pod system via the separate interface boards. This can enable rapid switching of sensors and updating of UAVs when new or improved sensors become available. Sensors can get updated or replaced without restructuring hardware or software for the entire sensor pod system.
  • Accurate localization may be crucial for vehicle autonomy, especially for UAVs operating in a three-dimensional environment where complex aerial traffic coordination is necessary.
  • UAV navigation in densely populated or urban environments may be particularly challenging. Achieving the precision required for urban operations may be difficult due to issues such as multipath propagation and non-line-of-sight (NLOS) errors, which may significantly degrade GPS performance.
  • GPS-based localization can achieve lateral accuracies within one meter. But in dense urban areas with obstructions (e.g., buildings, trees, etc.), accuracy may drop sharply, often resulting in horizontal errors of up to hundreds of meters. Vertical errors may further compound this issue, typically exceeding horizontal inaccuracies by at least 50%, and sometimes surpassing 100 meters.
  • Embodiments described herein can involve the carrier board of the sensor pod system utilizing machine learning techniques to enhance the accuracy of low-cost, recreational-grade GPS receivers.
  • Such GPS receivers which may have lower power consumption compared to conventional aerospace-grade GPS receivers, may typically face significant performance limitations that may hinder the realization of fully autonomous flight capabilities. But by applying the machine learning techniques by the carrier board described herein, such shortcomings can be addressed by reducing GPS error.
  • GPS error can be reduced to under 2 meters in real-time on embedded microcontroller platforms during urban UAV operations, all at operational speeds exceeding 10 Hz.
  • the carrier board executing machine learning in an error correction system described herein can offer a robust, lightweight, and scalable solution that can align with the stringent demands of UAM-BVR operations.
  • GPS errors may typically arise primarily from satellite-receiver geometry and clock synchronization issues, which may be captured by a pseudorange (PSR) equation and addressed through iterative Newton’s methods. Traditional approaches may rely on Taylor approximations to linearize and minimize residuals in these equations.
  • the error correction systems described herein can capture both temporal dependencies and spatial error patterns to provide real-time error correction for a flight computer.
  • the error correction system can control (e.g., either directly, such as by generating navigation commands based on error-corrected sensor data, or by providing error-corrected sensor data to a flight computer for the UAV) operation, such as navigation, of the UAV using the error-corrected sensor data.
  • error-corrected GPS data can be used to determine navigation commands that cause the UAV to fly in a particular direction at a particular speed.
  • the error-corrected GPS data can be accurate enough to allow for UAV navigation in urban areas or other areas with poor GPS performance.
  • the error correction system described herein may achieve a residual error of 1.8 meters at 10 Hz for GPS data in urban environments, which is well below a typical 3-meter error threshold.
  • embodiments are described herein in connection with UAVs. However, the embodiments are not limited as such. Instead, the embodiments described herein may similarly apply to other vehicles, including manned aircraft, passenger cars, semi-trucks, waterborne vessels, and any other type of vehicle that includes some aspect of computer control and/or electronic signal sensing.
  • FIG.1 is an exemplary schematic diagram of a sensor pod system 100 of a vehicle according to some aspects of the present disclosure.
  • the sensor pod system 100 can include one or more interface boards 102 that may each interface with one or more sensors 104.
  • the interface boards 102 and the sensors 104 are depicted and described in further detail below with respect to FIG.3.
  • the sensor pod system 100 can also include a central carrier board 106.
  • a flight computer 108 (e.g., one that controls flight of the vehicle housing the sensor pod system 100) can also be included.
  • the interface boards 102 can each be coupled to the central carrier board 106.
  • the central carrier board 106 is depicted and described in further detail below with respect to FIG.2.
  • the sensors 104 can transmit temporal sensor data to the interface boards 102.
  • the interface boards 102 can send the temporal data to the central carrier board 106.
  • the central carrier board 106 can execute one or more computing processes, such as executing machine learning models, which can correct temporal data received from the sensors 104 via the interface board 102.
  • the corrected temporal data can be transmitted to a data stream 110 of a flight computer 108 that controls navigation of the vehicle.
  • the corrected temporal data can, for example, be used by the flight computer 108 to generate output signals that can be sent to a separate flight computer or directly to a process node or to a carrier board for additional processing.
  • FIG.2 illustrates an exemplary schematic diagram of a central carrier board 106 in the sensor pod system according to some aspects of the present disclosure.
  • the central carrier board 106 may be expansible and can be connected to one or more sensor interface boards, such as the interface boards 102 of FIG.1.
  • the central carrier board 106 can include a core processor 202 that can execute one or more computing processes.
  • the core processor 202 may execute a first machine learning model 204a, which in some examples can be a multi-modal ellipsoidal correction algorithm (MEECA).
  • the first machine learning model 204a may generate corrected temporal data.
  • the first machine learning model 204a may implement elliptical fitting and can handle multimodal sensory inputs (e.g., from multiple sensors connected to interface boards that are coupled to the central carrier board 106).
  • the first machine learning model 204a may be capable of capturing and modeling complex relationships within dynamic, evolving environments.
  • the first machine learning model 204a may be used to train a second machine learning model 204b.
  • the second machine learning model 204b may be a long short-term memory (LSTM) recurrent neural network (RNN) algorithm.
  • the second machine learning model 204b may further correct the temporal data and in some examples may generate state estimation vectors.
  • the output from the second machine learning model 204b (and/or the first machine learning model 204a), such as the corrected temporal stream and the state estimation vectors, can be transmitted as a data stream 110 to a flight computer, such as the flight computer 108 of FIG.1.
  • the flight computer can use the corrected temporal stream and/or the state estimation vectors to control a vehicle that includes the sensor pod system (e.g., a UAV).
  • the central carrier board 106 can include a serial communications module 206, such as a controller area network (CAN) module with traffic management.
  • the central carrier board 106 can include a nodal architecture in which traffic can have priority management. That is, the central carrier board 106 can dynamically choose which messages to receive and process first.
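The priority-managed message handling described above can be sketched in a few lines. This is a hypothetical illustration (the class name, CAN identifiers, and payloads are invented, not from the disclosure), assuming, as in CAN bus arbitration, that a lower numeric identifier means higher priority and is processed first.

```python
import heapq

class PriorityMessageQueue:
    """Delivers pending messages in priority order rather than arrival order."""

    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker preserving arrival order within a priority

    def push(self, can_id, payload):
        # heapq orders tuples lexicographically, so lower can_id pops first
        heapq.heappush(self._heap, (can_id, self._seq, payload))
        self._seq += 1

    def pop(self):
        can_id, _, payload = heapq.heappop(self._heap)
        return can_id, payload

q = PriorityMessageQueue()
q.push(0x200, "baro frame")
q.push(0x100, "gps frame")    # lower CAN id -> higher priority
q.push(0x300, "imu frame")
order = [q.pop()[1] for _ in range(3)]
# the GPS frame is dequeued first despite arriving second
```

A real implementation would also bound the queue and handle bus errors; the heap simply makes the "choose which messages to process first" behavior concrete.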
  • the central carrier board 106 can include both analog and digital hardware that can perform analog and digital processes on the same board.
  • the central carrier board can include current digital-to-analog converters (DACs) 212, voltage DACs 214, current analog-to-digital converters (ADCs) 216, and voltage ADCs 218.
  • the central carrier board 106 can include a control output bus 208 and a feedback input bus 210, each of which may be expansible.
  • the core processor 202 can receive sensor parameters 220a-n from interface boards (e.g., sensors coupled to the interface boards) via the feedback input bus 210.
  • the serial communications module 206 can control delivery of the sensor parameters 220a-n to the core processor 202.
  • the serial communications module 206 may also control output of commands from the core processor 202 to the interface boards via the control output bus 208. In this way, the core processor 202 can interface with the control output bus 208 and the feedback input bus 210. [0029] In some examples, the core processor 202 may perform one or more cleaning operations to clean the sensor parameters 220a-n. For example, signals received via the feedback input bus 210 may be processed with a low-pass filter and/or a high-pass filter to remove certain portions of the signal. In some examples, the core processor 202 may resample the signal in a spatial domain.
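The cleaning steps just described can be illustrated with a minimal sketch: a moving-average low-pass filter followed by linear resampling onto a uniform spatial grid. The window size and grid values are assumed for illustration and are not taken from the disclosure.

```python
def low_pass(samples, window=3):
    """Moving-average low-pass filter; attenuates high-frequency noise."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

def resample(positions, values, grid):
    """Linearly interpolate (position, value) samples onto a spatial grid."""
    out = []
    for g in grid:
        # clamp at the ends, otherwise interpolate between bracketing samples
        if g <= positions[0]:
            out.append(values[0]); continue
        if g >= positions[-1]:
            out.append(values[-1]); continue
        for j in range(1, len(positions)):
            if positions[j] >= g:
                t = (g - positions[j - 1]) / (positions[j] - positions[j - 1])
                out.append(values[j - 1] + t * (values[j] - values[j - 1]))
                break
    return out

smooth = low_pass([1.0, 5.0, 1.0, 5.0, 1.0])          # oscillation is damped
uniform = resample([0.0, 1.0, 4.0], [0.0, 2.0, 8.0],  # irregular samples
                   [0.0, 2.0, 4.0])                    # onto a uniform grid
```

An embedded implementation would more likely use a fixed-point IIR filter, but the smoothing-then-regridding flow is the same.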
  • the core processor 202 can use the first machine learning model 204a and/or the second machine learning model 204b to fit an ellipsoid to a parameter 220 to analyze, separate, and measure the different errors of interest that may be associated with the parameter 220.
  • the errors may be observed in the spatial domain but can have a direct translation to the temporal domain (or vice versa).
  • the core processor 202 can receive the errors and process the errors as well as temporal data therewith via the first machine learning model 204a to generate a corrected temporal stream of data for the sensor parameters 220a-n.
  • the corrected temporal stream of data and/or the error information can be used by the second machine learning model 204b to generate state estimation vectors (e.g., using a recurrent neural network (RNN)).
  • the second machine learning model 204b may also generate missing data from failed sensors using a generative neural network (GNN).
  • The output from the machine learning models 204a-b can be output as a data stream 110 for use by a flight computer (e.g., the flight computer 108 of FIG.1).
  • the corrected temporal stream of data may be a multimodal stream of data including data from multiple sensors. The elliptical fitting of the temporal stream of data performed by the first machine learning model 204a can identify relationships between data from different sensors.
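As a greatly simplified sketch of the elliptical-fitting idea, the snippet below fits an axis-aligned error ellipse to paired sensor readings from per-axis means and standard deviations. The disclosed MEECA model is far more general (full ellipsoids over multimodal inputs); this only illustrates how an ellipse can summarize the joint spread of two sensor channels. Function names and the 2-sigma scaling are assumptions.

```python
import math

def fit_axis_aligned_ellipse(xs, ys, n_sigma=2.0):
    """Return (center_x, center_y, semi_axis_x, semi_axis_y)."""
    n = len(xs)
    cx, cy = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - cx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - cy) ** 2 for y in ys) / n)
    return cx, cy, n_sigma * sx, n_sigma * sy

def inside(ellipse, x, y):
    """True if the point lies within the fitted ellipse."""
    cx, cy, ax, ay = ellipse
    return (x - cx) ** 2 / ax ** 2 + (y - cy) ** 2 / ay ** 2 <= 1.0

e = fit_axis_aligned_ellipse([1.0, 2.0, 3.0], [10.0, 12.0, 14.0])
# points near the center (2.0, 12.0) fall inside; distant points do not
```

A point far outside the ellipse can then be flagged as an outlier or attributed to a particular error source, which is the separation role the fitting plays in the pipeline above.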
  • FIG.3 is an exemplary schematic diagram of one or more sensor interface boards 102 in the sensor pod system according to some aspects of the present disclosure.
  • the interface boards 102 can include a microcontroller 302 and one or more sensors 104 that may be communicatively coupled to the interface boards 102.
  • the microcontroller 302 and the sensor 104 may be housed on the same board. Additionally or alternatively, the microcontroller 302 may be coupled to the sensor 104 via an electrical interface, such as a screw terminal, a UF1 connector, or any other suitable electrical interface that can electrically connect the sensor 104 to the microcontroller 302.
  • the interface board 102 may perform certain processing steps on sensor data received from the sensors 104.
  • the processing steps may include digital and/or analog processing steps and may be performed on the same board.
  • the processing steps can include power management, preconditioning, filtering, signal sampling techniques, and any other suitable processing steps.
  • the interface board 102 may in some examples perform additional spatial resampling operations in addition to spatial resampling operations that may be performed on a main board to which the interface board 102 may be communicatively coupled.
  • the microcontroller 302 can be coupled to a protection circuit 306 that may protect the microcontroller 302 from power surges that may be associated with the sensors 104.
  • the microcontroller 302 may also be coupled to a monitoring circuit 304 that may house FPGAs or other processing devices that can be dynamically activated.
  • the monitoring circuit 304 may generate and transmit correction commands to components on the interface board 102 or any other systems or boards that may be coupled to the interface board 102.
  • the commands may be transmitted to a central carrier board (e.g., the central carrier board 106 of FIGS.1-2) to cause certain corrective actions to be taken, such as to correct errors in a temporal data stream of sensor data from the sensors 104.
  • the temporal data stream and/or the commands can be transmitted to the central carrier board via the feedback output bus 312 (e.g., to be received by the feedback input bus 210 of FIG.2).
  • the microcontroller 302 may receive commands to perform certain actions via the control input bus 310 (e.g., from the control output bus 208 of the central carrier board 106 of FIG.2).
  • the data and/or commands transmitted or received by the interface board 102 via the control input bus 310 and/or the feedback output bus 312 can involve encryption.
  • the microcontroller 302 can encrypt data or commands and can decrypt encrypted communication received from the central carrier board (e.g., using standard data protocols).
  • the interface boards 102 can receive pins in a reversible configuration.
  • the interface boards 102 may include a 64-pin configuration.
  • FIG.4 depicts an exemplary architecture for an error correction system 403 that may be implemented on the central carrier board 106 according to some aspects of the present disclosure.
  • the central carrier board 106 can be coupled to a GPS receiver 402 (e.g., via the interface board 102 of FIGS.1 and 3).
  • the GPS receiver 402 may receive a GPS signal 404 from a GPS satellite 406.
  • the GPS signal 404 can in some examples be a L1/C GPS signal.
  • the central carrier board 106 may include a microcontroller 408, which can in some examples be the core processor 202 of FIG.2.
  • the microcontroller 408 may include a GPS parser 410 that can parse the GPS signal 404 to generate raw GPS data 412.
  • the GPS parser 410 may perform NMEA processing to parse and decode NMEA sentences from the GPS signal to transform the GPS signal 404 into physical values of raw GPS data 412.
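The NMEA processing step can be illustrated with a minimal decoder for a single GGA sentence (the standard fix sentence, whose format is public). This is a sketch only: a real parser would verify the checksum and handle the full NMEA sentence set; the function name and returned keys are invented.

```python
def parse_gga(sentence):
    """Decode a $GPGGA sentence into latitude, longitude (deg), altitude (m)."""
    body = sentence.split("*")[0].lstrip("$")   # drop checksum and leading $
    fields = body.split(",")
    assert fields[0].endswith("GGA")

    def to_degrees(value, hemisphere):
        # NMEA encodes angles as ddmm.mmmm (degrees then decimal minutes)
        dot = value.index(".")
        degrees = float(value[: dot - 2])
        minutes = float(value[dot - 2 :])
        deg = degrees + minutes / 60.0
        return -deg if hemisphere in ("S", "W") else deg

    return {
        "lat": to_degrees(fields[2], fields[3]),
        "lon": to_degrees(fields[4], fields[5]),
        "alt_m": float(fields[9]),
    }

fix = parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
# 4807.038 decodes as 48 deg + 7.038 min; 01131.000 as 11 deg + 31.000 min
```

The resulting physical values are what the disclosure calls raw GPS data 412, ready to be timestamped into a temporal signal.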
  • the GPS signal 404 and/or the raw GPS data 412 may be stored by the microcontroller 408 in local storage 414.
  • the raw GPS data 412 can be plotted over time to create a temporal signal, which can be time series data corresponding to a measured sensor output or estimated physical variable.
  • the raw GPS data 412 can be concatenated with historical GPS data 416 and provided as input to the error correction system 403.
  • the error correction system 403 can, in some examples, include an environment detection model 418 that can be an environment classifier.
  • the environment detection model 418 may classify the operational environment type, such as being an open field, semi-urban, urban, etc. The classification can inform the error correction system 403 which environment-specific model to apply for optimal sensor corrections (e.g., GPS signal corrections), which can increase accuracy and reliability in different conditions. By dynamically loading environment-specific models, the environment detection model 418 can reduce computational demands, making the operations of the central carrier board 106 suitable for real-time applications on low-power devices. [0034]
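The dynamic-loading behavior can be sketched as a small dispatcher: the classifier's label selects which correction model is resident, and a model is only (re)loaded when the label changes. Everything here is hypothetical (labels, the stand-in lambdas, and class name are invented to show the control flow, not the disclosed models).

```python
CORRECTION_MODELS = {
    "open_field": lambda err: err * 0.5,   # stand-ins for the per-environment
    "semi_urban": lambda err: err * 0.3,   # correction models; real ones would
    "urban":      lambda err: err * 0.1,   # be trained networks loaded from storage
}

class ModelDispatcher:
    def __init__(self):
        self.loaded_label = None
        self.model = None

    def correct(self, label, raw_error):
        if label != self.loaded_label:       # dynamic (re)load only on change
            self.model = CORRECTION_MODELS[label]
            self.loaded_label = label
        return self.model(raw_error)

d = ModelDispatcher()
residual = d.correct("urban", 20.0)   # only the urban model is resident now
```

Keeping a single environment-specific model resident at a time is what lets a low-power device avoid running one large model covering all conditions.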
  • the environment detection model 418 can analyze a sliding window of sensor data (e.g., raw GPS data 412), called a data window, by comparing the current raw GPS data 412 to historical patterns (e.g., historical GPS data 416) to ascertain sensor and measurement reliability.
  • the environment detection model 418 can be or can include a deep neural network (DNN) to be used as a classifier.
  • Each instance of measurement in the raw GPS data 412 can be structured as a data frame. Multiple data frames can be combined to form a data window, whose length can be controlled by a numerical window-length parameter N. The data window can then be a 3D matrix in which data frames F1, F2, ..., FN are stacked from newest to oldest, where F1 is the latest data frame, FN is the oldest data frame, and N is the length of the window. Each data frame can be compared by the environment detection model 418 against a threshold vector for each column derived from historical observations (e.g., the historical GPS data 416), which can be application specific.
  • the environment detection model 418 can evaluate each GPS sensor stream individually and can first assign a reliability label to each sensor.
  • a reliability label can be set for each data frame F1, F2, ..., FN.
  • These thresholds can be specific to each parameter and can be used to evaluate the reliability of the data frames.
  • Each column of a data frame can be compared by the environment detection model 418 against the corresponding threshold in the threshold vector.
  • a data frame can be considered reliable if most of its columns are below the corresponding thresholds.
  • a tunable threshold parameter can be set at 0.5.
  • a data window may be deemed reliable if most of its frames are reliable. If a window is deemed unreliable, the latest data frame can be discarded, and the next window can be re-evaluated. If consecutive data windows are unreliable, the process may be restarted. [0035] If the data window is reliable, it can be used by the environment detection model 418 to generate a classification output. By integrating data from all sensors coupled to the central carrier board 106 (e.g., via interface boards 102), the environment detection model 418 can classify the environment, preventing a single poorly performing sensor from misleading the error correction system 403. The environment detection model 418 can determine and filter which GPS data to pass to a machine learning model 428 in the error correction system 403.
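The frame- and window-level reliability tests described above can be sketched directly, under assumed notation: a frame is reliable when the fraction of its columns below the per-column thresholds exceeds the tunable parameter (0.5 here), and a window is reliable when most of its frames are. The threshold values and data are illustrative.

```python
def frame_reliable(frame, thresholds, alpha=0.5):
    """A frame passes if most of its columns are below their thresholds."""
    ok = sum(1 for value, t in zip(frame, thresholds) if value < t)
    return ok / len(frame) > alpha

def window_reliable(window, thresholds, alpha=0.5):
    """A window passes if most of its frames are reliable."""
    ok = sum(1 for f in window if frame_reliable(f, thresholds, alpha))
    return ok / len(window) > alpha

thresholds = [1.0, 1.0, 1.0, 1.0]
good = [0.2, 0.3, 0.1, 2.0]   # 3 of 4 columns under threshold -> reliable
bad  = [2.0, 3.0, 0.1, 2.0]   # 1 of 4 under threshold -> unreliable
window = [good, good, bad]    # 2 of 3 frames reliable -> window reliable
```

When the window fails this test, the oldest-first stacking makes the "discard the latest frame and re-evaluate" step a cheap slide of the window.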
  • the error correction system 403 can also include an inverse variance weighted filter 420 that can prioritize accurate data by dynamically weighting each GPS measurement based on its variance. This statistical approach can reduce the influence of noisy data, allowing the error correction system 403 to focus on more reliable inputs. Unlike traditional filtering techniques, such as low-pass filters (LPFs) or Kalman filters (KFs), which may involve system-specific tuning and external data, the inverse variance weighted filter 420 can operate without pre-configuration, making it more adaptable to urban environments.
  • the inverse variance weighted filter 420 can employ a dynamically sized window of values characterized by a tunable window size.
  • the variance of sensor measurements for each parameter from a sensor can be analyzed.
  • the sensor measurements at time t for all sensors can be represented as a matrix, where the parameters of an individual sensor (e.g., GPS receiver 402) at time t are denoted as a column vector of measurements across all parameters.
  • Corrected sensor measurements (e.g., corrected GPS data 426) can be obtained by the inverse variance weighted filter 420 by adding a correction matrix to the measurement matrix. Then, for each parameter within each sensor, the inverse variance weighted filter 420 can calculate the variance over the window.
  • the weight for each parameter in each sensor can be inversely proportional to the variance, w = k / (σ² + ε), where k is some parameter-specific tunable gain parameter, initially set to 1, and ε is a small constant added to prevent division by zero.
  • the deviation of the current measurement from its mean, scaled by this weight, can give a correction element in the correction matrix (e.g., applied corrections 424) that can be applied to the latest sensor data (e.g., corrected GPS data 426).
  • the inverse variance weighted filter 420 can be scalable and adaptable to varying sensor counts and parameters. The inverse variance weighted filter 420 can detect and correct outliers early, improving data quality for a machine learning model 428 in the error correction system 403.
  • the inverse variance weighted filter 420 can ensure verifiable inputs, enhance system reliability, and aid compliance with aerospace certification standards.
  • the traceable data treatment performed by the inverse variance weighted filter 420 can boost transparency, which may be critical for regulatory adherence before applying further machine learning techniques (e.g., by the machine learning model 428).
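A minimal single-channel sketch of the inverse-variance weighting, under assumed notation: the weight is k / (variance + ε), and the correction pulls the latest measurement toward its window mean in proportion to that weight. The clamping of the blend factor to 1 is an added assumption (so a zero-variance channel is simply snapped to its mean rather than over-corrected), not something stated in the disclosure.

```python
def ivw_correct(window, k=1.0, eps=1e-9):
    """window: recent scalar measurements of one parameter, newest last.
    Returns the corrected latest measurement."""
    n = len(window)
    mean = sum(window) / n
    var = sum((v - mean) ** 2 for v in window) / n
    weight = k / (var + eps)          # inverse-variance weight
    blend = min(1.0, weight)          # clamp: assumption, keeps correction bounded
    latest = window[-1]
    # high variance -> low weight -> the noisy channel is barely corrected;
    # low variance -> high weight -> the latest value moves to the stable mean
    return latest + blend * (mean - latest)

steady = ivw_correct([5.0, 5.0, 5.0, 5.0])    # near-zero variance -> snaps to mean
noisy = ivw_correct([0.0, 10.0, 0.0, 10.0])   # high variance -> small change
```

Unlike a Kalman filter, nothing here needs a process model or tuning beyond k and ε, which is the adaptability the passage above emphasizes.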
  • the machine learning model 428 can include or be an example of the first machine learning model 204a and/or the second machine learning model 204b of FIG. 2.
  • the machine learning model 428 can be one of a set of machine learning models that was selected by the environment detection model 418 based on a classification of the environment (e.g., as detected from the raw GPS data 412).
  • the machine learning model 428 can in some examples include a recurrent neural network (RNN) and a deep neural network (DNN).
  • the machine learning model 428 can process inputs within a structured three-dimensional matrix, which can encapsulate multi-sensor time-series data (e.g., including the corrected GPS data 426 from the GPS receiver 402 and any other sensor data from sensors coupled to the central carrier board 106).
  • the matrix can be defined by three dimensions: the number of sensors (e.g., GPS sensors or other types of sensors) coupled to the central carrier board 106, the number of monitored parameters per sensor, and a dynamically tunable window size corresponding to the number of sequential data points considered for each parameter.
  • a starting window size of 25 can be predicated on empirical observations of data indicating that a 2.5 second window can typically include sufficient data to generate counteracting error signals in an urban environment. This can allow the machine learning model 428 to capture both immediate and delayed error effects while reducing memory footprint.
  • the input to the machine learning model 428, normalized based on individual predictor characteristics, can be flattened to a 2D matrix and fed to an LSTM of the machine learning model 428.
  • the LSTM can predict altitude, angular, and offset errors, identified as the temporal GPS error at the instant of measurement.
  • the LSTM which can be a specialized form of RNN, can capture the temporal dynamics of the GPS errors.
  • the machine learning model 428 can include a DNN model that can composite the error output from the LSTM and handle non-sequential errors. Subsequently, the LSTM predictions can be processed through the DNN model, which can identify the instantaneous aperiodic and DC errors in the GPS data using a set of parameters that can include the associated weights for the input, connected, and output layers of the model, along with any associated bias values as learned by the model during the training stage.
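The two-stage data flow (3D window matrix, flattened 2D input, LSTM temporal estimate, DNN composite estimate) can be sketched structurally. The LSTM and DNN are replaced here by trivial stand-in functions; only the shapes and staging follow the description, and all names are invented.

```python
def flatten_window(matrix3d):
    """(sensors x params x window) -> (sensors*params) rows of length window."""
    return [series for sensor in matrix3d for series in sensor]

def lstm_stage(rows):
    # stand-in for the LSTM: one temporal error estimate per flattened row
    return [sum(r) / len(r) for r in rows]

def dnn_stage(temporal_errors):
    # stand-in for the DNN: composites the per-row estimates into one
    # instantaneous error value
    return sum(temporal_errors) / len(temporal_errors)

# 2 sensors, 2 monitored parameters each, window of 3 sequential samples
window3d = [
    [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]],
    [[0.0, 0.0, 3.0], [2.0, 2.0, 2.0]],
]
rows = flatten_window(window3d)                 # 4 rows of length 3
estimated_error = dnn_stage(lstm_stage(rows))   # single composite estimate
```

Subtracting such an estimate from the raw measurement is what yields the corrected temporal stream handed to the flight computer.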
  • the output of the machine learning model 428 can be an estimated error 430, which can include a corrected temporal stream of the GPS signal 404 (or any other sensor data received from sensors coupled to the central carrier board 106).
  • the estimated error 430 can be output to a flight computer (e.g., the flight computer 108 of FIG. 1), which can use the estimated error 430 and the corrected temporal stream to accurately navigate a vehicle that includes the central carrier board 106.
  • FIG.5 shows an exemplary GPS system architecture 500 for the sensor pod system according to some aspects of the present disclosure.
  • the GPS system architecture 500 can include a first microcontroller 302a (e.g., of a first interface board) coupled to multiple (e.g., three) GPS sensors 502.
  • a second microcontroller 302b (e.g., of a second interface board) can be coupled to multiple (e.g., three) GPS sensors 502.
  • the microcontrollers 302a-b can interface with a microprocessor 504 of a central carrier board.
  • the microprocessor 504 can be an example of the core processor 202 of FIG.2.
  • the microprocessor 504 may process simultaneous readings from some or all of the six GPS sensors 502. Examples of the microcontrollers 302a-b can include the ATmega2560 microchip. Examples of the GPS sensors 502 can include the MTK3339 PA6H GPS module.
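The disclosure does not specify how the microprocessor 504 combines simultaneous readings from the six GPS sensors 502; one simple, robust possibility, shown here purely as an illustrative sketch, is a per-axis median, which tolerates a single receiver reporting a multipath outlier where a plain mean would not.

```python
import numpy as np

def fuse_fixes(fixes: np.ndarray) -> np.ndarray:
    """Combine simultaneous (lat, lon, alt) fixes with a per-axis median.

    `fixes` has shape (N, 3), one row per GPS receiver. The median is robust
    to one receiver producing a multipath outlier.
    """
    fixes = np.asarray(fixes, dtype=float)
    return np.median(fixes, axis=0)
```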
  • FIG.6 is an exemplary circuit diagram 600 illustrating a wiring configuration for an interface board 102 according to some aspects of the present disclosure.
  • the wiring configuration shown in the circuit diagram 600 can include five sensors 602 connected to a microcontroller 302 of the interface board 102, but in other examples any number of sensors 602 may be included and connected.
  • the microcontroller 302 can be a Portenta H7 microprocessor.
  • the microcontroller 302 can be an Arduino MEGA 2560 microprocessor.
  • the circuit diagram 600 can be powered by a battery power supply 604, which in some examples can include three 9V batteries, four 9V batteries, or any number or voltage of batteries that are placed in parallel.
  • the battery power supply 604 may in some examples include rechargeable batteries.
  • the circuit diagram 600 can additionally include a ground 606 connected to the battery power supply 604.
  • the circuit diagram 600 can, in some examples, include an SD card 608 or other form of non-volatile memory for storage of sensor data.
  • some or all of the sensors 602 can be GPS sensors that can include internal passive antennas for collecting GPS sensor data.
  • the sensors 602 can be wireless GPS receivers.
  • the circuit diagram 600 may additionally include an antenna system to communicate information (e.g., sensor data from the sensors 602) over Wi-Fi, Bluetooth, or any other suitable network.
  • FIG.7 shows exemplary GPS data 700 that may be processed by a sensor pod system according to some aspects of the present disclosure.
  • Unprocessed sensor values may have significant errors, particularly in urban environments.
  • the sensor values reported by GPS sensors may vary significantly from the actual position 702.
  • error-corrected values 704 can be determined that are significantly closer to the actual position 702 compared to the original, unprocessed sensor values reported by GPS sensors on the sensor pod system.
  • FIG.8 depicts an exemplary front face 802 and back face 804 of a carrier board in a sensor pod system according to some aspects of the present disclosure, such as the central carrier board 106 of FIGS.1, 2, and 4.
  • the front face 802 can include an adaptor 806 that can couple to a flight computer, such as the flight computer 108 of FIG.1. Any flight computer 108 may be coupled to the carrier board with a suitable adaptor.
  • the front face 802 of the carrier board can include slots 808 for interface boards (e.g., the interface boards 102 of FIGS.1 and 3).
  • the carrier board may support any number or type of sensors via the interface boards coupled via the slots 808.
  • the back face 804 of the carrier board can include power sources 810 that can power components of the carrier board. In some examples, five power sources 810 can provide up to 80 watts of power to the carrier board. In another example, eight power sources 810 can provide up to 110 watts of power to the carrier board.
  • FIG.9 depicts exemplary interface boards 902a-b of a sensor pod system according to some aspects of the present disclosure.
  • Interface boards 902a-b can be examples of the interface boards 102 of FIGS.1, 3, and 6.
  • the interface boards 902a-b can include buses 904a-b that can be examples of the control input bus 310 and/or the feedback output bus 312 of FIG.3. In some examples, the buses 904a-b may also provide power to the interface boards 902a-b.
  • the buses 904a-b can couple to a central carrier board 106, such as to the slots 808 of FIG.8.
  • Sensors 906a-b can couple to the interface boards 902a-b.
  • the sensors 906a-b may be directly mounted to the interface boards 902a-b themselves, or may have a wired connection to the interface boards 902a-b.
  • the interface boards 902a-b can each include a microcontroller 908a-b that can control the flow of data (e.g., from the sensors 906a-b) to a carrier board (e.g., via the buses 904a-b).
  • FIG.10 illustrates an exemplary flow of a process 1000 for correcting a temporal data stream of a sensor pod system for a vehicle according to some aspects of the present disclosure.
  • the process 1000 of FIG.10, and any other processes described herein are illustrated as logical flow diagrams, the operation of which represents a sequence of operations that can be implemented in hardware, computer instructions, or a combination thereof.
  • the operations may represent computer-executable instructions stored on one or more non-transitory computer-readable storage media that, when executed by one or more processors, perform the recited operations.
  • computer-executable instructions include routines, programs, objects, components, data structures and the like that perform particular functions or implement particular data types.
  • Block 1002 of the process 1000 involves a computer system receiving, from one or more interface boards in a sensor pod system, a temporal data stream.
  • the computer system can in some examples be housed on a central carrier board of the sensor pod system.
  • the one or more interface boards can be separate from the central carrier board and coupled to the central carrier board.
  • Sensor data from one or more sensors coupled to the one or more interface boards can be received by the interface boards.
  • the one or more interface boards can process the sensor data to define sensor parameters. These sensor parameters can be provided to a feedback bus that can provide the sensor parameters as a temporal data stream to the central carrier board.
  • the one or more interface boards can additionally include a control input bus that can receive inputs from the carrier board and a microcontroller device communicatively coupled to the control input bus, the feedback bus, and the one or more sensors.
  • the microcontroller device can receive and process the sensor data and can provide the sensor parameters to the feedback bus for delivery to the carrier board.
  • the one or more sensors and the microcontroller device can be housed on a single board.
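The feedback-bus protocol between an interface board and the carrier board is not specified in the disclosure. The sketch below shows one hypothetical fixed-size frame layout (a sensor id, a timestamp, and three sensor parameters) that a microcontroller could use to serialize sensor parameters for delivery over such a bus; the format string and field choices are assumptions for illustration only.

```python
import struct

# Hypothetical frame layout: 1-byte sensor id, float64 timestamp,
# three float32 sensor parameters, little-endian, no padding.
FRAME_FMT = "<Bd3f"
FRAME_SIZE = struct.calcsize(FRAME_FMT)  # 1 + 8 + 12 = 21 bytes

def pack_frame(sensor_id: int, timestamp: float, params: tuple) -> bytes:
    """Serialize one sensor-parameter reading into a feedback-bus frame."""
    return struct.pack(FRAME_FMT, sensor_id, timestamp, *params)

def unpack_frame(frame: bytes):
    """Recover (sensor_id, timestamp, params) from a received frame."""
    sensor_id, timestamp, *params = struct.unpack(FRAME_FMT, frame)
    return sensor_id, timestamp, tuple(params)
```

A fixed-size frame like this lets the carrier-board side deserialize the stream with no per-frame length negotiation.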
  • Block 1004 of the process 1000 involves the computer system determining, based on the temporal data stream, one or more error types associated with one or more errors associated with the temporal data stream.
  • the one or more error types can be determined via an error correction system.
  • the error correction system may also generate the corrected temporal data stream based on the identified error types.
  • the error correction system can be used to train an onboard machine learning model based on the corrected temporal data.
  • the onboard machine learning model can in some examples be a recurrent neural network.
  • the trained onboard machine learning model can generate one or more state estimation vectors (e.g., orientation, velocity, or other state information) of the sensor pod system based on the corrected temporal data stream.
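As an illustration of how a state estimation vector such as velocity could be derived from a corrected temporal data stream, the sketch below uses finite differences over a uniformly sampled position stream. The disclosure leaves the estimator unspecified; the sampling rate and coordinate frame here are assumptions.

```python
import numpy as np

def estimate_velocity(positions: np.ndarray, rate_hz: float = 10.0) -> np.ndarray:
    """Finite-difference velocity from a corrected position stream.

    `positions` has shape (T, 3) in meters (e.g., a local ENU frame),
    sampled uniformly at `rate_hz`. Returns a (T, 3) velocity in m/s,
    using central differences in the interior of the stream.
    """
    dt = 1.0 / rate_hz
    return np.gradient(positions, dt, axis=0)
```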
  • Block 1006 of the process 1000 involves the computer system generating, based on the temporal data stream and the one or more error types, a corrected temporal data stream that corrects the one or more errors.
  • the onboard machine learning model may be a generative neural network (GNN).
  • the onboard machine learning model may determine, based on the one or more error types, missing data from a sensor failure in the temporal data stream.
  • the onboard machine learning model can be used to generate, based on the temporal data stream and the one or more error types, the corrected temporal data stream that includes the missing data.
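The generative model that reconstructs missing data is not detailed in the disclosure. As a stand-in illustration of the gap-filling step, the sketch below fills a sensor-dropout gap in a one-dimensional stream by linear interpolation; a trained generative model would replace the interpolation with learned reconstruction.

```python
import numpy as np

def fill_dropout(stream) -> np.ndarray:
    """Replace NaN gaps (sensor dropout) in a 1D stream by linear interpolation."""
    stream = np.asarray(stream, dtype=float).copy()
    t = np.arange(stream.size)
    good = ~np.isnan(stream)
    # Interpolate the missing time indices from the surviving samples.
    stream[~good] = np.interp(t[~good], t[good], stream[good])
    return stream
```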
  • the corrected temporal data stream can be transmitted to a flight computer.
  • the flight computer can use the corrected temporal data stream to control a vehicle that includes the sensor pod system.
  • the vehicle can, in some examples, be a UAV.
  • the flight computer may also receive the one or more state estimation vectors and may control the vehicle based on the one or more state estimation vectors.
  • FIG.11 depicts example components of a computer system 1100 in which embodiments of the present disclosure can be performed.
  • the computer system 1100 can be used as a node in a computer network, where this node provides one or more computing components of an underlay network of the computer network and/or one or more computing components of an overlay network of the computer network. Additionally or alternatively, the components of the computer system 1100 can be used in an endpoint. Although the components of the computer system 1100 are illustrated as belonging to a same system, the computer system 1100 can also be distributed (e.g., between multiple user devices).
  • the computer system 1100 can be an example of components of the sensor pod system 100 (e.g., flight computer 108, central carrier board 106, interface board 102, sensors 104), core processor 202, microcontroller 302, microcontroller 408, microprocessor 504, microcontrollers 908a-b, or computer system 1100.
  • the computer system 1100 can include at least a processor 1102, a memory 1104, a storage device 1106, input/output peripherals (I/O) 1108, communication peripherals 1110, and an interface bus 1112.
  • the interface bus 1112 is configured to communicate, transmit, and transfer data, controls, and commands among the various components of the computer system 1100.
  • the memory 1104 and the storage device 1106 include computer-readable storage media, such as RAM, ROM, electrically erasable programmable read-only memory (EEPROM), hard drives, CD-ROMs, optical storage devices, magnetic storage devices, and electronic non-volatile computer storage (e.g., Flash® memory) and other tangible storage media. Any of such computer-readable storage media can be configured to store instructions or program codes embodying aspects of the disclosure.
  • the memory 1104 and the storage device 1106 also include computer-readable signal media.
  • a computer-readable signal medium includes a propagated data signal with computer-readable program code embodied therein. Such a propagated signal takes any of a variety of forms including, but not limited to, electromagnetic, optical, or any combination thereof.
  • a computer-readable signal medium includes any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use in connection with the computer system 1100.
  • the memory 1104 includes an operating system, programs, and applications.
  • the processor 1102 is configured to execute the stored instructions and includes, for example, a logical processing unit, a microprocessor, a digital signal processor, and other processors.
  • the memory 1104 and/or the processor 1102 can be virtualized and can be hosted within another computer system of, for example, a cloud network or a data center.
  • the I/O peripherals 1108 include user interfaces, such as a keyboard, screen (e.g., a touch screen), microphone, speaker, other input/output devices, and computing components, such as graphical processing units, serial ports, parallel ports, universal serial buses, and other input/output peripherals.
  • the I/O peripherals 1108 are connected to the processor 1102 through any of the ports coupled to the interface bus 1112.
  • the communication peripherals 1110 are configured to facilitate communication between the computer system 1100 and other systems over a communications network and include, for example, a network interface controller, modem, wireless and wired interface cards, antenna, and other communication peripherals.
  • the computer system 1100 can include a variety of data stores and other memory and storage media as discussed above.
  • These can include a storage medium local to (and/or resident in) one or more of the computers, or remote from any or all of the computers across the network.
  • the information may reside in a storage-area network (“SAN”) familiar to those skilled in the art.
  • any necessary files for performing the functions attributed to the computers, servers, or other network devices may be stored locally and/or remotely, as appropriate.
  • each such device can include hardware elements that may be electrically coupled via a bus, the elements including, for example, at least one central processing unit (“CPU”), at least one input device (e.g., a mouse, keyboard, controller, touch screen, or keypad), and at least one output device (e.g., a display device, printer, or speaker).
  • Such a system may also include one or more storage devices, such as disk drives, optical storage devices, and solid-state storage devices, such as random-access memory (“RAM”) or read-only memory (“ROM”), as well as removable media devices, memory cards, and/or flash cards.
  • Such devices can include a computer-readable storage media reader, a communications device (e.g., a modem, a network card (wireless or wired), an infrared communication device, etc.), and working memory as described above.
  • the computer-readable storage media reader can be connected with, or configured to receive, a computer-readable storage medium, representing remote, local, fixed, and/or removable storage devices as well as storage media for temporarily and/or more permanently containing, storing, transmitting, and retrieving computer-readable information.
  • the system and various devices also typically will include a number of software applications, modules, services, or other elements located within at least one working memory device, including an operating system and application programs, such as a client application or Web browser.
  • Storage media and computer-readable media for containing code, or portions of code, can include any appropriate media known or used in the art, including storage media and communication media, such as, but not limited to, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage and/or transmission of information such as computer-readable instructions, data structures, program modules, or other data, including RAM, ROM, Electrically Erasable Programmable Read-Only Memory (“EEPROM”), flash memory or other memory technology, Compact Disc Read-Only Memory (“CD-ROM”), digital versatile disk (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a system device.
  • Suitable computing devices include multipurpose microprocessor-based computing systems accessing stored software that programs or configures the computing system from a general-purpose computing apparatus to a specialized computing apparatus implementing one or more embodiments of the present subject matter. Any suitable programming, scripting, or other type of language or combinations of languages may be used to implement the teachings contained herein in software to be used in programming or configuring a computing device.
  • [0062] Embodiments of the methods disclosed herein may be performed in the operation of such computing devices. The order of the blocks presented in the examples above can be varied; for example, blocks can be re-ordered, combined, and/or broken into sub-blocks. Certain blocks or processes can be performed in parallel.
  • Conditional language used herein such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain examples include, while other examples do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more examples or that one or more examples necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular example.
  • Disjunctive language such as the phrase “at least one of X, Y, or Z,” unless specifically stated otherwise, is otherwise understood within the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain examples require at least one of X, at least one of Y, or at least one of Z to each be present.
  • [0065] Use herein of the word “or” is intended to cover inclusive and exclusive OR conditions.
  • A or B or C includes any or all of the following alternative combinations as appropriate for a particular usage: A alone; B alone; C alone; A and B only; A and C only; B and C only; and all three of A and B and C.
  • the use of the terms “a” and “an” and “the” and similar referents in the context of describing the disclosed examples (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context.
  • the terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth.
  • the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list.
  • the use of “adapted to” or “configured to” herein is meant as open and inclusive language that does not foreclose devices adapted to or configured to perform additional tasks or steps.
  • the term “connected” is to be construed as partly or wholly contained within, attached to, or joined together, even if there is something intervening.
  • the methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed examples. Similarly, the example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed examples.
  • Certain processes are described and claimed herein. The operation of each block represents a sequence of operations that can be implemented in hardware, computer instructions, or a combination thereof. In the context of computer instructions, the operations represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by one or more processors, perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular data types.
  • Some or all of the processes described herein may be performed under the control of one or more computer systems configured with executable instructions and may be implemented as code (e.g., executable instructions, one or more computer programs, or one or more applications) executing collectively on one or more processors, by hardware, or combinations thereof.
  • the code may be stored on a computer-readable storage medium, for example, in the form of a computer program comprising a plurality of instructions executable by one or more processors.
  • the computer-readable storage medium may be non-transitory.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

Techniques described herein are directed to performing error correction for enhanced aerospace safety and navigation. For example, a carrier board used in a sensor pod system for a vehicle can include a feedback bus, a control output bus, and a processing device. The feedback bus can receive inputs from interface boards. The control output bus can provide outputs to the interface boards. The processing device can control a serial communications module that can manage operation of the feedback bus and the control output bus. The processing device can also implement an error correction system that can analyze sensor parameters received over the feedback bus. The error correction system can generate analyzed output including error corrections for the sensor parameters for outputting by the control output bus. The error correction system may determine, based on the analyzed sensor parameters, error types associated with errors with the inputs from the interface boards.

Description

PCT PATENT APPLICATION Attorney Docket No.080097-1468088 (005010WO) Client Ref. No.49918.02WO2

ADVANCED DATA ACQUISITION AND MULTIMODAL ERROR CORRECTION FOR ENHANCED AEROSPACE SAFETY AND NAVIGATION

CROSS-REFERENCES TO RELATED APPLICATIONS

[0001] This application claims the benefit of U.S. Provisional Application No.63/603,099, filed November 27, 2023, the entire contents of which are hereby incorporated for all purposes in their entirety.

BACKGROUND

[0002] Autonomous unmanned aerial vehicles (UAVs) may utilize global positioning system (GPS) data for navigation. Achieving accurate three-dimensional geo-localization in urban settings may be challenging due to poor GPS performance. For instance, recreational-grade GPS systems, which may be commonly used in drones, while cost-effective, may generate significant data errors in urban environments. Conversely, traditional aerospace-grade GPS receivers may be costly, power-intensive, and heavy, making them unsuitable for drone deployment.

BRIEF DESCRIPTION OF THE DRAWINGS

[0003] The drawings illustrate several embodiments of the present disclosure, wherein identical reference numerals refer to identical or similar elements or features in different views or embodiments shown in the drawings.

[0004] FIG.1 is an exemplary schematic diagram of a sensor pod system of a vehicle according to some aspects of the present disclosure.

[0005] FIG.2 illustrates an exemplary schematic diagram of a carrier board in the sensor pod system according to some aspects of the present disclosure.

[0006] FIG.3 is an exemplary schematic diagram of a sensor interface board in the sensor pod system according to some aspects of the present disclosure.

[0007] FIG.4 depicts an exemplary architecture for an error correction system that may be implemented on the central carrier board according to some aspects of the present disclosure.
[0008] FIG.5 shows an exemplary GPS sensor system architecture for the sensor pod system according to some aspects of the present disclosure.

[0009] FIG.6 is an exemplary circuit diagram illustrating a wiring configuration for an interface board according to some aspects of the present disclosure.

[0010] FIG.7 shows exemplary GPS data that may be processed by a sensor pod system according to some aspects of the present disclosure.

[0011] FIG.8 depicts an exemplary front face and back face of a carrier board in a sensor pod system according to some aspects of the present disclosure.

[0012] FIG.9 depicts exemplary interface boards of a sensor pod system according to some aspects of the present disclosure.

[0013] FIG.10 illustrates an exemplary flow of a process for correcting a temporal data stream of a sensor pod system for a vehicle according to some aspects of the present disclosure.

[0014] FIG.11 depicts an exemplary environment in which embodiments of the present disclosure can be performed.

DETAILED DESCRIPTION

[0015] In the following description, various embodiments will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the embodiments. However, it will also be apparent to one skilled in the art that the embodiments may be practiced without the specific details. Furthermore, well-known features may be omitted or simplified in order not to obscure the embodiment being described.

[0016] Techniques described herein are directed to, among other things, a sensor pod system for a vehicle (e.g., an unmanned aerial vehicle (UAV)) that can analyze and correct errors in sensor parameters. The error corrections can be used by a flight computer for the vehicle to control navigation of the vehicle. The sensor pod system may have interface boards that each couple to a sensor, such as global positioning system (GPS) sensors.
The interface boards may couple to a carrier board in the sensor pod system. The carrier board can receive a temporal data stream from the interface boards (e.g., from the sensors). The carrier board can determine, based on the temporal data stream, error types associated with errors in the temporal data stream and can generate a corrected temporal data stream that corrects the errors. The flight computer, which can also be coupled to the carrier board, can receive the corrected temporal data stream, and can more accurately and precisely navigate the vehicle due to the error corrections.

[0017] Turning now to a particular example, UAVs may encounter flight-related errors while airborne, such as component failures. In some examples, a UAV may include one or more sensors that can receive data, such as pressure data, GPS position data, pitch data, yaw data, roll data, or any other suitable data that can be used to detect the flight-related errors during flight. A sensor pod system can enable the UAV to detect the errors in real time, process the data, and take corrective actions based on the data. The sensor pod system can include a central carrier board that is a separate board from interface boards that couple to sensors. The interface boards may couple to the central carrier board to provide sensor data (e.g., in a temporal data stream) to the central carrier board. Because the sensors are integrated with or coupled to a board separate from the central carrier board, the sensor pod system can be modular. That is, any number or type of sensors may be used by the sensor pod system via the separate interface boards. This can enable rapid switching of sensors and updating of UAVs when new or improved sensors become available. Sensors can be updated or replaced without restructuring hardware or software for the entire sensor pod system.
[0018] Accurate localization may be crucial for vehicle autonomy, especially for UAVs operating in a three-dimensional environment where complex aerial traffic coordination is necessary. UAV navigation in densely populated or urban environments may be particularly challenging. Achieving the precision required for urban operations may be difficult due to issues such as multipath propagation and non-line-of-sight (NLOS) errors, which may significantly degrade GPS performance. In optimal open-sky conditions, GPS-based localization can achieve lateral accuracies within one meter. But in dense urban areas with obstructions (e.g., buildings, trees, etc.), accuracy may drop sharply, often resulting in horizontal errors of up to hundreds of meters. Vertical errors may further compound this issue, typically exceeding horizontal inaccuracies by at least 50%, and sometimes surpassing 100 meters. Such limitations may be significant for urban air mobility (UAM) and beyond-visual-range (BVR) applications, where reliable and precise localization may be critical. The stringent requirements of UAV operations, such as low power consumption, computational efficiency, and high-speed processing, may additionally complicate UAV navigation.

[0019] Embodiments described herein can involve the carrier board of the sensor pod system utilizing machine learning techniques to enhance the accuracy of low-cost, recreational-grade GPS receivers. Such GPS receivers, which may have lower power consumption compared to conventional aerospace-grade GPS receivers, may typically face significant performance limitations that may hinder the realization of fully autonomous flight capabilities. But by applying the machine learning techniques by the carrier board described herein, such shortcomings can be addressed by reducing GPS error.
In a particular example, GPS error can be reduced to under 2 meters in real-time on embedded microcontroller platforms during urban UAV operations, all at operational speeds exceeding 10 Hz. By specifically targeting urban challenges such as multipath effects, the carrier board executing machine learning in an error correction system described herein can offer a robust, lightweight, and scalable solution that can align with the stringent demands of UAM-BVR operations.

[0020] GPS errors may typically arise primarily from satellite-receiver geometry and clock synchronization issues, which may be captured by a pseudorange (PSR) equation and addressed through iterative Newton’s methods. Traditional approaches may rely on Taylor approximations to linearize and minimize residuals in these equations. However, such traditional methods may be highly susceptible to environmental noise, often resulting in significant inaccuracies in urban environments. In contrast, techniques described herein can improve state estimation capabilities by using machine learning onboard a carrier board of a sensor pod system for a vehicle such as a UAV to analyze the contribution of various error sources in a GPS, such as clock errors, ionospheric and tropospheric impact, multi-path and NLOS error contributions.

[0021] Existing performance-enhancing post-processing techniques for GPS sensors are conventionally limited by the real-time requirements and the noise, power, and computational constraints inherent to UAV operations. The error correction systems described herein, with their embedded microcontrollers capable of running machine learning onboard a UAV, can capture both temporal dependencies and spatial error patterns to provide real-time error correction for a flight computer.
The error correction system can control (e.g., either directly, such as by generating navigation commands based on error-corrected sensor data, or by providing error-corrected sensor data to a flight computer for the UAV) operation, such as navigation, of the UAV using the error-corrected sensor data. For example, error-corrected GPS data can be used to determine navigation commands that cause the UAV to fly in a particular direction at a particular speed. The error-corrected GPS data can be accurate enough to allow for UAV navigation in urban areas or other areas with poor GPS performance. For instance, in some examples the error correction system described herein may achieve a residual error of 1.8 meters at 10 Hz for GPS data in urban environments, which is well below a typical 3-meter error threshold. [0022] In the interest of clarity of explanation, embodiments are described herein in connection with UAVs. However, the embodiments are not limited as such. Instead, the embodiments described herein may similarly apply to other vehicles including manned aircraft, passenger cars, semi-trucks, waterborne vessels, and any other type of vehicle that includes some aspect of computer control and/or electronic signal sensing. [0023] Turning now to the figures, FIG.1 is an exemplary schematic diagram of a sensor pod system 100 of a vehicle according to some aspects of the present disclosure. The sensor pod system 100 can include one or more interface boards 102 that may each interface with one or more sensors 104. The interface boards 102 and the sensors 104 are depicted and described in further detail below with respect to FIG.3. The sensor pod system 100 can also include a central carrier board 106. A flight computer 108 (e.g., that controls flight of the vehicle housing the sensor pod system 100) and the interface boards 102 can each be coupled to the central carrier board 106.
The central carrier board 106 is depicted and described in further detail below with respect to FIG.2. [0024] The sensors 104, which may in some examples be GPS sensors, can transmit temporal sensor data to the interface boards 102. The interface boards 102 can send the temporal data to the central carrier board 106. The central carrier board 106 can execute one or more computing processes, such as executing machine learning models, which can correct temporal data received from the sensors 104 via the interface board 102. The corrected temporal data can be transmitted to a data stream 110 of a flight computer 108 that controls navigation of the vehicle. The corrected temporal data can, for example, be used by the flight computer 108 to generate output signals that can be sent to a separate flight computer or directly to a process node or to a carrier board for additional processing. [0025] FIG.2 illustrates an exemplary schematic diagram of a central carrier board 106 in the sensor pod system according to some aspects of the present disclosure. In some examples, the central carrier board 106 may be expansible and can be connected to one or more sensor interface boards, such as the interface boards 102 of FIG.1. The central carrier board 106 can include a core processor 202 that can execute one or more computing processes. For example, the core processor 202 may execute a first machine learning model 204a, which in some examples can be a multi-modal ellipsoidal correction algorithm (MEECA). The first machine learning model 204a may generate corrected temporal data. For example, the first machine learning model 204a may implement elliptical fitting and can handle multimodal sensory inputs (e.g., from multiple sensors connected to interface boards that are coupled to the central carrier board 106). The first machine learning model 204a may be capable of capturing and modeling complex relationships within dynamic, evolving environments.
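As a rough illustration of elliptical fitting over multimodal inputs (MEECA itself is not reproduced here; the residual distribution, the `fit_error_ellipsoid` helper, and all numbers below are hypothetical), an error ellipsoid can be fit to a cloud of sensor residuals by taking the mean as its center and the covariance eigenstructure as its axes:

```python
import numpy as np

def fit_error_ellipsoid(samples):
    """Fit a 1-sigma error ellipsoid to residual samples: the mean is the
    center, eigenvectors of the covariance are the axes, and the square
    roots of the eigenvalues are the semi-axis lengths (ascending)."""
    center = samples.mean(axis=0)
    eigvals, axes = np.linalg.eigh(np.cov(samples, rowvar=False))
    return center, np.sqrt(eigvals), axes

rng = np.random.default_rng(0)
# Hypothetical correlated lat/lon/alt residuals (meters)
cov = np.array([[4.0, 1.0, 0.0],
                [1.0, 9.0, 0.0],
                [0.0, 0.0, 25.0]])
samples = rng.multivariate_normal([0.5, -1.0, 2.0], cov, size=5000)
center, radii, axes = fit_error_ellipsoid(samples)
print(center)   # near (0.5, -1.0, 2.0)
print(radii)    # largest semi-axis near 5 m (the altitude direction)
```

The fitted axes separate the dominant error direction (here vertical) from the smaller horizontal ones, mirroring the observation in [0018] that vertical GPS errors typically exceed horizontal ones.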
[0026] In some examples, the first machine learning model 204a may be used to train a second machine learning model 204b. The second machine learning model 204b may be a long short-term memory (LSTM) recurrent neural network (RNN) algorithm. The second machine learning model 204b may further correct the temporal data and in some examples may generate state estimation vectors. The output from the second machine learning model 204b (and/or the first machine learning model 204a), such as the corrected temporal stream and the state estimation vectors, can be transmitted as a data stream 110 to a flight computer, such as the flight computer 108 of FIG.1. The flight computer can use the corrected temporal stream and/or the state estimation vectors to control a vehicle that includes the sensor pod system (e.g., a UAV). [0027] In some examples, the central carrier board 106 can include a serial communications module 206, such as a controller area network (CAN) module with traffic management. The central carrier board 106 can include a nodal architecture in which traffic can have priority management. That is, the central carrier board 106 can dynamically choose which messages to receive and process first. The central carrier board 106 can include both analog and digital hardware that can perform analog and digital processes on the same board. For example, the central carrier board can include current digital-to-analog converters (DACs) 212, voltage DACs 214, current analog-to-digital converters (ADCs) 216, and voltage ADCs 218. The central carrier board 106 can include a control output bus 208 and a feedback input bus 210, each of which may be expansible. [0028] The core processor 202 can receive sensor parameters 220a-n from interface boards (e.g., sensors coupled to the interface boards) via the feedback input bus 210. The serial communications module 206 can control delivery of the sensor parameters 220a-n to the core processor 202. 
The serial communications module 206 may also control output of commands from the core processor 202 to the interface boards via the control output bus 208. In this way, the core processor 202 can interface with the control output bus 208 and the feedback input bus 210. [0029] In some examples, the core processor 202 may perform one or more cleaning operations to clean the sensor parameters 220a-n. For example, signals received via the feedback input bus 210 may be processed with a low-pass filter and/or a high-pass filter to remove certain portions of the signal. In some examples, the core processor 202 may resample the signal in a spatial domain. Once cleaned, the core processor 202 can use the first machine learning model 204a and/or the second machine learning model 204b to fit an ellipsoid to a parameter 220 to analyze, separate, and measure the different errors of interest that may be associated with the parameter 220. In some examples, the errors may be observed in the spatial domain but can have a direct translation to the temporal domain (or vice versa). The core processor 202 can receive the errors and process the errors as well as temporal data therewith via the first machine learning model 204a to generate a corrected temporal stream of data for the sensor parameters 220a-n. In some examples, the corrected temporal stream of data and/or the error information can be used by the second machine learning model 204b to generate state estimation vectors (e.g., using a recurrent neural network (RNN)). In some examples, the second machine learning model 204b may also generate missing data from failed sensors using a generative neural network (GNN). The output from the machine learning models 204a-b can be output as a data stream 110 for use by a flight computer (e.g., the flight computer 108 of FIG.1). In some examples, the corrected temporal stream of data may be a multimodal stream of data including data from multiple sensors.
The elliptical fitting of the temporal stream of data performed by the first machine learning model 204a can identify relationships between data from different sensors. [0030] FIG.3 is an exemplary schematic diagram of one or more sensor interface boards 102 in the sensor pod system according to some aspects of the present disclosure. The interface boards 102 can include a microcontroller 302 and one or more sensors 104 that may be communicatively coupled to the interface boards 102. The microcontroller 302 and the sensor 104 may be housed on the same board. Additionally or alternatively, the microcontroller 302 may be coupled to the sensor 104 via an electrical interface, such as a screw terminal, a UF1 connector, or any other suitable electrical interface that can electrically connect the sensor 104 to the microcontroller 302. The interface board 102 may perform certain processing steps on sensor data received from the sensors 104. The processing steps may include digital and/or analog processing steps and may be performed on the same board. The processing steps can include power management, preconditioning, filtering, signal sampling techniques, and any other suitable processing steps. The interface board 102 may in some examples perform additional spatial resampling operations in addition to spatial resampling operations that may be performed on a main board to which the interface board 102 may be communicatively coupled. [0031] In some examples, the microcontroller 302 can be coupled to a protection circuit 306 that may protect the microcontroller 302 from power surges that may be associated with the sensors 104. The microcontroller 302 may also be coupled to a monitoring circuit 304 that may house FPGAs or other processing devices that can be dynamically activated.
The monitoring circuit 304 may generate and transmit correction commands to components on the interface board 102 or any other systems or boards that may be coupled to the interface board 102. For example, the commands may be transmitted to a central carrier board (e.g., the central carrier board 106 of FIGS.1-2) to cause certain corrective actions to be taken, such as to correct errors in a temporal data stream of sensor data from the sensors 104. The temporal data stream and/or the commands can be transmitted to the central carrier board via the feedback output bus 312 (e.g., to be received by the feedback input bus 210 of FIG.2). The microcontroller 302 may receive commands to perform certain actions via the control input bus 310 (e.g., from the control output bus 208 of the central carrier board 106 of FIG.2). In some examples, the data and/or commands transmitted or received by the interface board 102 via the control input bus 310 and/or the feedback output bus 312 can involve encryption. For example, the microcontroller 302 can encrypt data or commands and can decrypt encrypted communication received from the central carrier board (e.g., using standard data protocols). In some examples, the interface boards 102 can receive pins in a reversible configuration. The interface boards 102 may include a 64-pin configuration. [0032] FIG.4 depicts an exemplary architecture for an error correction system 403 that may be implemented on the central carrier board 106 according to some aspects of the present disclosure. The central carrier board 106 can be coupled to a GPS receiver 402 (e.g., via the interface board 102 of FIGS.1 and 3). The GPS receiver 402 may receive a GPS signal 404 from a GPS satellite 406. The GPS signal 404 can in some examples be an L1/C GPS signal. The central carrier board 106 may include a microcontroller 408, which can in some examples be the core processor 202 of FIG.2.
The microcontroller 408 may include a GPS parser 410 that can parse the GPS signal 404 to generate raw GPS data 412. For example, the GPS parser 410 may perform NMEA processing to parse and decode NMEA sentences from the GPS signal to transform the GPS signal 404 into physical values of raw GPS data 412. The GPS signal 404 and/or the raw GPS data 412 may be stored by the microcontroller 408 in local storage 414. The raw GPS data 412 can be plotted over time to create a temporal signal, which can be time series data corresponding to a measured sensor output or estimated physical variable. [0033] The raw GPS data 412 can be concatenated with historical GPS data 416 and provided as input to the error correction system 403. The error correction system 403 can, in some examples, include an environment detection model 418 that can be an environment classifier. The environment detection model 418 may classify the operational environment type, such as being an open field, semi-urban, urban, etc. The classification can inform the error correction system 403 which environment-specific model to apply for optimal sensor corrections (e.g., GPS signal corrections), which can increase accuracy and reliability in different conditions. By dynamically loading environment-specific models, the environment detection model 418 can reduce computational demands, making the operations of the central carrier board 106 suitable for real-time applications on low-power devices. [0034] The environment detection model 418 can analyze a sliding window of sensor data (e.g., raw GPS data 412), called a data window, by comparing the current raw GPS data 412 to historical patterns (e.g., historical GPS data 416) to ascertain sensor and measurement reliability. In some examples, the environment detection model 418 can be or can include a deep neural network (DNN) to be used as a classifier.
Each instance of measurement in the raw GPS data 412 can be structured as a data frame F_t. Multiple data frames can be combined to form a data window W, whose length can be controlled by the numerical parameter n. Then, W can be a 3D matrix where data frames are stacked from newest to oldest:

W = [F_t, F_{t-1}, …, F_{t-n+1}]

where F_t is the latest data frame, F_{t-n+1} is the oldest data frame, and n is the length of the window. From each data frame, a difference matrix D_t and a comparison vector B_t can be constructed by the environment detection model 418:

D_t = |F_t - F_{t-1}|,  B_{t,j} = 1(D_{t,j} < τ_j)

where τ is some threshold vector for each column derived from historical observations (e.g., the historical GPS data 416) and can be application specific. The environment detection model 418 can evaluate each GPS sensor stream individually and can first assign a reliability label to each sensor. In a particular example, τ can be set to [τ_1, τ_2, …, τ_p] for the p monitored parameters. These thresholds can be specific to each parameter and can be used to evaluate the reliability of the data frames. Each column of D_t can be compared by the environment detection model 418 against the corresponding threshold in τ. A data frame can be considered reliable if most of its columns are below the threshold:

r_t = 1[(1/p) Σ_{j=1}^{p} B_{t,j} > 0.5]

where 1(·) can be an indicator function that returns 1 if the condition is true and 0 otherwise. Then for each window W:

R_W = (1/n) Σ_{i=0}^{n-1} r_{t-i}

where R_W is a window reliability score that can be compared against γ, a tunable threshold parameter. In some examples, γ can be set at 0.5. A data window may be deemed reliable if most of its frames are reliable, that is, if R_W ≥ γ. If R_W < γ, then the latest data frame can be discarded, and the next window can be re-evaluated. If consecutive data windows are unreliable, the process may be restarted. [0035] If the data window is reliable, it can be used by the environment detection model 418 to generate an output classification. By integrating data from all sensors coupled to the central carrier board 106 (e.g., via interface boards 102), the environment detection model 418 can classify the environment, preventing a single poorly performing sensor from misleading the error correction system 403. The environment detection model 418 can determine and filter which GPS data to pass to a machine learning model 428 in the error correction system 403. [0036] The error correction system 403 can also include an inverse variance weighted filter 420 that can prioritize accurate data by dynamically weighting each GPS measurement based on its variance. This statistical approach can reduce the influence of noisy data, allowing the error correction system 403 to focus on more reliable inputs. Unlike traditional filtering techniques, such as low-pass filters (LPFs) or Kalman filters (KFs), which may involve system-specific tuning and external data, the inverse variance weighted filter 420 can operate without pre-configuration, making it more adaptable to urban environments. [0037] The inverse variance weighted filter 420 can employ a dynamically sized window of values denoted as V, characterized by a size m. Within this window, the variance of sensor measurements for each parameter from a sensor (e.g., the raw GPS data 412 from the GPS receiver 402, or any other signal from a sensor coupled to the central carrier board 106) can be analyzed.
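The frame-and-window reliability screening of paragraph [0034] can be sketched as follows (the parameter layout, threshold values, and majority rules below are illustrative assumptions for a single sensor stream):

```python
import numpy as np

def frame_reliable(frame, prev_frame, tau):
    """A frame is reliable if most parameter-wise changes stay below
    their historically derived thresholds tau."""
    below = np.abs(frame - prev_frame) < tau
    return below.mean() > 0.5

def window_reliable(frames, tau, gamma=0.5):
    """A window is reliable if at least a fraction gamma of its
    consecutive frames are individually reliable."""
    flags = [frame_reliable(frames[i], frames[i - 1], tau)
             for i in range(1, len(frames))]
    return np.mean(flags) >= gamma

# Hypothetical GPS parameters per frame: [lat_m, lon_m, hdop]
tau = np.array([3.0, 3.0, 1.0])          # assumed per-parameter thresholds
steady = [np.array([0.1, 0.2, 0.9])] * 10
print(window_reliable(steady, tau))      # True: no frame-to-frame jumps
jumpy = [np.array([i * 10.0, i * 10.0, 5.0]) for i in range(10)]
print(window_reliable(jumpy, tau))       # False: large multipath-like jumps
```

Discarding the newest frame of an unreliable window and re-evaluating, as described above, keeps a transient glitch from contaminating the environment classification.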
The sensor measurements at time t for all sensors can be represented as

X_t = [x_t^(1), x_t^(2), …, x_t^(S)]

where:

x_t^(i) = [x_{t,1}^(i), x_{t,2}^(i), …, x_{t,P}^(i)]^T

where x_{t,j}^(i) denotes parameter j of an individual sensor i (e.g., GPS receiver 402) at time t, and the measurements across all parameters of sensor i are denoted in column vector form as x_t^(i). Corrected sensor measurements (e.g., corrected GPS data 426) can be obtained by the inverse variance weighted filter 420 by adding a correction matrix C_t to X_t. Then for each parameter j within sensor i, the inverse variance weighted filter 420 can calculate the variance σ²_{i,j} over the window V. The weight for each parameter in each sensor can be inversely proportional to the variance:

w_{i,j} = k_j / (σ²_{i,j} + ε)

where k_j is some parameter-specific tunable gain parameter, initially set to 1, and ε is a small constant added to prevent division by zero. The deviation of the current measurement from its mean can be given by:

c_{i,j} = w_{i,j} (μ_{i,j} - x_{t,j}^(i))
where μ_{i,j} is the mean of parameter j of sensor i over the window V, and c_{i,j} is a correction element in C_t, which is the correction matrix (e.g., applied corrections 424) that can be applied to the latest sensor data (e.g., corrected GPS data 426). [0038] The inverse variance weighted filter 420 can be scalable and adaptable to varying sensor counts and parameters. The inverse variance weighted filter 420 can detect and correct outliers early, improving data quality for a machine learning model 428 in the error correction system 403. By applying a transparent signal processing technique, the inverse variance weighted filter 420 can ensure verifiable inputs, enhance system reliability, and aid compliance with aerospace certification standards. The traceable data treatment performed by the inverse variance weighted filter 420 can boost transparency, which may be critical for regulatory adherence before applying further machine learning techniques (e.g., by the machine learning model 428). [0039] The machine learning model 428 can include or be an example of the first machine learning model 204a and/or the second machine learning model 204b of FIG. 2. The machine learning model 428 can be one of a set of machine learning models that was selected by the environment detection model 418 based on a classification of the environment (e.g., as detected from the raw GPS data 412). The machine learning model 428 can in some examples include a recurrent neural network (RNN) and a deep neural network (DNN). The machine learning model 428 can process inputs within a structured three-dimensional matrix, denoted as M, which can encapsulate multi-sensor time-series data (e.g., including the corrected GPS data 426 from the GPS receiver 402 and any other sensor data from sensors coupled to the central carrier board 106). The matrix can be defined as:

M ∈ R^(S × P × W)

where S represents the number of sensors (e.g., GPS sensors or other types of sensors) coupled to the central carrier board 106, P signifies the number of monitored parameters per sensor, and W is the dynamically tunable window size corresponding to the number of sequential data points considered for each parameter. In a particular example, a starting value of W = 25 can be predicated on empirical observations of data indicating that a 2.5 second window can typically include sufficient data to generate counteracting error signals in an urban environment. This can allow the machine learning model 428 to capture both immediate and delayed error effects while reducing memory footprint. [0040] The input to the machine learning model 428, normalized based on individual predictor characteristics, can be flattened to a 2D matrix M′ of size (S · P) × W and fed to an LSTM of the machine learning model 428. The LSTM, f_LSTM, can predict altitude, angular, and offset errors:

e_t = f_LSTM(M′; θ_LSTM)
where e_t, the LSTM output, can be the identified temporal GPS error at the instant of measurement. The LSTM, which can be a specialized form of RNN, can capture the temporal dynamics of the GPS errors. [0041] In conjunction with the LSTM, the machine learning model 428 can include a DNN model that can composite the error output from the LSTM and handle non-sequential errors. Subsequently, these predictions can be processed through a DNN model, f_DNN, that can identify the instantaneous aperiodic and DC errors in GPS, e_DC, as shown below:

e_DC = f_DNN(e_t; θ_DNN)

where θ_DNN is a set of parameters that can include the associated weights for the input, connected, and output layers of the model, along with any associated bias values as learned by the model during the training stage. The output of the machine learning model 428 can be estimated error 430 which can include a corrected temporal stream of the GPS signal 404 (or any other sensor data received from sensors coupled to the central carrier board 106). The estimated error 430 can be output to a flight computer (e.g., the flight computer 108 of FIG. 1), which can use the estimated error 430 and the corrected temporal stream to accurately navigate a vehicle that includes the central carrier board 106. [0042] FIG.5 shows an exemplary GPS system architecture 500 for the sensor pod system according to some aspects of the present disclosure. The GPS system architecture 500 can include a first microcontroller 302a (e.g., of a first interface board) coupled to multiple (e.g., three) GPS sensors 502. Similarly, a second microcontroller 302b (e.g., of a second interface board) can be coupled to multiple (e.g., three) GPS sensors 502. The microcontrollers 302a-b can interface with a microprocessor 504 of a central carrier board. The microprocessor 504 can be an example of the core processor 202 of FIG.2. The microprocessor 504 may process simultaneous readings from some or all of the six GPS sensors 502. Examples of the microcontrollers 302a-b can include the ATmega2560 microchip. Examples of the GPS sensors 502 can include the MTK3339 PA6H GPS module. [0043] FIG.6 is an exemplary circuit diagram 600 illustrating a wiring configuration for an interface board 102 according to some aspects of the present disclosure. The wiring configuration shown in the circuit diagram 600 can include five sensors 602 connected to a microcontroller 302 of the interface board 102, but in other examples any number of sensors 602 may be included and connected.
In some examples, the microcontroller 302 can be a Portenta H7 microprocessor. In another example, the microcontroller 302 can be an Arduino MEGA 2560 microprocessor. [0044] The circuit diagram 600 can be powered by a battery power supply 604, which in some examples can include three 9V batteries, four 9V batteries, or any number or voltage of batteries that are placed in parallel. The battery power supply 604 may in some examples include rechargeable batteries. The circuit diagram 600 can additionally include a ground 606 connected to the battery power supply 604. The circuit diagram 600 can, in some examples, include an SD card 608 or other form of non-volatile memory for storage of sensor data. In some examples, some or all of the sensors 602 can be GPS sensors that can include internal passive antennas for collecting GPS sensor data. The sensors 602 can be wireless GPS receivers. In some examples, the circuit diagram 600 may additionally include an antenna system to communicate information (e.g., sensor data from the sensors 602) over Wi-Fi, Bluetooth, or any other suitable network. [0045] FIG.7 shows exemplary GPS data 700 that may be processed by a sensor pod system according to some aspects of the present disclosure. Unprocessed sensor values, as reported by a GPS sensor, may have significant errors, particularly in environments such as urban environments. The sensor values reported by GPS sensors may vary significantly from the actual position 702. By implementing the techniques described herein via the machine learning processing performed by a central carrier board of the sensor pod system, error corrected values 704 can be determined that are significantly closer to the actual position 702 compared to the original, unprocessed sensor values reported by GPS sensors on the sensor pod system.
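Returning to the inverse variance weighted filter 420 of paragraph [0037], its weighting rule can be sketched for a single sensor (the gain `k`, the constant `eps`, the clipping of weights to [0, 1], and the simulated channels are illustrative assumptions; the patent leaves these tunable):

```python
import numpy as np

def ivw_correct(window, k=1.0, eps=1e-9):
    """Correct the newest sample in a (n_samples, n_params) window.
    Each parameter's correction toward the window mean is weighted by
    w = k / (variance + eps), clipped to [0, 1] as a safeguard."""
    window = np.asarray(window, dtype=float)
    mean = window.mean(axis=0)
    w = np.clip(k / (window.var(axis=0) + eps), 0.0, 1.0)
    latest = window[-1]
    return latest + w * (mean - latest)   # correction element per parameter

rng = np.random.default_rng(1)
steady = rng.normal(10.0, 0.01, size=(25, 1))  # low-variance channel
noisy = rng.normal(10.0, 5.0, size=(25, 1))    # high-variance channel
window = np.hstack([steady, noisy])
window[-1, 0] = 12.0                            # spike in the steady channel
corrected = ivw_correct(window)
print(corrected)  # steady-channel spike pulled back near 10; noisy channel barely changed
```

Because the weight collapses when a parameter's recent variance is high, a persistently noisy channel is left largely untouched rather than over-corrected, while an isolated spike in an otherwise steady channel is pulled back toward the window mean.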
[0046] FIG.8 depicts an exemplary front face 802 and back face 804 of a carrier board in a sensor pod system according to some aspects of the present disclosure, such as the central carrier board 106 of FIGS.1, 2, and 4. The front face 802 can include an adaptor 806 that can couple to a flight computer, such as the flight computer 108 of FIG.1. Any flight computer 108 may be coupled to the carrier board with a suitable adaptor. On either side of the adaptor 806, the front face 802 of the carrier board can include slots 808 for interface boards (e.g., the interface boards 102 of FIGS.1 and 3). The carrier board may support any number or type of sensors via the interface boards coupled via the slots 808. The back face 804 of the carrier board can include power sources 810 that can power components of the carrier board. In some examples, five power sources 810 can provide up to 80 watts of power to the carrier board. In another example, eight power sources 810 can provide up to 110 watts of power to the carrier board. Thus, relatively high levels of power can be utilized compared to conventional systems for UAVs. [0047] FIG.9 depicts exemplary interface boards 902a-b of a sensor pod system according to some aspects of the present disclosure. Interface boards 902a-b can be examples of the interface boards 102 of FIGS.1, 3, and 6. The interface boards 902a-b can include buses 904a-b that can be examples of the control input bus 310 and/or the feedback output bus 312 of FIG.3. In some examples, the buses 904a-b may also provide power to the interface boards 902a-b. The buses 904a-b can couple to a central carrier board 106, such as to the slots 808 of FIG.8. Sensors 906a-b can couple to the interface boards 902a-b. For example, the sensors 906a-b may be directly mounted to the interface boards 902a-b themselves, or may have a wired connection to the interface boards 902a-b. 
The interface boards 902a-b can each include a microcontroller 908a-b that can control the flow of data (e.g., from the sensors 906a-b) to a carrier board (e.g., via the buses 904a-b). [0048] FIG.10 illustrates an exemplary flow of a process 1000 for correcting a temporal data stream of a sensor pod system for a vehicle according to some aspects of the present disclosure. The process 1000 of FIG.10, and any other processes described herein, are illustrated as logical flow diagrams, the operation of which represents a sequence of operations that can be implemented in hardware, computer instructions, or a combination thereof. In the context of computer instructions, the operations may represent computer-executable instructions stored on one or more non-transitory computer-readable storage media that, when executed by one or more processors, perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures and the like that perform particular functions or implement particular data types. Some or all of the process 1000 can be performed by any suitable combination of hardware and/or software, such as by any components of the sensor pod system 100 (e.g., flight computer 108, central carrier board 106, interface board 102, sensors 104), core processor 202, microcontroller 302, microcontroller 408, microprocessor 504, microcontrollers 908a-b, or computer system 1100. [0049] Block 1002 of the process 1000 involves a computer system receiving, from one or more interface boards in a sensor pod system, a temporal data stream. The computer system can in some examples be housed on a central carrier board of the sensor pod system. The one or more interface boards can be separate from the central carrier board and coupled to the central carrier board. Sensor data from one or more sensors coupled to the one or more interface boards can be received by the interface boards.
The interface board can process the sensor data to define sensor parameters. These sensor parameters can be provided to a feedback bus that can provide the sensor parameters as a temporal data stream to the central carrier board. [0050] In some examples, the one or more interface boards can additionally include a control input bus that can receive inputs from the carrier board and a microcontroller device communicatively coupled to the control input bus, the feedback bus, and the one or more sensors. The microcontroller device can receive and process the sensor data and can provide the sensor parameters to the feedback bus for delivery to the carrier board. In some examples, the one or more sensors and the microcontroller device can be housed on a single board. In other examples, the control input bus, the feedback bus, and the microcontroller device can be housed on a board (e.g., the one or more interface boards), and the one or more sensors can be separate from the board. In some examples, the carrier board can include a feedback bus that can receive inputs from the one or more interface boards. The carrier board can also include a control output bus that can provide outputs (e.g., commands) to the one or more interface boards. [0051] Block 1004 of the process 1000 involves the computer system determining, based on the temporal data stream, one or more error types associated with one or more errors associated with the temporal data stream. The one or more error types can be determined via an error correction system. The error correction system may also generate the corrected temporal data stream based on the identified error types. In some examples, the error correction system can be used to train an onboard machine learning model based on the corrected temporal data. The onboard machine learning model can in some examples be a recurrent neural network.
The trained onboard machine learning model can generate one or more state estimation vectors (e.g., orientation, velocity, or other state information) of the sensor pod system based on the corrected temporal data stream. [0052] Block 1006 of the process 1000 involves the computer system generating, based on the temporal data stream and the one or more error types, a corrected temporal data stream that corrects the one or more errors. In some examples, the onboard machine learning model may be a generative neural network (GNN). The onboard machine learning model may determine, based on the one or more error types, missing data from a sensor failure in the temporal data stream. The onboard machine learning model can be used to generate, based on the temporal data stream and the one or more error types, the corrected temporal data stream that includes the missing data. In some examples, the corrected temporal data stream can be transmitted to a flight computer. The flight computer can use the corrected temporal data stream to control a vehicle that includes the sensor pod system. The vehicle can, in some examples, be a UAV. In some examples, the flight computer may also receive the one or more state estimation vectors and may control the vehicle based on the one or more state estimation vectors. [0053] FIG.11 depicts example components of a computer system 1100 in which embodiments of the present disclosure can be performed. The computer system 1100 can be used as a node in a computer network, where this node provides one or more computing components of an underlay network of the computer network and/or one or more computing components of an overlay network of the computer network. Additionally or alternatively, the components of the computer system 1100 can be used in an endpoint. Although the components of the computer system 1100 are illustrated as belonging to a same system, the computer system 1100 can also be distributed (e.g., between multiple user devices).
The computer system 1100 can be an example of components of the sensor pod system 100 (e.g., flight computer 108, central carrier board 106, interface board 102, sensors 104), core processor 202, microcontroller 302, microcontroller 408, microprocessor 504, or microcontrollers 908a-b. [0054] The computer system 1100 can include at least a processor 1102, a memory 1104, a storage device 1106, input/output peripherals (I/O) 1108, communication peripherals 1110, and an interface bus 1112. The interface bus 1112 is configured to communicate, transmit, and transfer data, controls, and commands among the various components of the computer system 1100. The memory 1104 and the storage device 1106 include computer-readable storage media, such as RAM, ROM, electrically erasable programmable read-only memory (EEPROM), hard drives, CD-ROMs, optical storage devices, magnetic storage devices, electronic non-volatile computer storage (e.g., Flash® memory), and other tangible storage media. Any of such computer-readable storage media can be configured to store instructions or program codes embodying aspects of the disclosure. The memory 1104 and the storage device 1106 also include computer-readable signal media. A computer-readable signal medium includes a propagated data signal with computer-readable program code embodied therein. Such a propagated signal takes any of a variety of forms including, but not limited to, electromagnetic, optical, or any combination thereof. A computer-readable signal medium includes any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use in connection with the computer system 1100. [0055] Further, the memory 1104 includes an operating system, programs, and applications.
The processor 1102 is configured to execute the stored instructions and includes, for example, a logical processing unit, a microprocessor, a digital signal processor, and other processors. The memory 1104 and/or the processor 1102 can be virtualized and can be hosted within another computer system of, for example, a cloud network or a data center. The I/O peripherals 1108 include user interfaces, such as a keyboard, screen (e.g., a touch screen), microphone, speaker, other input/output devices, and computing components, such as graphical processing units, serial ports, parallel ports, universal serial buses, and other input/output peripherals. The I/O peripherals 1108 are connected to the processor 1102 through any of the ports coupled to the interface bus 1112. The communication peripherals 1110 are configured to facilitate communication between the computer system 1100 and other systems over a communications network and include, for example, a network interface controller, modem, wireless and wired interface cards, antenna, and other communication peripherals. [0056] The computer system 1100 can include a variety of data stores and other memory and storage media as discussed above. These can reside in a variety of locations, such as on a storage medium local to (and/or resident in) one or more of the computers or remote from any or all of the computers across the network. In a particular set of embodiments, the information may reside in a storage-area network (“SAN”) familiar to those skilled in the art. Similarly, any necessary files for performing the functions attributed to the computers, servers, or other network devices may be stored locally and/or remotely, as appropriate.
Where a system includes computerized devices, each such device can include hardware elements that may be electrically coupled via a bus, the elements including, for example, at least one central processing unit (“CPU”), at least one input device (e.g., a mouse, keyboard, controller, touch screen, or keypad), and at least one output device (e.g., a display device, printer, or speaker). Such a system may also include one or more storage devices, such as disk drives, optical storage devices, and solid-state storage devices, such as random-access memory (“RAM”) or read-only memory (“ROM”), as well as removable media devices, memory cards, and/or flash cards. [0057] Such devices also can include a computer-readable storage media reader, a communications device (e.g., a modem, a network card (wireless or wired), an infrared communication device, etc.), and working memory as described above. The computer-readable storage media reader can be connected with, or configured to receive, a computer-readable storage medium, representing remote, local, fixed, and/or removable storage devices as well as storage media for temporarily and/or more permanently containing, storing, transmitting, and retrieving computer-readable information. The system and various devices also typically will include a number of software applications, modules, services, or other elements located within at least one working memory device, including an operating system and application programs, such as a client application or Web browser. It should be appreciated that alternate embodiments may have numerous variations from that described above. For example, customized hardware might also be used and/or particular elements might be implemented in hardware, software (including portable software, such as applets), or both. Further, connection to other computing devices such as network input/output devices may be employed.
[0058] Storage media and computer-readable media for containing code, or portions of code, can include any appropriate media known or used in the art, including storage media and communication media, such as, but not limited to, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage and/or transmission of information such as computer-readable instructions, data structures, program modules, or other data, including RAM, ROM, Electrically Erasable Programmable Read-Only Memory (“EEPROM”), flash memory or other memory technology, Compact Disc Read-Only Memory (“CD-ROM”), digital versatile disk (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a system device. Based on the disclosure and teachings provided herein, a person of ordinary skill in the art will appreciate other ways and/or methods to implement the various embodiments. [0059] While the present subject matter has been described in detail with respect to specific embodiments thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing, may readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, it should be understood that the present disclosure has been presented for purposes of example rather than limitation, and does not preclude inclusion of such modifications, variations, and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art. Indeed, the methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions, and changes in the form of the methods and systems described herein may be made without departing from the spirit of the present disclosure.
The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the present disclosure. [0060] Unless specifically stated otherwise, it is appreciated that throughout this specification discussions utilizing terms such as “processing,” “computing,” “calculating,” “determining,” and “identifying” or the like refer to actions or processes of a computing device, such as one or more computers or a similar electronic computing device or devices, that manipulate or transform data represented as physical electronic or magnetic quantities within memories, registers, or other information storage devices, transmission devices, or display devices of the computing platform. [0061] The system or systems discussed herein are not limited to any particular hardware architecture or configuration. A computing device can include any suitable arrangement of components that provide a result conditioned on one or more inputs. Suitable computing devices include multipurpose microprocessor-based computing systems accessing stored software that programs or configures the computing system from a general-purpose computing apparatus to a specialized computing apparatus implementing one or more embodiments of the present subject matter. Any suitable programming, scripting, or other type of language or combinations of languages may be used to implement the teachings contained herein in software to be used in programming or configuring a computing device. [0062] Embodiments of the methods disclosed herein may be performed in the operation of such computing devices. The order of the blocks presented in the examples above can be varied; for example, blocks can be re-ordered, combined, and/or broken into sub-blocks. Certain blocks or processes can be performed in parallel.
[0063] Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain examples include, while other examples do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more examples or that one or more examples necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular example. [0064] Disjunctive language such as the phrase “at least one of X, Y, or Z,” unless specifically stated otherwise, is otherwise understood within the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain examples require at least one of X, at least one of Y, or at least one of Z to each be present. [0065] Use herein of the word “or” is intended to cover inclusive and exclusive OR conditions. In other words, A or B or C includes any or all of the following alternative combinations as appropriate for a particular usage: A alone; B alone; C alone; A and B only; A and C only; B and C only; and all three of A and B and C. [0066] The use of the terms “a” and “an” and “the” and similar referents in the context of describing the disclosed examples (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. 
The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. The use of “adapted to” or “configured to” herein is meant as open and inclusive language that does not foreclose devices adapted to or configured to perform additional tasks or steps. The term “connected” is to be construed as partly or wholly contained within, attached to, or joined together, even if there is something intervening. Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. Additionally, the use of “based on” is meant to be open and inclusive, in that a process, step, calculation, or other action “based on” one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Similarly, the use of “based at least in part on” is meant to be open and inclusive, in that a process, step, calculation, or other action “based at least in part on” one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting. [0067] The various features and processes described above may be used independently of one another or may be combined in various ways.
All possible combinations and sub-combinations are intended to fall within the scope of the present disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed examples. Similarly, the example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed examples. [0068] All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein. [0069] Certain processes are described and claimed herein. The operation of each block represents a sequence of operations that can be implemented in hardware, computer instructions, or a combination thereof. In the context of computer instructions, the operations represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by one or more processors, perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular data types.
The order in which the operations are described is not intended to be construed as a limitation, and any number of the described operations can be omitted or combined in any order and/or in parallel to implement the processes. [0070] Some or all of the processes described herein may be performed under the control of one or more computer systems configured with executable instructions and may be implemented as code (e.g., executable instructions, one or more computer programs, or one or more applications) executing collectively on one or more processors, by hardware, or combinations thereof. The code may be stored on a computer-readable storage medium, for example, in the form of a computer program comprising a plurality of instructions executable by one or more processors. The computer-readable storage medium may be non-transitory.

Claims

WHAT IS CLAIMED IS: 1. A carrier board for use in a sensor pod system, comprising: a feedback bus configured to receive inputs from a plurality of interface boards; a control output bus configured to provide outputs to the plurality of interface boards; and a processing device configured to: control a serial communications module configured to manage operation of the feedback bus and the control output bus; and implement an error correction system configured to: analyze sensor parameters received over the feedback bus; and generate analyzed output comprising error corrections for the sensor parameters for outputting by the control output bus.
2. The carrier board of claim 1, wherein the processing device is further configured to: implement the error correction system to determine, based on the analyzed sensor parameters, one or more error types associated with one or more errors with the inputs from the plurality of interface boards; implement the error correction system to generate, based on the analyzed sensor parameters and the one or more error types, a corrected temporal data stream that corrects the one or more errors; and transmit, to a flight computer coupled to the sensor pod system, the analyzed output comprising the corrected temporal data stream, wherein the corrected temporal data stream is usable by the flight computer to control a vehicle comprising the sensor pod system.
3. The carrier board of claim 2, wherein the processing device is further configured to: implement an onboard machine learning model to generate one or more state estimation vectors based on the corrected temporal data stream; and transmit, to the flight computer, the analyzed output comprising the one or more state estimation vectors, wherein the one or more state estimation vectors are usable by the flight computer to control a vehicle comprising the sensor pod system.
4. The carrier board of claim 3, wherein the onboard machine learning model comprises a recurrent neural network.
5. The carrier board of claim 1, wherein the processing device is configured to interface with the feedback bus and the control output bus.
6. The carrier board of claim 1, wherein the processing device is further configured to: determine, based on the sensor parameters, missing data from a sensor failure; and use the error correction system to generate the missing data.
7. A sensor pod system comprising: one or more interface boards, each interface board being configured to couple to one or more sensors; and a carrier board comprising: a processing device; and a memory having instructions that are executable by the processing device for causing the processing device to: receive, from the one or more interface boards, a temporal data stream; determine, based on the temporal data stream, one or more error types associated with one or more errors associated with the temporal data stream; and generate, based on the temporal data stream and the one or more error types, a corrected temporal data stream that corrects the one or more errors.
8. The sensor pod system of claim 7, wherein the memory further comprises instructions that are executable by the processing device for causing the processing device to: transmit the corrected temporal data stream to a flight computer, wherein the corrected temporal data stream is usable by the flight computer to control a vehicle comprising the sensor pod system.
9. The sensor pod system of claim 7, wherein the memory further comprises instructions that are executable by the processing device for causing the processing device to: determine the one or more error types and generate the corrected temporal data stream via an error correction system; and train, using the error correction system, an onboard machine learning model based on the corrected temporal data stream.
10. The sensor pod system of claim 9, wherein the memory further comprises instructions that are executable by the processing device for causing the processing device to: implement the trained onboard machine learning model to generate one or more state estimation vectors based on the corrected temporal data stream; and transmit the one or more state estimation vectors to a flight computer, wherein the one or more state estimation vectors are usable by the flight computer to control a vehicle comprising the sensor pod system.
11. The sensor pod system of claim 7, wherein the carrier board further comprises: a feedback bus configured to receive inputs from the one or more interface boards; and a control output bus configured to provide outputs to the one or more interface boards.
12. The sensor pod system of claim 11, wherein each interface board of the one or more interface boards comprises: a control input bus configured to receive inputs from the carrier board; a feedback bus configured to provide outputs to the carrier board; and a microcontroller device communicatively coupled to the control input bus, the feedback bus, and the one or more sensors, and configured to: receive sensor data from the one or more sensors; process the sensor data to define sensor parameters; and provide the sensor parameters to the feedback bus for delivery to the carrier board.
13. The sensor pod system of claim 12, wherein the one or more sensors and the microcontroller device are housed on a single board.
14. The sensor pod system of claim 12, wherein the control input bus, the feedback bus, and the microcontroller device are housed on a board, and wherein the one or more sensors are separate from the board.
15. A computer-implemented method comprising: receiving, from one or more interface boards in a sensor pod system, a temporal data stream; determining, based on the temporal data stream, one or more error types associated with one or more errors associated with the temporal data stream; and generating, based on the temporal data stream and the one or more error types, a corrected temporal data stream that corrects the one or more errors.
16. The computer-implemented method of claim 15, further comprising: transmitting the corrected temporal data stream to a flight computer, wherein the corrected temporal data stream is usable by the flight computer to control a vehicle comprising the sensor pod system.
17. The computer-implemented method of claim 15, further comprising: determining the one or more error types and generating the corrected temporal data stream via an error correction system; and training, using the error correction system, an onboard machine learning model based on the corrected temporal data stream.
18. The computer-implemented method of claim 17, further comprising: implementing the trained onboard machine learning model to generate one or more state estimation vectors based on the corrected temporal data stream; and transmitting the one or more state estimation vectors to a flight computer, the one or more state estimation vectors being usable by the flight computer to control a vehicle comprising the sensor pod system.

19. The computer-implemented method of claim 15, further comprising: determining, based on the one or more error types, missing data from a sensor failure in the temporal data stream; and generating, based on the temporal data stream and the one or more error types, the corrected temporal data stream that includes the missing data.

20. The computer-implemented method of claim 15, wherein the one or more interface boards comprise a microcontroller device configured to: receive sensor data from one or more sensors coupled to the one or more interface boards; process the sensor data to define sensor parameters; and provide the sensor parameters for delivery to a feedback bus configured to provide the sensor parameters as the temporal data stream.
PCT/US2024/057100 2023-11-27 2024-11-22 Advanced data acquisition and multimodal error correction for enhanced aerospace safety and navigation Pending WO2025117377A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363603099P 2023-11-27 2023-11-27
US63/603,099 2023-11-27

Publications (1)

Publication Number Publication Date
WO2025117377A1 true WO2025117377A1 (en) 2025-06-05

Family

ID=95897837


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190385339A1 (en) * 2015-05-23 2019-12-19 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
US20200201312A1 (en) * 2018-12-21 2020-06-25 The Boeing Company Sensor fault detection and identification using residual failure pattern recognition
US20200319000A1 (en) * 2019-04-08 2020-10-08 Endress+Hauser Conducta Gmbh+Co. Kg Method for correcting measurement data of an analysis sensor and analysis sensor with correction of measurement data
US20200401136A1 (en) * 2017-03-23 2020-12-24 Tesla, Inc. Data synthesis for autonomous control systems


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24898555

Country of ref document: EP

Kind code of ref document: A1