
CN111121815A - Path display method and system based on AR-HUD navigation and computer storage medium - Google Patents

Path display method and system based on AR-HUD navigation and computer storage medium Download PDF

Info

Publication number
CN111121815A
CN111121815A (application CN201911378417.0A)
Authority
CN
China
Prior art keywords
navigation
path
data
points
hud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911378417.0A
Other languages
Chinese (zh)
Other versions
CN111121815B (en)
Inventor
孙欣然
李万超
旷璨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Lilong Zhongbao Intelligent Technology Co ltd
Original Assignee
Chongqing Lilong Technology Industry Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Lilong Technology Industry Group Co Ltd filed Critical Chongqing Lilong Technology Industry Group Co Ltd
Priority to CN201911378417.0A priority Critical patent/CN111121815B/en
Publication of CN111121815A publication Critical patent/CN111121815A/en
Application granted granted Critical
Publication of CN111121815B publication Critical patent/CN111121815B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3407Route searching; Route guidance specially adapted for specific applications
    • G01C21/3415Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3605Destination input or retrieval
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3635Guidance using 3D or perspective road maps
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3667Display of a road map
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3691Retrieval, searching and output of information related to real-time traffic, weather, or environmental conditions
    • G01C21/3694Output thereof on a road map
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Environmental & Geological Engineering (AREA)
  • Environmental Sciences (AREA)
  • Navigation (AREA)

Abstract


Figure 201911378417

The invention discloses a path display method and system based on AR-HUD navigation and a computer storage medium, belonging to the technical field of vehicle navigation, comprising the steps of: S1, acquiring navigation data of the navigation path, the navigation data including a distance offset and the corresponding gradient and rotation; S2, generating key points of the navigation path from the navigation data and calculating the three-dimensional positions of the key points in vehicle-body coordinates; S3, drawing the three-dimensional graphics of the navigation arrows from the three-dimensional positions and projecting them to the head-up display. The invention overcomes the technical problems of the prior art that a navigation trajectory cannot be generated in real time according to gradient and rotation, so that the navigation arrow does not fit the actual road conditions. The path display method, system and computer storage medium based on AR-HUD navigation provided by the invention dynamically display a plurality of navigation arrow icons on a navigation path that fits the actual road conditions, making the display more intuitive.


Description

Path display method and system based on AR-HUD navigation and computer storage medium
Technical Field
The invention relates to the technical field of vehicle navigation, in particular to a path display method and system based on AR-HUD navigation and a computer storage medium.
Background
The continuous development of automobile technology has connected people and cars ever more closely, and the demands on vehicle functions keep growing. In existing vehicle navigation technology, navigation is mostly performed through an application installed in the in-vehicle system. Another approach captures scene images shot by the vehicle in real time and marks arrows on the captured images to indicate the road the vehicle should take; however, this requires the driver to watch the navigation information on the head-unit screen, which easily distracts the driver and makes driving unsafe.
In the prior art, arrow graphics are projected onto the front windshield by AR-HUD technology to indicate the direction of travel. However, the arrows generated in this way cannot follow a navigation trajectory generated in real time from the gradient and rotation, so the navigation arrows do not fit the actual road conditions. Multi-segment paths whose segments have different gradients and rotations cannot be handled, and computing the whole path is complex and time-consuming.
Disclosure of Invention
The invention aims to solve the technical problems in the prior art that a navigation trajectory cannot be generated in real time according to gradient and rotation, so that the navigation arrow does not fit the actual road conditions.
In order to achieve the above purpose, the invention provides the following technical scheme:
In one aspect, the invention provides a path display method based on AR-HUD navigation, which comprises the following steps: S1, acquiring navigation data of the navigation path, the navigation data including a distance offset and the corresponding gradient and rotation; S2, generating key points of the navigation path from the navigation data and calculating the three-dimensional positions of the key points in vehicle-body coordinates; and S3, drawing a three-dimensional graphic of the navigation arrow from the three-dimensional positions and projecting it to the head-up display.
Further, the step of S2 specifically includes: s21, generating key points of the navigation path according to the navigation data, and calculating the key point index to be drawn; s22, counting skipped points in two adjacent key points to generate non-key point data; and S23, calculating the three-dimensional coordinates of the key points according to the non-key point data.
Preferably, the step of S21 specifically includes: s211, calculating the number of path points according to the navigation path and the key points; s212, calculating the distance between the navigation arrows according to the distance between the adjacent path points and the number of the navigation arrows; and S213, generating a path reference point and a key point index according to the number of the path points and the distance of the navigation arrow.
Preferably, the step of S23 specifically includes: s231, generating a three-dimensional coordinate of the previous key point according to the navigation data of the previous key point; s232, calculating the three-dimensional coordinates of the key points according to the three-dimensional coordinates of the previous key point and the non-key point data.
Further, the method further includes S4: and constructing a view matrix under the vehicle body coordinates, monitoring the movement track of human eyes, and updating the view matrix according to the movement track of the human eyes.
Further, the step of S4 specifically includes: s41, constructing a view matrix under the vehicle body coordinates through a vehicle-mounted camera; s42, collecting the pupil movement distance of the user to generate the eye movement track; and S43, updating the view matrix according to the human eye motion trail.
In another aspect, the present invention further provides a route display system based on AR-HUD navigation, the system including the following modules: the data acquisition module is used for acquiring distance offset of the navigation path and navigation data of corresponding gradient and rotation degree; the position calculation module is used for generating key points of a navigation path according to the navigation data and calculating the three-dimensional positions of the key points under the vehicle body coordinates; and the graph generation module is used for drawing a three-dimensional graph of a navigation arrow according to the three-dimensional position and projecting the three-dimensional graph to the head-up display.
Preferably, the position calculation module includes the following units: the index calculating unit is used for calculating the number of the drawn key points and the index of the key points required to be drawn; the data counting unit is used for counting skipped points in two adjacent key points to generate non-key point data; and the coordinate generating unit is used for calculating the three-dimensional coordinates of the key points according to the non-key point data.
Further, the system further comprises an eye movement updating module, wherein the eye movement updating module is used for constructing a view matrix under the vehicle body coordinates, monitoring the movement track of human eyes and updating the view matrix according to the movement track of the human eyes; the eye movement updating module specifically comprises the following units: the matrix construction unit is used for constructing a view matrix under the vehicle body coordinates through a vehicle-mounted camera; the eye movement detection unit is used for collecting the pupil movement distance of the user and generating an eye movement track; and the view updating unit is used for updating the view matrix according to the human eye motion track.
Meanwhile, the invention also provides a computer storage medium, which stores a computer program, and the computer program realizes the steps of the AR-HUD navigation-based path display method when being executed by a processor.
Compared with the prior art, the invention has the beneficial effects that:
according to the method and the system for displaying the path based on AR-HUD navigation, provided by the invention, the technical problems that the navigation track cannot be generated in real time according to the gradient and the rotation degree and the navigation arrow is not attached to the actual road condition in the prior art are solved by acquiring and calculating the navigation data, a plurality of navigation arrow icons are dynamically displayed on the navigation path attached to the actual road condition, so that the display is more visual, when a plurality of navigation icons are used for drawing the dynamic path of a complex road, only the posture information of the number of the used navigation icons needs to be calculated, and the coordinate of the whole path does not need to be calculated; meanwhile, the method for calculating the navigation data can solve the technical problems that a plurality of sections of paths with different gradients and rotation degrees cannot be calculated and the calculation of the whole path is time-consuming in the prior art, can simulate a complex navigation path with less resource consumption and improve the running speed; the invention also can solve the technical problem that the image can not change along with the change of the visual angle position of the human eye in the prior art by collecting the human eye movement track of the driver and updating the view in real time, can calculate the navigation path according to the real visual angle of the driver and improve the navigation accuracy.
Drawings
FIG. 1 is a schematic flow chart of a method for displaying a route based on AR-HUD navigation according to the present invention;
FIG. 2 is another schematic flow chart of a method for displaying a route based on AR-HUD navigation according to the present invention;
FIG. 3 is a schematic structural diagram of a route display system based on AR-HUD navigation according to the present invention;
FIG. 4 is a schematic diagram of another structure of a route display system based on AR-HUD navigation according to the present invention;
FIG. 5 is an effect diagram of a method and system for displaying a route based on AR-HUD navigation according to the present invention.
Detailed Description
The present invention will be described in further detail with reference to examples and embodiments. It should be understood that the scope of the above-described subject matter is not limited to the following examples, and any techniques implemented based on the disclosure of the present invention are within the scope of the present invention.
The invention relates to a path display method and system based on AR-HUD navigation and a computer storage medium. Combined with AR-HUD projection technology, the complete AR-HUD projection module accurately projects a virtual image into the real environment so that the driver can drive better. The specific embodiments are as follows:
first embodiment
Fig. 1 is a flowchart illustrating a path display method based on AR-HUD navigation according to an exemplary embodiment.
Referring to fig. 1, a route display method based on AR-HUD navigation in this embodiment includes the following steps:
In step S1, navigation data of the navigation path are acquired: the distance offset and the corresponding gradient and rotation.
This step acquires the navigation data of the road so that they can be processed in the following steps. The navigation data are obtained from existing navigation applications, such as Google Maps, Gaode Maps (Amap), and Baidu Maps, which are conventional navigation applications. The acquired navigation data comprise the distance offset and the corresponding gradient and rotation, laying the groundwork for solving the prior-art problems that a navigation trajectory cannot be generated in real time from gradient and rotation and that the navigation arrow does not fit the actual road conditions.
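The shape of the navigation data described above can be sketched as follows. This is a minimal illustration only: the class and field names (`NavSample`, `distance_offset`, `gradient`, `rotation`) and the units are assumptions, since the patent states only that the data comprise a distance offset with a corresponding gradient and rotation.

```python
from dataclasses import dataclass

@dataclass
class NavSample:
    """One sample of navigation data for a stretch of the route.

    Names and units are illustrative; the patent only states that the data
    comprise a distance offset with a corresponding gradient and rotation.
    """
    distance_offset: float  # metres travelled along the path since the last sample
    gradient: float         # road slope, e.g. in degrees
    rotation: float         # heading change relative to the previous sample, degrees

# Example: a route that goes straight, climbs, then turns right.
route = [
    NavSample(distance_offset=2.0, gradient=0.0, rotation=0.0),
    NavSample(distance_offset=2.0, gradient=5.0, rotation=0.0),
    NavSample(distance_offset=2.0, gradient=0.0, rotation=15.0),
]
```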
And step S2, generating key points of the navigation path according to the navigation data, and calculating the three-dimensional positions of the key points under the vehicle body coordinates.
This step selects the key points of the navigation path. A path to be navigated carries directional guidance such as going straight, turning right, climbing, or entering a roundabout; the points at which the vehicle changes its direction of travel are taken as the key points. When several navigation icons draw the dynamic path of a complex road, only the information for the icons actually used is computed rather than the coordinates of the whole path, so a complex navigation path can be simulated with little resource consumption. The specific operations are as follows:
step S21, calculating the number of path points, path reference points and the index of the key points to be drawn according to the key points. In this step, in order to obtain the index of the key points to be drawn, a navigation arrow which can navigate the point is required before each key point, for example, when on a multi-lane, there are signposts such as turn around, turn right, and go straight on the guidepost, and the arrows are drawn on the path points. The specific operation steps are as follows:
step S211, calculating the number of path points according to the navigation path and the key points.
A route to be travelled contains points at which the vehicle changes direction; these are the key points. The drawn navigation icon, however, must lie in front of the key point to warn the driver of the coming turn, so the points at which the navigation signs are drawn, i.e. the path points, are introduced. Several path points lie between adjacent key points, with a minimum spacing of generally 2 meters, so the number of path points is obtained by dividing the distance between adjacent key points by the minimum point spacing. Applying this along the navigation path and its key points yields the number of path points over the whole route to be navigated.
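The division described above can be sketched as follows. The 2 m minimum spacing is taken from the text; the function name is our own.

```python
def count_path_points(segment_length_m: float, min_spacing_m: float = 2.0) -> int:
    """Step S211: number of path points on one segment between adjacent key
    points, obtained by dividing the segment length by the minimum spacing."""
    return int(segment_length_m // min_spacing_m)

# A 50 m segment between two key points yields 25 path points at 2 m spacing.
points = count_path_points(50.0)
```

Summing this count over all segments gives the number of path points for the whole route.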
And step S212, calculating the distance of the navigation arrows according to the distance between the adjacent path points and the number of the navigation arrows.
The spacing between the navigation arrows is calculated from the number of navigation arrows presented on the UI. With the number of path points n obtained in step S211 and the number of navigation arrows a, the arrow spacing is s = n / a.
Step S213, generating a path reference point and a keypoint index according to the number of path points and the distance of the navigation arrows.
Step S212 yields the spacing between the navigation arrows, and the path reference points are obtained from this spacing and the path points. The first path point is taken as path reference point n1; one arrow spacing later comes the second reference point n2, i.e. n2 = n1 + s; the third reference point n3 and the fourth n4 are generated in the same way, and so on up to the next key point. The generated path reference points are collected together to form the key point index.
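The arithmetic of steps S212 and S213 can be illustrated as follows. This is a sketch of the stated rule (s = n / a, then n2 = n1 + s, n3 = n2 + s, ...); the function names, the integer division, and the choice of 0 as the first reference point are our assumptions.

```python
def arrow_spacing(num_path_points: int, num_arrows: int) -> int:
    """Step S212: spacing s between navigation arrows, s = n / a
    (rounded down to a whole number of path points)."""
    return num_path_points // num_arrows

def keypoint_index(num_path_points: int, num_arrows: int, first_ref: int = 0) -> list[int]:
    """Step S213: successive path reference points n1, n2 = n1 + s,
    n3 = n2 + s, ... collected together into the key point index."""
    s = arrow_spacing(num_path_points, num_arrows)
    return [first_ref + i * s for i in range(num_arrows)]

# 40 path points and 4 arrows give spacing 10 and references [0, 10, 20, 30].
refs = keypoint_index(40, 4)
```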
Step S22, the skipped path points in the two adjacent path reference points are counted, and non-critical point data is generated.
Step S21 yields the path reference points; this step counts the path points skipped between adjacent reference points, i.e. between n1 and n2, between n2 and n3, and between n3 and n4. Only the values at the reference points n1, n2, n3, and n4 are needed to draw the navigation arrows. The counting is done with a loop over the total number of path points n — for (i = 0; i < n; i++) — adding 1 to the counter whenever i is not equal to the index of any navigation arrow on the total path (n1, n2, n3, n4).
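The counting loop described above (garbled in the translation) amounts to iterating over all path points and incrementing a counter for every index that is not one of the arrow reference indices. A sketch, with names of our own choosing:

```python
def count_skipped(num_path_points: int, reference_indices: list[int]) -> int:
    """Step S22: count path points that are skipped, i.e. not used as an
    arrow reference point (the non-key-point data)."""
    refs = set(reference_indices)
    skipped = 0
    for i in range(num_path_points):  # for (i = 0; i < n; i++)
        if i not in refs:
            skipped += 1
    return skipped

# With 40 points and 4 reference points, 36 points are skipped.
n_skipped = count_skipped(40, [0, 10, 20, 30])
```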
Step S23, calculating the three-dimensional coordinates of the path reference point based on the non-keypoint data.
The parameters from which a later path reference point's three-dimensional coordinate is computed are relative values with respect to the current path point, and computation cost must be saved, so the three-dimensional coordinates are computed by matrix operations on the path reference points only. The specific steps are as follows:
step S231, generating three-dimensional data according to the navigation data of the previous path reference point and the three-dimensional coordinates of the previous path reference point.
To compute the three-dimensional coordinate of path reference point n2, the three-dimensional coordinate of n1 and the navigation parameters of n2 relative to n1, such as rotation and displacement, must be acquired. This step acquires the three-dimensional coordinate of n1 and those navigation parameters and then generates the three-dimensional data from them.
Step S232, calculating the three-dimensional position of the path reference point according to the three-dimensional data and the non-key point data.
Path reference point n2 is displaced relative to n1, and the sum of the parameters of the points skipped between n2 and n1 is exactly the relative parameter of n2 with respect to n1, i.e. the non-key-point data generated in the steps above. Since the parameter difference between consecutive path points is equal by design of the data, the accumulated parameters of the skipped points can be used to compute the three-dimensional coordinate of n2. At the next call, the index of each navigation arrow on the total path is advanced: n1 becomes n1 + offset, n2 becomes n1 + s + offset, n3 becomes n2 + s + offset, and n4 becomes n3 + s + offset. The offset is the non-key-point increment; the larger the offset, the faster the navigation arrows move.
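The per-call update rule just quoted can be sketched as follows. One assumption is loudly flagged: the indices here wrap around at the end of the path so the arrows keep flowing, which the patent does not spell out; the function name is also ours.

```python
def advance_arrows(reference_indices: list[int], offset: int,
                   num_path_points: int) -> list[int]:
    """Move every arrow forward by `offset` path points per call; a larger
    offset makes the arrows appear to move faster along the route.
    Wrap-around at the end of the path is an assumption of this sketch."""
    return [(r + offset) % num_path_points for r in reference_indices]

# One animation tick with offset 2 shifts [0, 10, 20, 30] to [2, 12, 22, 32].
next_refs = advance_arrows([0, 10, 20, 30], 2, 40)
```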
And step S3, drawing a three-dimensional graph of the navigation arrow according to the three-dimensional position, and projecting the three-dimensional graph to the head-up display.
This step draws the three-dimensional graphic of the navigation arrow at the computed three-dimensional position. Since the method is based on AR-HUD technology, the generated three-dimensional graphic is finally projected onto the head-up display, i.e. the graphic drawn by the method is projected inside the vehicle.
The path display method based on AR-HUD navigation solves the prior-art problems that a navigation trajectory cannot be generated in real time according to gradient and rotation and that the navigation arrow does not fit the actual road conditions; a plurality of navigation arrow icons are dynamically displayed on a navigation path fitted to the actual road conditions, making the display more intuitive. Meanwhile, the method of computing the navigation data overcomes the prior-art problems that multi-segment paths with different gradients and rotations cannot be computed and that computing the whole path is time-consuming; it can simulate a complex navigation path with little resource consumption and improves running speed.
Second embodiment
FIG. 2 is another flowchart illustrating a method of displaying a route based on AR-HUD navigation according to an exemplary embodiment. Referring to fig. 2, a route display method based on AR-HUD navigation in this embodiment includes the following steps:
and step S1, acquiring distance offset of the navigation path and navigation data of corresponding gradient and rotation degree.
And step S2, generating key points of the navigation path according to the navigation data, and calculating the three-dimensional positions of the key points under the vehicle body coordinates.
And step S3, drawing a three-dimensional graph of the navigation arrow according to the three-dimensional position, and projecting the three-dimensional graph to the head-up display.
And step S4, constructing a view matrix under the vehicle body coordinates, monitoring the movement track of human eyes, and updating the view matrix according to the movement track of human eyes.
Steps S1-S3 are described in detail in the above embodiment and are not repeated here. In step S4, the view matrix constructed in vehicle-body coordinates lets the driver see the direction of the navigation arrow and thus decide to drive in the corresponding direction. While the driver is driving, the driver's eyeball movement is collected in real time, the motion trajectory is monitored, the view matrix is updated according to that trajectory, and the three-dimensional graphics such as the navigation arrow are then updated in real time as the eyes move.
Preferably, step S4 further includes the steps of:
and step S41, constructing a view matrix under the coordinates of the vehicle body through the vehicle-mounted camera. The step is to construct a view matrix, so that subsequent real-time updating is facilitated.
And step S42, acquiring the pupil movement distance of the user and generating the eye movement track. The step is to collect the eye movement, so as to generate the eye movement track.
In step S43, the view matrix is updated in real time according to the eye-movement trajectory collected in the previous step, so that the three-dimensional graphic moves correspondingly as the eyes move. The view matrix is a 4x4 matrix computed with a function of the GLM (OpenGL Mathematics) library from the coordinates of the real-time pupil position in the vehicle coordinate system and the pre-calibrated intrinsic and extrinsic camera parameters; it is updated as the pupil moves along its trajectory.
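A minimal sketch of such a view matrix, written here as a pure-Python equivalent of GLM's `lookAt` rather than a call into GLM itself (the patent names the library but not the exact function; the eye position, target, and up vector below are illustrative). Re-calling `look_at` with each new tracked pupil position is what "updating the view matrix" amounts to.

```python
import math

def look_at(eye, center, up=(0.0, 0.0, 1.0)):
    """Build a 4x4 view matrix for a camera (here: the driver's pupil) at
    `eye` looking toward `center`, in the style of glm::lookAt."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])
    def norm(a):
        length = math.sqrt(dot(a, a))
        return tuple(x / length for x in a)

    f = norm(sub(center, eye))  # forward direction
    s = norm(cross(f, up))      # right direction
    u = cross(s, f)             # true up
    return [
        [ s[0],  s[1],  s[2], -dot(s, eye)],
        [ u[0],  u[1],  u[2], -dot(u, eye)],
        [-f[0], -f[1], -f[2],  dot(f, eye)],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Pupil at the body-frame origin, looking down the +x axis, z up.
view = look_at((0.0, 0.0, 0.0), (1.0, 0.0, 0.0))
```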
In addition to the technical effects shown in the previous embodiment, the path display method based on AR-HUD navigation provided by the invention collects the driver's eye-movement trajectory and updates the view in real time, solving the prior-art problem that the image cannot change with the position of the driver's viewpoint; the navigation path can be computed from the driver's true viewing angle, improving navigation accuracy.
Third embodiment
In addition to the path display method based on AR-HUD navigation, the invention also provides a path display system based on AR-HUD navigation. As shown in fig. 3, the system includes the following modules:
and the data acquisition module 10 is used for acquiring the distance offset of the navigation path and the navigation data of the corresponding gradient and rotation degree.
And the position calculating module 20 is configured to generate a key point of the navigation path according to the navigation data, and calculate a three-dimensional position of the key point under the vehicle body coordinate.
And the graph generating module 30 is configured to draw a three-dimensional graph of the navigation arrow according to the three-dimensional position, and project the three-dimensional graph to the head-up display.
The position calculation module 20 includes the following units:
and an index calculating unit 21, configured to calculate, according to the keypoints, the number of path points, path reference points, and a keypoint index to be drawn.
And the data counting unit 22 is used for counting skipped path points in two adjacent path reference points and generating non-critical point data.
A coordinate generating unit 23 for calculating the three-dimensional coordinates of the path reference point based on the non-key point data.
The relevant unit configured by the system is used for executing the relevant instructions in the route display method based on the AR-HUD navigation, and is not described herein again because it has been described in detail above.
In the path display system based on AR-HUD navigation, the data acquisition module 10 and the position calculation module 20 acquire and compute the navigation data, solving the prior-art problems that a navigation trajectory cannot be generated in real time according to gradient and rotation and that the navigation arrow does not fit the actual road conditions; a plurality of navigation arrow icons dynamically display a navigation path fitted to the actual road conditions, making the display more intuitive. Meanwhile, the system's method of computing the navigation data overcomes the prior-art problems that paths with different gradients and rotations cannot be computed and that computing the whole path is time-consuming; it can simulate a complex navigation path with little resource consumption and improves running speed.
Fourth embodiment
In addition to the AR-HUD-navigation-based path display system described above, the invention provides a further AR-HUD-navigation-based path display system. As shown in fig. 4, this system comprises the following modules:
A data acquisition module 10, configured to acquire the distance offsets of the navigation path and the navigation data of the corresponding gradients and turning angles.
A position calculation module 20, configured to generate key points of the navigation path according to the navigation data and to calculate the three-dimensional positions of those key points in vehicle-body coordinates.
A graphics generation module 30, configured to draw three-dimensional navigation-arrow graphics from the three-dimensional positions and to project them onto the head-up display.
An eye movement update module 40, configured to construct a view matrix in the vehicle-body coordinates, monitor the movement track of the human eyes, and update the view matrix according to that track.
The eye movement update module further comprises the following units:
A matrix construction unit 41, configured to construct the view matrix in the vehicle-body coordinates by means of the on-board camera.
An eye movement detection unit 42, configured to collect the movement distance of the user's pupils and generate the eye movement track.
A view update unit 43, configured to update the view matrix according to the eye movement track.
The units of this system execute the corresponding steps of the AR-HUD-navigation-based path display method; as those steps have been described in detail above, they are not repeated here.
In addition to the technical effects of the previous embodiment, this AR-HUD-navigation-based path display system collects the driver's eye movement track through the eye movement update module 40 and updates the view in real time, as shown in fig. 5. This overcomes the prior-art problem that the image cannot change with the position of the eyes, allows the navigation path to be calculated from the driver's actual viewpoint, and improves navigation accuracy.
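A minimal sketch of the eye movement update (units 41 to 43): a look-at view matrix is rebuilt in body coordinates after shifting the assumed eye position by the measured pupil displacement. The `scale` calibration factor, the 10 m look-ahead target, and all function names are illustrative assumptions, not the patent's construction:

```python
import math

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Build a row-major 4x4 view matrix in vehicle-body coordinates."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
    def norm(a):
        n = math.sqrt(sum(x * x for x in a))
        return tuple(x / n for x in a)
    def dot(a, b): return sum(x * y for x, y in zip(a, b))

    f = norm(sub(target, eye))          # forward axis
    s = norm(cross(f, up))              # side axis
    u = cross(s, f)                     # recomputed up axis
    rows = [s, u, tuple(-x for x in f)]
    # each row carries the translation that maps the eye to the origin
    return [list(r) + [-dot(r, eye)] for r in rows] + [[0.0, 0.0, 0.0, 1.0]]

def update_view(eye, pupil_dx_mm, pupil_dy_mm, scale=0.001):
    """Shift the assumed eye position by the measured pupil displacement
    (mm to m via `scale`, an illustrative calibration factor) and rebuild
    the view matrix toward a point 10 m ahead of the vehicle."""
    new_eye = (eye[0] + pupil_dx_mm * scale,
               eye[1] + pupil_dy_mm * scale,
               eye[2])
    target = (new_eye[0], new_eye[1], new_eye[2] + 10.0)
    return new_eye, look_at(new_eye, target)
```

Each detected pupil displacement thus produces a fresh view matrix, so the projected arrow graphics stay registered to the road from the driver's current viewpoint.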
Fifth embodiment
The invention further provides a computer storage medium. The computer storage medium of this embodiment stores a computer program which, when executed by a processor, implements the steps of any of the AR-HUD-navigation-based path display methods described above. The computer storage medium may be any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
In summary, the foregoing is only a detailed description of preferred embodiments of the present invention and is not intended to limit its scope. In practical applications, a person skilled in the art may make various modifications to the technical solution. Any modification, equivalent replacement, partial application, or the like made on the basis of the principles set forth herein shall fall within the scope of protection of the present invention.

Claims (10)

1. An AR-HUD-navigation-based path display method, characterized in that the method comprises the following steps:
S1, acquiring navigation data of a navigation path, the navigation data comprising distance offsets and the corresponding gradients and turning angles;
S2, generating key points of the navigation path according to the navigation data, and calculating the three-dimensional positions of the key points in vehicle-body coordinates;
S3, drawing three-dimensional graphics of navigation arrows according to the three-dimensional positions, and projecting the three-dimensional graphics onto a head-up display.

2. The AR-HUD-navigation-based path display method according to claim 1, characterized in that step S2 specifically comprises:
S21, generating key points of the navigation path according to the navigation data, and calculating the indices of the key points to be drawn;
S22, counting the points skipped between two adjacent key points to generate non-key-point data;
S23, calculating the three-dimensional coordinates of the key points according to the non-key-point data.

3. The AR-HUD-navigation-based path display method according to claim 2, characterized in that step S21 specifically comprises:
S211, calculating the number of path points according to the navigation path and the key points;
S212, calculating the spacing of the navigation arrows according to the distance between adjacent path points and the number of navigation arrows;
S213, generating path reference points and key point indices from the number of path points and the spacing of the navigation arrows.

4. The AR-HUD-navigation-based path display method according to claim 2, characterized in that step S23 specifically comprises:
S231, generating the three-dimensional coordinates of the previous key point according to the navigation data of the previous key point;
S232, calculating the three-dimensional coordinates of the key point according to the three-dimensional coordinates of the previous key point and the non-key-point data.

5. The AR-HUD-navigation-based path display method according to claim 1, characterized in that the method further comprises S4: constructing a view matrix in the vehicle-body coordinates, monitoring the movement track of the human eyes, and updating the view matrix according to that movement track.

6. The AR-HUD-navigation-based path display method according to claim 5, characterized in that step S4 specifically comprises:
S41, constructing the view matrix in the vehicle-body coordinates by means of an on-board camera;
S42, collecting the movement distance of the user's pupils to generate the eye movement track;
S43, updating the view matrix according to the eye movement track.

7. An AR-HUD-navigation-based path display system, characterized in that the system comprises the following modules:
a data acquisition module, configured to acquire the distance offsets of the navigation path and the navigation data of the corresponding gradients and turning angles;
a position calculation module, configured to generate key points of the navigation path according to the navigation data, and to calculate the three-dimensional positions of the key points in vehicle-body coordinates;
a graphics generation module, configured to draw three-dimensional graphics of navigation arrows according to the three-dimensional positions, and to project the three-dimensional graphics onto a head-up display.

8. The AR-HUD-navigation-based path display system according to claim 7, characterized in that the position calculation module comprises the following units:
an index calculation unit, configured to calculate the number of key points to be drawn and the indices of the key points to be drawn;
a data statistics unit, configured to count the points skipped between two adjacent key points to generate non-key-point data;
a coordinate generation unit, configured to calculate the three-dimensional coordinates of the key points according to the non-key-point data.

9. The AR-HUD-navigation-based path display system according to claim 7, characterized in that the system further comprises an eye movement update module, configured to construct a view matrix in the vehicle-body coordinates, monitor the movement track of the human eyes, and update the view matrix according to that movement track; the eye movement update module specifically comprises the following units:
a matrix construction unit, configured to construct the view matrix in the vehicle-body coordinates by means of an on-board camera;
an eye movement detection unit, configured to collect the movement distance of the user's pupils and generate the eye movement track;
a view update unit, configured to update the view matrix according to the eye movement track.

10. A computer storage medium storing a computer program, characterized in that, when the computer program is executed by a processor, the steps of the method according to any one of claims 1 to 6 are implemented.
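Step S3 of claim 1 (projecting the drawn three-dimensional graphics to the head-up display) amounts to a view-then-perspective transform. The field of view, aspect ratio, and near/far planes below are illustrative assumptions, not values from the patent:

```python
import math

def project_to_hud(point, view, fov_deg=30.0, aspect=3.0, near=0.1, far=100.0):
    """Project a body-frame 3D point to normalized HUD coordinates through a
    row-major 4x4 view matrix and a standard perspective projection."""
    def mul(m, v):
        return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

    f = 1.0 / math.tan(math.radians(fov_deg) / 2.0)
    proj = [[f / aspect, 0.0, 0.0, 0.0],
            [0.0, f, 0.0, 0.0],
            [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
            [0.0, 0.0, -1.0, 0.0]]
    x, y, _, w = mul(proj, mul(view, list(point) + [1.0]))
    return (x / w, y / w)    # normalized device coordinates on the HUD plane
```

A point on the camera axis lands at the center of the HUD; the perspective divide by `w` is what makes distant arrows in the chain appear smaller and converge toward the horizon.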
CN201911378417.0A 2019-12-27 2019-12-27 A route display method, system and computer storage medium based on AR-HUD navigation Active CN111121815B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911378417.0A CN111121815B (en) 2019-12-27 2019-12-27 A route display method, system and computer storage medium based on AR-HUD navigation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911378417.0A CN111121815B (en) 2019-12-27 2019-12-27 A route display method, system and computer storage medium based on AR-HUD navigation

Publications (2)

Publication Number Publication Date
CN111121815A true CN111121815A (en) 2020-05-08
CN111121815B CN111121815B (en) 2023-07-07

Family

ID=70504110

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911378417.0A Active CN111121815B (en) 2019-12-27 2019-12-27 A route display method, system and computer storage medium based on AR-HUD navigation

Country Status (1)

Country Link
CN (1) CN111121815B (en)


Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080033642A1 (en) * 2006-06-30 2008-02-07 Aisin Aw Co., Ltd. Navigation apparatuses, methods, and programs
CN101368827A (en) * 2007-08-16 2009-02-18 北京灵图软件技术有限公司 Communication navigation method, apparatus and communication navigation system
US20110093190A1 (en) * 2008-12-18 2011-04-21 Woong-Cherl Yoon Head-up display navigation device, system and method for implementing services
CN105333883A (en) * 2014-08-07 2016-02-17 深圳点石创新科技有限公司 Navigation path and trajectory displaying method and apparatus for head-up display (HUD)
CN106448206A (en) * 2016-11-08 2017-02-22 厦门盈趣科技股份有限公司 Pavement aided navigation system based on Internet of vehicles
DE102016203080A1 (en) * 2016-02-26 2017-08-31 Robert Bosch Gmbh Method for operating a head-up display, head-up display device
CN107228681A (en) * 2017-06-26 2017-10-03 上海驾馥电子科技有限公司 A kind of navigation system for strengthening navigation feature by camera
CN108180921A (en) * 2017-12-22 2018-06-19 联创汽车电子有限公司 Utilize the AR-HUD navigation system and its air navigation aid of GPS data
CN108896066A (en) * 2018-03-23 2018-11-27 江苏泽景汽车电子股份有限公司 A kind of augmented reality head up display and its navigation implementation method
CN108981740A (en) * 2018-06-11 2018-12-11 同济大学 Blind under the conditions of a kind of low visibility drives navigation system and its method
CN109143305A (en) * 2018-09-30 2019-01-04 百度在线网络技术(北京)有限公司 Automobile navigation method and device
CN109462750A (en) * 2018-12-29 2019-03-12 上海玮舟微电子科技有限公司 A kind of head-up-display system, information display method, device and medium
CN109525039A (en) * 2018-11-29 2019-03-26 国网新疆电力有限公司昌吉供电公司 A kind of distribution network operation monitoring method and system
WO2019057452A1 (en) * 2017-09-21 2019-03-28 Volkswagen Aktiengesellschaft METHOD, DEVICE AND COMPUTER-READABLE STORAGE MEDIUM WITH INSTRUCTIONS FOR CONTROLLING AN INDICATOR TO AN AUGMENTED REALITY HEAD-UP DISPLAY DEVICE FOR A MOTOR VEHICLE
WO2019097755A1 (en) * 2017-11-17 2019-05-23 アイシン・エィ・ダブリュ株式会社 Display device and computer program
CN109883439A (en) * 2019-03-22 2019-06-14 百度在线网络技术(北京)有限公司 A kind of automobile navigation method, device, electronic equipment and storage medium
CN109974734A (en) * 2019-04-02 2019-07-05 百度在线网络技术(北京)有限公司 A kind of event report method, device, terminal and storage medium for AR navigation
CN109990797A (en) * 2017-12-29 2019-07-09 周秦娜 A kind of control method of the augmented reality navigation display for HUD
DE102019000901A1 (en) * 2019-02-07 2019-07-25 Daimler Ag Method for displaying navigation instructions in a head-up display of a Krafftfahrzeugs and computer program product
CN110136519A (en) * 2019-04-17 2019-08-16 百度在线网络技术(北京)有限公司 Simulation system and method based on ARHUD navigation
CN110516880A (en) * 2019-08-29 2019-11-29 广州小鹏汽车科技有限公司 Path processing method and system and vehicle


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CHANGRAK YOON: "Development of augmented in-vehicle navigation system for Head-Up Display" *
Tian Jingyi: "Research and Application of a Navigation Method Based on Augmented Reality Technology" *
Lu Yunfei: "Research on a Head-Up Display System Based on Three-Dimensional Gaze Tracking" *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112242009A (en) * 2020-10-19 2021-01-19 浙江水晶光电科技股份有限公司 Display effect fusion method, system, storage medium and main control unit
CN112738487A (en) * 2020-12-24 2021-04-30 北京百度网讯科技有限公司 Image projection method, device, device and storage medium
CN112738487B (en) * 2020-12-24 2022-10-11 阿波罗智联(北京)科技有限公司 Image projection method, device, equipment and storage medium
US11715238B2 (en) 2020-12-24 2023-08-01 Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. Image projection method, apparatus, device and storage medium
CN113326758A (en) * 2021-05-25 2021-08-31 青岛慧拓智能机器有限公司 Head-up display technology for remotely controlling driving monitoring video
CN114518117A (en) * 2022-02-24 2022-05-20 北京百度网讯科技有限公司 Navigation method, navigation device, electronic equipment and medium
CN115406462A (en) * 2022-08-31 2022-11-29 重庆长安汽车股份有限公司 Navigation and live-action fusion method and device, electronic equipment and storage medium
CN115683152A (en) * 2022-10-27 2023-02-03 长城汽车股份有限公司 Vehicle navigation guiding method and device based on coordinate transformation and electronic equipment
CN115683152B (en) * 2022-10-27 2024-12-20 长城汽车股份有限公司 Vehicle navigation guidance method, device and electronic equipment based on coordinate conversion
CN116105747A (en) * 2023-04-07 2023-05-12 江苏泽景汽车电子股份有限公司 Dynamic display method for navigation path, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN111121815B (en) 2023-07-07

Similar Documents

Publication Publication Date Title
CN111121815B (en) A route display method, system and computer storage medium based on AR-HUD navigation
US11518413B2 (en) Navigation of autonomous vehicles using turn aware machine learning based models for prediction of behavior of a traffic entity
CN111797187B (en) Map data update method, device, electronic device and storage medium
US11656091B2 (en) Content visualizing method and apparatus
US11919545B2 (en) Scenario identification for validation and training of machine learning based models for autonomous vehicles
Chao et al. A survey on visual traffic simulation: Models, evaluations, and applications in autonomous driving
US12051235B2 (en) Machine learning a feature detector using synthetic training data
US10453256B2 (en) Lane boundary detection data generation in virtual environment
US20220065651A1 (en) Method, apparatus, and system for generating virtual markers for journey activities
JP6644742B2 (en) Algorithms and infrastructure for robust and efficient vehicle positioning
CN114518122B (en) Driving navigation method, device, computer equipment, storage medium and computer program product
US20170109458A1 (en) Testbed for lane boundary detection in virtual driving environment
US11014577B2 (en) Method and apparatus for presenting a feedforward cue in a user interface before an upcoming vehicle event occurs
WO2020257723A1 (en) Lidar-based detection of traffic signs for navigation of autonomous vehicles
CN110796856A (en) Vehicle Lane Change Intention Prediction Method and Lane Change Intention Prediction Network Training Method
CN112204343A (en) Visualization of high definition map data
WO2021003487A1 (en) Training data generation for dynamic objects using high definition map data
US20230039738A1 (en) Method and apparatus for assessing traffic impact caused by individual driving behaviors
US10801844B2 (en) Remediating dissimilarities between digital maps and ground truth data via dissimilarity threshold tuning
CN115406462A (en) Navigation and live-action fusion method and device, electronic equipment and storage medium
Li et al. A comparative study of two wayfinding aids for simulated driving tasks–single-scale and dual-scale GPS aids
WO2023211712A1 (en) Generating training data for machine learning based models for autonomous vehicles
Wang Virtual Reality Simulated Augmented Reality Display on Windshields: Improving the Spatial Awareness of Autonomous Car Drivers
HK40068482B (en) Method and apparatus for driving navigation, computer device, storage medium and computer program product
HK40068482A (en) Method and apparatus for driving navigation, computer device, storage medium and computer program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Zeng Fanhua

Inventor after: Wu Yuehong

Inventor after: Sun Xinran

Inventor after: Li Wanchao

Inventor after: Kuang Can

Inventor before: Sun Xinran

Inventor before: Li Wanchao

Inventor before: Kuang Can

TA01 Transfer of patent application right

Effective date of registration: 20230606

Address after: 401147 Building 5-1 #, No. 24, Changhui Road, Yuzui Town, Liangjiang New Area, Chongqing

Applicant after: Chongqing Lilong Zhongbao Intelligent Technology Co.,Ltd.

Address before: 404100 No.4 diance village, Jiangbei District, Chongqing

Applicant before: Chongqing Lilong technology industry (Group) Co.,Ltd.

GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A path display method, system, and computer storage medium based on AR-HUD navigation

Granted publication date: 20230707

Pledgee: Societe Generale Limited by Share Ltd. Chongqing branch

Pledgor: Chongqing Lilong Zhongbao Intelligent Technology Co.,Ltd.

Registration number: Y2024500000002

PC01 Cancellation of the registration of the contract for pledge of patent right

Granted publication date: 20230707

Pledgee: Societe Generale Limited by Share Ltd. Chongqing branch

Pledgor: Chongqing Lilong Zhongbao Intelligent Technology Co.,Ltd.

Registration number: Y2024500000002

PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A path display method, system, and computer storage medium based on AR-HUD navigation

Granted publication date: 20230707

Pledgee: Societe Generale Limited by Share Ltd. Chongqing branch

Pledgor: Chongqing Lilong Zhongbao Intelligent Technology Co.,Ltd.

Registration number: Y2025500000082
