

United States Patent No.

10088553

Inventors

Zeng et al.

Date of Patent

October 2, 2018


Method of automatic sensor pose estimation



ABSTRACT

A method and sensor system are disclosed for automatically determining object sensor position and alignment on a host vehicle. A radar sensor detects objects surrounding the host vehicle in normal operation. Static objects are identified as those objects with ground speed approximately equal to zero. Vehicle dynamics sensors provide vehicle longitudinal and lateral velocity and yaw rate data. Measurement data for the static objects--including azimuth angle, range and range rate relative to the sensor--along with the vehicle dynamics data, are used in a recursive geometric calculation which converges on actual values of the radar sensor's two-dimensional position and azimuth alignment angle on the host vehicle.


Inventors:

Shuqing Zeng (Sterling Heights, MI), Xian Zhang (Wixom, MI), Xiaofeng F. Song (Novi, MI)

Assignee:

GM GLOBAL TECHNOLOGY OPERATIONS LLC (Detroit, MI, US)

Applicant:

GM GLOBAL TECHNOLOGY OPERATIONS LLC (Detroit, MI)

Family ID

59700546

Application No.:

15/069,547

Filed:

March 14, 2016

Prior Publication Data

Document Identifier: US 20170261599 A1
Publication Date: Sep 14, 2017

Current U.S. Class:

1/1

Current CPC Class:

B60W 40/105 (20130101); G01S 7/4004 (20130101); G01S 13/08 (20130101); G01S 7/4026 (20130101); G01S 13/87 (20130101); G01S 13/931 (20130101); G01S 13/588 (20130101); B60W 2520/10 (20130101); B60W 2520/12 (20130101); B60W 2520/14 (20130101); G01S 2013/9378 (20130101); G01S 2007/4091 (20130101); G01S 2013/9375 (20130101)

Current International Class (IPC):

G01S 7/40 (20060101); B60W 40/105 (20120101); G01S 13/08 (20060101); G01S 13/58 (20060101)

References Cited


U.S. Patent Documents

5,964,822       October 1999    Alland
7,813,851       October 2010    DeMersseman
2007/0182623    August 2007     Zeng
2010/0017128    January 2010    Zeng
2011/0153268    June 2011       Jordan
2012/0290169    November 2012   Zeng
2013/0218398    August 2013     Gandhi
2015/0276923    October 2015    Song et al.
Primary Examiner: Garber; Charles
Assistant Examiner: Sabur; Alia
Attorney, Agent or Firm: Cantor Colburn LLP


CLAIMS



What is claimed is:

1. A method for estimating a pose of an object detection sensor on a host vehicle, said method comprising: providing vehicle dynamics data from a vehicle dynamics module on the host vehicle to a processor, where the vehicle dynamics data includes a longitudinal velocity, a lateral velocity and a yaw rate of the host vehicle; providing object data from an object sensor to the processor, where the object data includes a range, a range rate and an azimuth angle of each detected object relative to the object sensor; identifying, using the processor, static objects from the object data; calculating, using the processor, a longitudinal position and a lateral position of the object sensor on the host vehicle based on a recursive calculation using position and motion of the static objects relative to the object sensor; calculating an azimuth orientation of the object sensor on the host vehicle based on a recursive calculation using position and motion of the static objects relative to the object sensor, including using an equation defining the range rate of each static object in terms of a longitudinal component and a lateral component, writing the equation in a form which isolates the sine and cosine of the azimuth orientation of the sensor in an unknown vector, and solving for the unknown vector in a quadratic constrained least squares calculation using static object measurements over a period of time; and compensating for the calculated position and azimuth orientation of the object sensor before using the object data in other applications on the host vehicle.

2. The method of claim 1 wherein the static objects are identified as detected objects with a ground speed substantially equal to zero, where a substantially zero ground speed is defined as a ground speed below a predetermined threshold.

3. The method of claim 1 wherein calculating a longitudinal position and a lateral position of the object sensor includes using an equation defining the range rate of each static object in terms of a longitudinal component and a lateral component, writing the equation in a form which isolates the longitudinal position and the lateral position of the sensor in an unknown vector, and solving for the unknown vector in a regression calculation using static object measurements over a period of time.

4. The method of claim 3 wherein the equation is: $-\dot r = (V_x - b\omega)\cos(\theta + \alpha) + (V_y + a\omega)\sin(\theta + \alpha)$, where $\dot r$ is the range rate of the static object, $V_x$ is the longitudinal velocity of the host vehicle, $V_y$ is the lateral velocity of the host vehicle, $\omega$ is the yaw rate of the host vehicle, $\theta$ is the azimuth angle of the static object, $\alpha$ is the azimuth orientation of the sensor on the host vehicle, $a$ is the longitudinal position of the sensor on the host vehicle and $b$ is the lateral position of the sensor on the host vehicle.

5. The method of claim 1 wherein the equation is: $-\dot r = [(V_x - b\omega)\cos\theta + (V_y + a\omega)\sin\theta]\cos\alpha + [-(V_x - b\omega)\sin\theta + (V_y + a\omega)\cos\theta]\sin\alpha$, where $\dot r$ is the range rate of the static object, $V_x$ is the longitudinal velocity of the host vehicle, $V_y$ is the lateral velocity of the host vehicle, $\omega$ is the yaw rate of the host vehicle, $\theta$ is the azimuth angle of the static object, $\alpha$ is the azimuth orientation of the sensor on the host vehicle, $a$ is the longitudinal position of the sensor on the host vehicle and $b$ is the lateral position of the sensor on the host vehicle.

6. The method of claim 1 wherein the object sensor is a short-range or long-range radar sensor.

7. The method of claim 1 further comprising providing a notification to a vehicle driver or a service center if the calculated position or azimuth orientation of the object sensor is outside of an acceptable pose range.

8. A system for estimating a pose of an object detection sensor on a host vehicle, said system comprising: a vehicle dynamics module which receives data from a plurality of vehicle dynamics sensors and a vehicle data bus, said vehicle dynamics module providing vehicle dynamics data for the host vehicle; one or more object sensors mounted to the host vehicle, where the object sensors detect objects in a vicinity around the host vehicle; and a processor in communication with the vehicle dynamics module and the one or more object sensors, said processor being configured to learn a position and azimuth orientation of the one or more object sensors on the host vehicle using a calculation based on position and motion of static objects detected by the object sensors and the vehicle dynamics data, wherein the processor is configured with an algorithm performing, for each of the one or more object sensors, steps of: receiving the vehicle dynamics data from the vehicle dynamics module, where the vehicle dynamics data includes a longitudinal velocity, a lateral velocity and a yaw rate of the host vehicle; receiving object data from the object sensor, where the object data includes a range, a range rate and an azimuth angle of each detected object relative to the object sensor; identifying static objects from the object data; calculating a longitudinal position and a lateral position of the object sensor on the host vehicle based on a recursive calculation using position and motion of the static objects relative to the object sensor; calculating an azimuth orientation of the object sensor on the host vehicle based on a recursive calculation using position and motion of the static objects relative to the object sensor, including using an equation defining the range rate of each static object in terms of a longitudinal component and a lateral component, writing the equation in a form which isolates the sine and cosine of the azimuth orientation of the sensor in an unknown vector, and solving for the unknown vector in a quadratic constrained least squares calculation using static object measurements over a period of time; and compensating for the learned position and azimuth orientation of the object sensor before using the object data in other applications on the host vehicle.

9. The system of claim 8 wherein the static objects are identified as detected objects with a ground speed substantially equal to zero, where a substantially zero ground speed is defined as a ground speed below a predetermined threshold.

10. The system of claim 8 wherein calculating a longitudinal position and a lateral position of the object sensor includes using an equation defining the range rate of each static object in terms of a longitudinal component and a lateral component, writing the equation in a form which isolates the longitudinal position and the lateral position of the sensor in an unknown vector, and solving for the unknown vector in a regression calculation using static object measurements over a period of time.

11. The system of claim 10 wherein the equation is: $-\dot r = (V_x - b\omega)\cos(\theta + \alpha) + (V_y + a\omega)\sin(\theta + \alpha)$, where $\dot r$ is the range rate of the static object, $V_x$ is the longitudinal velocity of the host vehicle, $V_y$ is the lateral velocity of the host vehicle, $\omega$ is the yaw rate of the host vehicle, $\theta$ is the azimuth angle of the static object, $\alpha$ is the azimuth orientation of the sensor on the host vehicle, $a$ is the longitudinal position of the sensor on the host vehicle and $b$ is the lateral position of the sensor on the host vehicle.

12. The system of claim 8 wherein the equation is: $-\dot r = [(V_x - b\omega)\cos\theta + (V_y + a\omega)\sin\theta]\cos\alpha + [-(V_x - b\omega)\sin\theta + (V_y + a\omega)\cos\theta]\sin\alpha$, where $\dot r$ is the range rate of the static object, $V_x$ is the longitudinal velocity of the host vehicle, $V_y$ is the lateral velocity of the host vehicle, $\omega$ is the yaw rate of the host vehicle, $\theta$ is the azimuth angle of the static object, $\alpha$ is the azimuth orientation of the sensor on the host vehicle, $a$ is the longitudinal position of the sensor on the host vehicle and $b$ is the lateral position of the sensor on the host vehicle.

13. The system of claim 8 wherein the one or more object sensors are short-range or long-range radar sensors.

14. A smart sensor which continuously learns its pose on a host vehicle while detecting objects, said smart sensor comprising: a measurement core including a radar transceiver which detects objects in a vicinity around the host vehicle; and a mounting pose estimator including a processor in communication with the measurement core and a vehicle dynamics module on the host vehicle, said processor being configured to learn a position and azimuth orientation of the smart sensor on the host vehicle using a calculation based on position and motion of static objects detected by the measurement core and vehicle dynamics data from the vehicle dynamics module, wherein the processor is configured with an algorithm performing steps of: receiving the vehicle dynamics data from the vehicle dynamics module, where the vehicle dynamics data includes a longitudinal velocity, a lateral velocity and a yaw rate of the host vehicle; receiving object data from the measurement core, where the object data includes a range, a range rate and an azimuth angle of each detected object relative to the smart sensor; identifying static objects from the object data; calculating a longitudinal position and a lateral position of the smart sensor on the host vehicle based on a recursive calculation using position and motion of the static objects relative to the smart sensor; calculating an azimuth orientation of the smart sensor on the host vehicle based on a recursive calculation using position and motion of the static objects relative to the smart sensor, including using an equation defining the range rate of each static object in terms of a longitudinal component and a lateral component, writing the equation in a form which isolates the sine and cosine of the azimuth orientation of the smart sensor in an unknown vector, and solving for the unknown vector in a quadratic constrained least squares calculation using static object measurements over a period of time; and compensating for the learned position and azimuth orientation of the smart sensor before providing the object data for use in other applications on the host vehicle.

15. The smart sensor of claim 14 wherein calculating a longitudinal position and a lateral position of the smart sensor includes using an equation defining the range rate of each static object in terms of a longitudinal component and a lateral component, writing the equation in a form which isolates the longitudinal position and the lateral position of the smart sensor in an unknown vector, and solving for the unknown vector in a regression calculation using static object measurements over a period of time.

16. A method for estimating a pose of an object detection sensor on a host vehicle, said method comprising: providing vehicle dynamics data from a vehicle dynamics module on the host vehicle to a processor, where the vehicle dynamics data includes a longitudinal velocity, a lateral velocity and a yaw rate of the host vehicle; providing object data from an object sensor to the processor, where the object data includes a range, a range rate and an azimuth angle of each detected object relative to the object sensor; identifying, using the processor, static objects from the object data; calculating, using the processor, a longitudinal position and a lateral position of the object sensor on the host vehicle based on a recursive calculation using position and motion of the static objects relative to the object sensor, including using an equation defining the range rate of each static object in terms of a longitudinal component and a lateral component, writing the equation in a form which isolates the longitudinal position and the lateral position of the sensor in an unknown vector, and solving for the unknown vector in a regression calculation using static object measurements over a period of time, wherein the equation is: $-\dot r = (V_x - b\omega)\cos(\theta + \alpha) + (V_y + a\omega)\sin(\theta + \alpha)$, where $\dot r$ is the range rate of the static object, $V_x$ is the longitudinal velocity of the host vehicle, $V_y$ is the lateral velocity of the host vehicle, $\omega$ is the yaw rate of the host vehicle, $\theta$ is the azimuth angle of the static object, $\alpha$ is the azimuth orientation of the sensor on the host vehicle, $a$ is the longitudinal position of the sensor on the host vehicle and $b$ is the lateral position of the sensor on the host vehicle; calculating an azimuth orientation of the object sensor on the host vehicle based on a recursive calculation using position and motion of the static objects relative to the object sensor; and compensating for the calculated position and azimuth orientation of the object sensor before using the object data in other applications on the host vehicle.


DESCRIPTION




BACKGROUND OF THE INVENTION



1. Field of the Invention

This invention relates generally to object detection sensors on vehicles and, more particularly, to a method for automatically determining an object sensor's pose--including its position and mounting angle on a host vehicle--by using vehicle dynamics data and detected static object data in a recursive calculation.

2. Discussion of the Related Art

Many modern vehicles include object detection sensors, which are used to enable collision warning or avoidance and other active safety applications. The object detection sensors may use any of a number of detection technologies--including short range or long range radar, cameras with image processing, laser or LIDAR, and ultrasound, for example. The object detection sensors detect vehicles and other objects in the path of the host vehicle, and the application software uses the object detection information to issue warnings or take actions as appropriate.

In order for the application software to perform optimally, the object detection sensors must be aligned properly with the vehicle. For example, if a sensor detects an object that is actually in the path of the host vehicle but, due to sensor misalignment, the sensor determines that the object is slightly to the left of the path of the host vehicle, this can have significant consequences for the application software. Similarly, an object sensor's true position on the host vehicle is also important in object detection calculations. Even if there are multiple object detection sensors on a vehicle, it is important that their positions and alignments are known, so as to minimize or eliminate conflicting or inaccurate sensor readings.

In many vehicles, the object detection sensors are integrated directly into the front or rear fascia of the vehicle. This type of installation is simple, effective, and aesthetically pleasing, but it has the disadvantage that there is no practical way to physically adjust the position or alignment of the sensors. Thus, if a sensor becomes misaligned with the vehicle's true heading, due to damage to the fascia or age- and weather-related warping, there has traditionally been no way to correct the misalignment, other than to replace the entire fascia assembly containing the sensors. In other situations, an object sensor may be placed on a vehicle without an accurate determination of the sensor's location and orientation, thus leading to uncertainty in object detection calculations.


SUMMARY OF THE INVENTION



In accordance with the teachings of the present invention, a method and sensor system are disclosed for automatically determining object sensor position and alignment on a vehicle. A radar sensor detects objects surrounding a host vehicle in normal operation. Static objects are identified as those objects with ground speed approximately equal to zero. Vehicle dynamics sensors provide vehicle longitudinal and lateral velocity and yaw rate data. Measurement data for the static objects--including azimuth angle, range and range rate relative to the sensor--along with the vehicle dynamics data, are used in a recursive geometric calculation which converges on actual values of the radar sensor's two-dimensional position and azimuth alignment angle on the host vehicle.

Additional features of the present invention will become apparent from the following description and appended claims, taken in conjunction with the accompanying drawings.


BRIEF DESCRIPTION OF THE DRAWINGS



FIG. 1 is a top-view illustration of a vehicle, including several object detection sensors, vehicle dynamics sensors and a processor for communicating therebetween;

FIG. 2 is a top-view illustration of a host vehicle with a radar sensor detecting a static object and showing the geometric relationships used in a calculation of the radar sensor's pose on the host vehicle;

FIG. 3 is a schematic diagram of a smart sensor which, given vehicle dynamics data as input, can learn its pose on the host vehicle; and

FIG. 4 is a flowchart diagram of a method for automatic estimation of a radar sensor's pose on the host vehicle.


DETAILED DESCRIPTION OF THE EMBODIMENTS



The following discussion of the embodiments of the invention directed to automatic object sensor pose estimation is merely exemplary in nature, and is in no way intended to limit the invention or its applications or uses.

Object detection sensors have become commonplace in modern vehicles. Such sensors are used to detect objects which are in or near a vehicle's driving path--forward, rearward or to the side. Many vehicles now integrate object detection sensors into exterior body trim panels in a way that precludes mechanical adjustment of the sensors. A method and system are disclosed herein for automatically calibrating sensor position and alignment in software, thus ensuring accurate sensor readings with no need for mechanical adjustment of the sensors.

FIG. 1 is a top-view illustration of a vehicle 10, including several sensors which can be used for object detection, lane keeping, and other active safety applications. The vehicle 10 includes a processor 12--which may be a dedicated object detection processor, or a processor shared with other vehicle systems and applications. The processor 12 is in communication with a vehicle dynamics module 20 and an object detection module 30. The vehicle dynamics module 20 includes a plurality of vehicle dynamics sensors--such as longitudinal and lateral accelerometers and a yaw rate sensor--and receives data such as vehicle speed from a communications bus. Using the vehicle dynamics sensors and other available data, the vehicle dynamics module 20 continuously computes vehicle dynamics data--including, at a minimum, vehicle longitudinal and lateral velocity, and yaw rate.

The object detection module 30 communicates with one or more object detection sensors. The vehicle 10 is shown with a front center object sensor 32, a rear center object sensor 34, front corner (L and R) object sensors 36 and rear corner (L and R) object sensors 38. The vehicle 10 may include more or fewer object sensors--including additional locations (not shown) such as left and right side sensors (mounted in side view mirrors or door panels, for example). Some or all of the object sensors 32-38 are commonly integrated into a bumper fascia or other body panel of the vehicle 10. The functions of the object detection module 30 may be integrated with the processor 12.

The object sensors 32-38 may be used for detecting objects which define lane or roadway boundaries--such as curbs, guard rails and median walls. The object sensors 32-38 may also be used for detecting other static objects surrounding the roadway--such as trees, light poles, mail boxes and signs. In addition, the object sensors 32-38 are used to detect moving objects in proximity to the vehicle 10--such as other vehicles, pedestrians, bicycles, wildlife, etc. Many different types of object sensor technology are used on modern vehicles--including radar, light detection and ranging (LiDAR), ultrasound, etc. For the purposes of the invention disclosed herein, the object sensors 32-38 can be considered to be radar sensors or any other sensor technology which provides both range and range rate of target objects. The object sensors 32-38 may include long range radar (LRR) and short range radar (SRR) sensors.

Control module, module, control, controller, control unit, processor and similar terms mean any suitable one or various combinations of one or more of Application Specific Integrated Circuit(s) (ASIC), electronic circuit(s), central processing unit(s) (preferably microprocessor(s)) and associated memory and storage (read only, programmable read only, random access, hard drive, etc.) executing one or more software or firmware programs, combinatorial logic circuit(s), input/output circuit(s) and devices, appropriate signal conditioning and buffer circuitry, and other suitable components to provide the described functionality. The processor 12, the vehicle dynamics module 20 and the object detection module 30--which may be separate devices or a single device--are programmed with a set of computational and control algorithms, including resident software program instructions and calibrations stored in memory and executed to provide the desired functions. The algorithms may be executed during preset loop cycles, or in response to occurrence of an event. Algorithms are executed, such as by a central processing unit, and are operable to monitor inputs from sensing devices and other networked control modules, execute diagnostic routines, and control operation of other vehicle systems--such as steering, brakes, display and warning devices, etc.

Regardless of how many object sensors are provided on the vehicle 10, it is important that the pose (position and azimuth orientation) of the sensors is accurately known. Sensor position and orientation are important both for object data used in lane keeping applications (such as curbs and guard rails) and for object data used in collision avoidance and other autonomous driving applications (such as other vehicles). The invention discussed below provides a technique for automatically determining the position and orientation of the object sensors 32-38 on the vehicle 10 using data which is readily available.

FIG. 2 is a top-view illustration of a host vehicle 50 with a left front radar sensor 70 detecting a static object 90 and showing the geometric relationships used in a calculation of the radar sensor's pose on the host vehicle 50. The host vehicle 50 is representative of the vehicle 10 of FIG. 1, and the radar sensor 70 is representative of the left front object sensor 36 of FIG. 1. The calculations discussed below, relative to FIG. 2, are equally applicable to any of the object sensors shown on the vehicle 10, or an object sensor located at any position and orientation on the vehicle 50--including straight forward, straight rearward, directly to the left or right side, or at any vehicle corner.

The vehicle 50 has a center of gravity 52 and a local coordinate system 54 designated as $(X', Y')$. The vehicle 50 has a longitudinal velocity $V_x$, denoted by reference numeral 56, and a lateral velocity $V_y$, denoted by reference numeral 58. The vehicle 50 also has a yaw rate $\omega$ as shown. The values of $V_x$, $V_y$ and $\omega$ are determined and provided by the vehicle dynamics module 20 as described relative to FIG. 1.

The radar sensor 70 is located at a position $(a, b)$ on the host vehicle 50, where $a$ is the longitudinal distance forward of the center of gravity 52, designated by reference numeral 72, and $b$ is the lateral distance to the left of the center of gravity 52, designated by reference numeral 74. The radar sensor 70 has a local coordinate system defined by X-axis 76 and Y-axis 78, where the radar sensor has an azimuth orientation angle $\alpha$ defined as a counterclockwise rotation of the radar sensor's reference frame relative to the host vehicle's local coordinate system 54. All directional conventions described here are as viewed from above.

Only static objects are used in the pose estimation calculations described below. This is because moving objects, such as other vehicles, have a ground speed which is unknown to the host vehicle 50, and may be changing. Of course, the ground speed of any detected object may be calculated using data from the radar sensor 70, but this object velocity calculation adds extra variables into the computation. By using only static objects, the number of unknowns involved in the sensor pose estimation calculations is reduced such that the pose $(a, b, \alpha)$ of the radar sensor 70 can be determined through recursive computations over a number of measurement cycles.

Before performing the ongoing recursive pose estimation calculation using each new set of sensor measurement data, an object filtering step is performed to identify the set of static objects. The absolute velocity or ground speed of any object detected by the radar sensor 70 is calculated, using measurement data from the radar sensor 70 and nominal position and orientation values for the radar sensor 70, as would be understood by those skilled in the art. Static objects are identified as those objects having a ground speed of zero or very close to zero, and only these static objects are used in the sensor pose estimation calculations. Although only one static object 90 is shown in FIG. 2 and discussed in the calculations below, it is to be understood that the fidelity of the calculations is improved by using several static objects as would commonly be present in many driving scenarios.
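As one concrete illustration of this filtering step, the Python sketch below compares the measured range rate of a detection against the range rate a truly static object would produce under the nominal sensor pose, using the range-rate relationship developed below in Equation (3); the residual approximates the radial component of the object's ground speed, so detections with a residual below a small threshold are treated as static. The function name, argument layout and threshold value are illustrative assumptions, not the patent's exact procedure.

```python
import math

def is_static(r_dot, theta, vx, vy, omega,
              a_nom=0.0, b_nom=0.0, alpha_nom=0.0, tol=0.05):
    """Screen one radar detection for use in pose estimation.

    Compares the measured range rate `r_dot` against the range rate a truly
    static object would produce (Eq. (3)) given the nominal sensor pose
    (a_nom, b_nom, alpha_nom) and the vehicle dynamics (vx, vy, omega).
    `tol` is a ground-speed threshold in m/s. All names are illustrative.
    """
    predicted_static_r_dot = -((vx - b_nom * omega) * math.cos(theta + alpha_nom)
                               + (vy + a_nom * omega) * math.sin(theta + alpha_nom))
    # The residual is approximately the radial component of the object's ground speed.
    return abs(r_dot - predicted_static_r_dot) <= tol
```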

The static object 90 has a position defined by a range $r$ (denoted by reference numeral 92) and an azimuth angle $\theta$, both of which are measured by the radar sensor 70. The angle $\theta$ is defined as the angle between the positive X-axis of the radar sensor's local coordinate system and the vector from the radar sensor 70 to the static object 90, as shown. The static object 90 also has a range rate $\dot r$ (denoted by reference numeral 94), which is also measured by the radar sensor 70. The range rate $\dot r$ can be resolved into a pair of orthogonal vectors 96 and 98, where the vector 96 is the apparent longitudinal velocity of the static object 90 relative to the host vehicle 50 and the vector 98 is the apparent lateral velocity of the static object 90 relative to the host vehicle 50.

Using basic geometry and kinematics, the velocity vector 96 can be written as: $V_{96} = V_x - b\omega$ (1) and the velocity vector 98 can be written as: $V_{98} = V_y + a\omega$ (2), where all variables in Equations (1) and (2) have been defined above.

From the geometric relationships defined in Equations (1) and (2), a pair of calculations can be performed recursively upon arrival of each new set of sensor measurement data. In the first calculation, the azimuth orientation angle $\alpha$ is assumed to be known (from a default setting, or from a previous cycle of the recursive calculation), and the position values $a$ and $b$ are calculated. In the second calculation, the position values $a$ and $b$ are assumed to be known (from a default setting, or from a previous cycle of the recursive calculation), and the azimuth orientation angle $\alpha$ is calculated. Over a period of time (nominally one minute to a few minutes), with measurement data arriving several times per second, these calculations converge to yield the actual values of the sensor pose $(a, b, \alpha)$.

The first calculation, where the azimuth orientation angle $\alpha$ is assumed to be known and the position values $a$ and $b$ are calculated, can be set up as follows. Equations (1) and (2) and their geometric relationship to the range rate $\dot r$ yield the following: $-\dot r = (V_x - b\omega)\cos(\theta + \alpha) + (V_y + a\omega)\sin(\theta + \alpha)$ (3), where all variables in Equation (3) have been defined above.
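For readers tracing the geometry of FIG. 2, the following short derivation (a paraphrase, not text taken from the patent) shows how Equation (3) follows from Equations (1) and (2): the sensor's velocity over ground, expressed in the vehicle frame, has components $(V_x - b\omega,\ V_y + a\omega)$; a static object's velocity relative to the sensor is its negative; and the range rate is that relative velocity projected onto the line of sight, which lies at angle $\theta + \alpha$ in the vehicle frame.

```latex
% Sketch of the reasoning behind Eq. (3), using only quantities defined above.
\dot r
  \;=\;
  \begin{bmatrix} -(V_x - b\omega) \\ -(V_y + a\omega) \end{bmatrix}^{\!\top}
  \begin{bmatrix} \cos(\theta + \alpha) \\ \sin(\theta + \alpha) \end{bmatrix}
  \;=\;
  -(V_x - b\omega)\cos(\theta + \alpha) \;-\; (V_y + a\omega)\sin(\theta + \alpha),
```

which is Equation (3) after multiplying both sides by $-1$.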

Equation (3) can be rewritten as:

$\big[\,\omega\sin(\theta + \alpha)\quad -\omega\cos(\theta + \alpha)\,\big]\begin{bmatrix} a \\ b \end{bmatrix} = -\dot r - V_x\cos(\theta + \alpha) - V_y\sin(\theta + \alpha)$ (4), where all variables in Equation (4) have been defined above, and only the position values $a$ and $b$ are unknown. Equation (4) is advantageously written to separate the unknowns $a$ and $b$ into a vector which can be obtained using regression calculations over a number of measurement cycles.

To set up the regression calculation, it is helpful to define the following:

$Y = \begin{bmatrix} -\dot r_1 - V_x\cos(\theta_1 + \alpha) - V_y\sin(\theta_1 + \alpha) \\ \vdots \\ -\dot r_n - V_x\cos(\theta_n + \alpha) - V_y\sin(\theta_n + \alpha) \end{bmatrix}$ (5), where $\dot r_i$ and $\theta_i$ are the values of range rate and azimuth angle of the static object 90 for a sensor measurement cycle $i$, $Y$ is a vector of length $n$, and $n$ is the number of sensor measurement cycles used in the calculation. The value of $n$ can be chosen as appropriate to achieve convergence of the calculation, while not being so large as to make the computations overly complex. The value of $n$ may be in the thousands, covering 10-20 minutes (more or less) of object sensor data, and in one embodiment a value of $n = 6000$ (10 minutes at 10 sensor measurements per second) was shown to yield good results. The value of $n$ is a configurable parameter which can be selected to achieve the best results in a given implementation. The sensor measurements for each cycle $i$ are stored in a fixed-length buffer (of length $n$), such that the oldest measurement cycle (the one that is, say, 10 minutes old) drops out of the buffer when a new measurement cycle is received.
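A fixed-length buffer with this drop-oldest behavior can be realized, for example, with a bounded double-ended queue. The sketch below is illustrative only; the field layout and names are assumptions, and $n = 6000$ is taken from the example above.

```python
from collections import deque

N = 6000  # buffer length from the example above (10 min at 10 Hz); configurable

# Each entry holds one static-object measurement plus the vehicle dynamics
# sampled at the same cycle: (range rate, azimuth, Vx, Vy, yaw rate).
measurement_buffer = deque(maxlen=N)

def on_measurement(r_dot, theta, vx, vy, omega):
    # Appending to a full deque silently discards the oldest cycle, matching
    # the "oldest measurement drops out of the buffer" behavior described above.
    measurement_buffer.append((r_dot, theta, vx, vy, omega))
```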

Just as Equation (5) defines a vector for the left-hand side of Equation (4), the right-hand side of Equation (4) can be defined as:

$X = \begin{bmatrix} \omega\sin(\theta_1 + \alpha) & -\omega\cos(\theta_1 + \alpha) \\ \vdots & \vdots \\ \omega\sin(\theta_n + \alpha) & -\omega\cos(\theta_n + \alpha) \end{bmatrix}$ (6), $\quad \beta = \begin{bmatrix} a \\ b \end{bmatrix}$ (7), where $\theta_i$ is again the value of the azimuth angle of the static object 90 for a measurement cycle $i$, $\beta$ is the unknown vector containing the sensor position values $a$ and $b$, and $X$ is a matrix of size $n \times 2$, where the two columns of $X$ are as shown in Equation (6) and $n$ is the number of sensor measurement cycles used in the calculation.

Substituting Equations (5)-(7) into Equation (4) yields: $X\beta = Y$ (8)

Equation (8) can be iteratively solved in a regression calculation using static object measurements over a period of time. It should be remembered that, in Equation (8), all of the data in $X$ and $Y$ are known values--$\alpha$ is given, and all other values come from sensor measurements or from vehicle dynamics. Thus, only the vector $\beta$ containing the position values $a$ and $b$ is unknown, and can be solved for.
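A minimal sketch of this regression step is shown below (Python with NumPy). It assumes the buffered measurement tuples from the earlier sketch and the current estimate of $\alpha$, and uses an ordinary least-squares solve in place of whatever specific recursive regression routine an implementation might prefer; the function name is an assumption.

```python
import numpy as np

def estimate_position(measurements, alpha):
    """Solve X @ beta = Y (Eq. (8)) in the least-squares sense for beta = [a, b].

    `measurements` is an iterable of (r_dot, theta, vx, vy, omega) tuples;
    `alpha` is the current estimate of the sensor azimuth orientation (rad).
    """
    rows, rhs = [], []
    for r_dot, theta, vx, vy, omega in measurements:
        phi = theta + alpha
        # One row of X per Eq. (6); the corresponding entry of Y per Eq. (5).
        rows.append([omega * np.sin(phi), -omega * np.cos(phi)])
        rhs.append(-r_dot - vx * np.cos(phi) - vy * np.sin(phi))
    X = np.asarray(rows)
    Y = np.asarray(rhs)
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return beta  # beta[0] = a (longitudinal), beta[1] = b (lateral)
```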

The second calculation, where the position values $a$ and $b$ are assumed to be known and the azimuth orientation angle $\alpha$ is to be determined, can be set up as follows. First, Equation (3) can be rewritten as: $-\dot r = [(V_x - b\omega)\cos\theta + (V_y + a\omega)\sin\theta]\cos\alpha + [-(V_x - b\omega)\sin\theta + (V_y + a\omega)\cos\theta]\sin\alpha$ (9)

To facilitate solving for the orientation angle $\alpha$, the following definitions can be established:

$A = \begin{bmatrix} (V_x - b\omega)\cos\theta_1 + (V_y + a\omega)\sin\theta_1 & \;-(V_x - b\omega)\sin\theta_1 + (V_y + a\omega)\cos\theta_1 \\ \vdots & \vdots \\ (V_x - b\omega)\cos\theta_n + (V_y + a\omega)\sin\theta_n & \;-(V_x - b\omega)\sin\theta_n + (V_y + a\omega)\cos\theta_n \end{bmatrix}$ (10), $\quad c = \begin{bmatrix} -\dot r_1 \\ \vdots \\ -\dot r_n \end{bmatrix}$ (11), $\quad x = \begin{bmatrix} \cos\alpha \\ \sin\alpha \end{bmatrix}$ (12)

Substituting Equations (10)-(12) into Equation (9), the $x$ vector can be solved through quadratic constrained least squares as follows: $\min_x \|Ax - c\|^2$ (13) subject to the constraint: $\|x\|^2 = 1$ (14)

In the same manner as described above, $A$ and $c$ are populated with known values at each sensor measurement cycle, and the recursive least squares calculation converges on the value of $x$, yielding the azimuth orientation angle $\alpha$.
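The patent does not spell out a particular numerical solver for Equations (13)-(14), so the sketch below takes a simple, hedged approach: build $A$ and $c$ from the buffered measurements per Equations (10)-(11), solve the unconstrained least-squares problem, and recover $\alpha$ with atan2, which implicitly projects the solution onto the unit circle required by Equation (14). The function name is an assumption.

```python
import numpy as np

def estimate_orientation(measurements, a, b):
    """Estimate alpha from Eq. (9) given the current position estimate (a, b).

    Builds A and c per Eqs. (10)-(11) from (r_dot, theta, vx, vy, omega) tuples,
    solves the least-squares problem, and reads the angle off
    x ~ [cos(alpha), sin(alpha)] (Eq. (12)). Using atan2 on the unconstrained
    solution enforces the unit-norm constraint of Eq. (14) implicitly.
    """
    rows, rhs = [], []
    for r_dot, theta, vx, vy, omega in measurements:
        u = vx - b * omega   # longitudinal term (Vx - b*omega)
        v = vy + a * omega   # lateral term (Vy + a*omega)
        rows.append([u * np.cos(theta) + v * np.sin(theta),
                     -u * np.sin(theta) + v * np.cos(theta)])
        rhs.append(-r_dot)
    A = np.asarray(rows)
    c = np.asarray(rhs)
    x, *_ = np.linalg.lstsq(A, c, rcond=None)
    return float(np.arctan2(x[1], x[0]))  # alpha in radians
```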

The pose estimation techniques described above have been implemented on vehicles, tested in real world driving conditions, and demonstrated to be effective. These tests included purposely changing the azimuth orientation angle $\alpha$ by several degrees, driving the vehicle, and observing the calculations converge on the correct value of the new orientation angle $\alpha$. Similarly, radar sensors were positioned at corner locations on a vehicle (front and rear, left and right), where actual values of $a$ and $b$ were on the order of 1-2 meters, but the initial default values of $a$ and $b$ were set equal to zero. Again, through normal driving, the calculations converged on the correct values of the sensor position $(a, b)$.

The tests described above demonstrate that the disclosed sensor pose estimation technique can not only adaptively learn any change in position and orientation, but can also be used to automatically learn a sensor's position and orientation following vehicle assembly or repair, with no extra measurement or calibration steps required. And again, it is emphasized that these pose estimation calculations are performed using vehicle and object data that is already available. Furthermore, although the sensor measurement buffer may be sized for statistical robustness to contain (for example) 10 minutes' worth of measurement data, the tests demonstrated convergence on the actual pose $(a, b, \alpha)$ in just 3-4 minutes, even when the actual pose was dramatically different from the initial default values.

FIG. 3 is a schematic diagram of a smart sensor 100 which, given vehicle dynamics data as input, can automatically and continuously estimate its pose on the vehicle 10 or the host vehicle 50. The smart sensor 100 includes a sensor measurement core 102, which transmits radar signals and receives radar returns to identify objects in the host vehicle's vicinity. The smart sensor 100 also includes a mounting pose estimator 104, which automatically and continuously computes the mounting pose $(a, b, \alpha)$ of the smart sensor 100. The mounting pose estimator 104 uses static object data from the sensor measurement core 102, along with vehicle dynamics data, in the sensor pose calculations--as described in detail above. The vehicle dynamics data is provided to the smart sensor 100 on line 112 from a module 110, which may be the vehicle dynamics module 20 shown on FIG. 1 and discussed previously.

The measurement core 102 provides object azimuth angle ($\theta$), range ($r$) and range rate ($\dot r$) on line 106--for each detected object--which are corrected to the actual sensor pose. In this way, the smart sensor 100 delivers automatically calibrated object data, without requiring any re-programming of a central object detection module such as the object detection module 30. The smart sensor also outputs, on line 108, the sensor's mounting pose parameters $(a, b, \alpha)$--for use in any other required calculations, reports or advisories. For example, the pose parameters may be compared to nominal ranges for each sensor, and out-of-range values may result in a driver notification or service advisory.
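As an illustration of what "corrected to the actual sensor pose" can mean in practice, the sketch below converts a single detection from sensor-frame polar coordinates into vehicle-frame Cartesian coordinates using the learned pose $(a, b, \alpha)$. The function name is an assumption; the sign conventions follow FIG. 2 ($a$ forward of the center of gravity, $b$ to the left, $\alpha$ a counterclockwise rotation of the sensor frame).

```python
import math

def to_vehicle_frame(r, theta, a, b, alpha):
    """Map one detection (range r, azimuth theta in the sensor frame) into
    vehicle-frame coordinates using the learned pose (a, b, alpha).
    Illustrative only; conventions follow FIG. 2."""
    # Detection expressed in the sensor's local frame.
    x_s = r * math.cos(theta)
    y_s = r * math.sin(theta)
    # Rotate by alpha into the vehicle frame, then translate by the sensor position.
    x_v = a + x_s * math.cos(alpha) - y_s * math.sin(alpha)
    y_v = b + x_s * math.sin(alpha) + y_s * math.cos(alpha)
    return x_v, y_v
```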

The smart sensor 100 of FIG. 3 is one embodiment of the disclosed invention. Another embodiment of the disclosed invention is as shown in FIG. 1, where the processor 12 or the object detection module 30 automatically and continuously estimates the pose of all of the sensors 32-38 on the vehicle 10. It should be clear, to one skilled in the art, how the processor 12 or the object detection module 30 can calculate each sensor's pose based on the available data from the vehicle dynamics module 20 and each sensor's measurement of static objects, using the techniques described in detail above.

FIG. 4 is a flowchart diagram 200 of a method for automatic estimation of a radar sensor's pose on a host vehicle. The method of the flowchart diagram could be programmed as an algorithm on the processor 12 or the object detection module 30 (and applied to all of the vehicle's radar sensors), or programmed in the mounting pose estimator 104 of the smart sensor 100.

Table 1 is provided as a key associated with the flowchart diagram 200 described with reference to FIG. 4, wherein the numerically labeled boxes and the corresponding functions are set forth as follows.

TABLE 1

BOX #   BOX DESCRIPTION/FUNCTION
202     Start
204     Has new object sensor measurement data arrived?
206     Provide vehicle longitudinal and lateral velocities and yaw rate
208     Identify static objects from object data
210     Learn sensor's mounting position values (a, b)
212     Learn sensor's azimuth orientation angle (α)
214     Recalculate location of objects detected by the sensor

The method begins at start box 202. At decision diamond 204, it is determined whether new object sensor measurement data has arrived. If not, the process loops back above the decision diamond 204 and waits until new data arrives. When new object sensor data arrives, at box 206 the vehicle's velocities (longitudinal $V_x$ and lateral $V_y$) and yaw rate $\omega$ are provided. As discussed above, $V_x$, $V_y$ and $\omega$ are calculated by the vehicle dynamics module 20 based on data provided by the vehicle dynamics sensors and other vehicle state data available on a communications bus.

At box 208, object data from the radar sensor 70 of FIG. 2 (or the sensors 32-38 of FIG. 1) is acquired, and static objects are determined based on a zero or near-zero ground speed. An object ground speed substantially equal to zero, indicative of a static object, is defined as a ground speed below a threshold, such as 0.05 meters/second. At box 210, the sensor's mounting position values $(a, b)$ are estimated using the recursive learning calculations of Equations (5)-(8). At box 212, the sensor's azimuth orientation angle $\alpha$ is estimated using the recursive learning calculations of Equations (10)-(14). At box 214, the updated values of the sensor's pose $(a, b, \alpha)$ are stored for usage in the next iteration of the recursive calculation, and the sensor position and orientation values are also used to recalculate the location of objects detected by the sensor (misalignment compensation). Sensor pose values which are significantly outside of expected ranges may also be flagged as a diagnostic trouble code (DTC), communicated to the driver as an alert, transmitted via a telematics system to a repair facility for follow-up, etc.
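Tying the boxes of Table 1 together, one possible top-level cycle, reusing the hypothetical helper functions sketched earlier (none of these names come from the patent), could look like the following sketch:

```python
def pose_estimation_cycle(sensor_data, vehicle_dynamics, pose, measurement_buffer):
    """One pass through boxes 204-214 of FIG. 4, using the helper sketches above.

    `sensor_data` is an iterable of (r_dot, theta) detections for this cycle,
    `vehicle_dynamics` is (Vx, Vy, omega), and `pose` is a dict holding the
    current (a, b, alpha) estimate. All names are illustrative assumptions.
    """
    vx, vy, omega = vehicle_dynamics                         # box 206
    for r_dot, theta in sensor_data:                         # box 208: keep static returns
        if is_static(r_dot, theta, vx, vy, omega,
                     pose["a"], pose["b"], pose["alpha"]):
            measurement_buffer.append((r_dot, theta, vx, vy, omega))
    if measurement_buffer:
        a, b = estimate_position(measurement_buffer, pose["alpha"])   # box 210
        alpha = estimate_orientation(measurement_buffer, a, b)        # box 212
        pose.update(a=a, b=b, alpha=alpha)                             # box 214: store pose
    return pose
```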

Following the sensor pose data storage and misalignment compensation at the box 214, the method returns to the decision diamond 204 to await the next sensor data arrival. As discussed in detail previously, the method of the flowchart diagram 200 runs continuously during vehicle operation, producing ever more refined values of each sensor's pose. Even if vehicle damage causes a sudden change in sensor position and/or orientation, or if a new sensor is installed with default pose values, the recursive calculation technique will converge on accurate sensor position and orientation values within just a few minutes of vehicle operation, thus ensuring the accuracy of object detection data used in downstream applications.

The automatic sensor pose estimation method described herein provides a simple and effective way to determine the position and alignment of object detection sensors, including those which have no means of physical adjustment, thus improving the performance of applications which use the sensor data. The pose estimation technique also avoids the expensive replacement of an otherwise usable fascia component in the event of fascia damage resulting in sensor displacement, and avoids the need for a vehicle service visit to reposition or recalibrate sensors which have been displaced.

The foregoing discussion discloses and describes merely exemplary embodiments of the present invention. One skilled in the art will readily recognize from such discussion and from the accompanying drawings and claims that various changes, modifications and variations can be made therein without departing from the spirit and scope of the invention as defined in the following claims.

* * * * *

