

United States Patent No.

10099610

Inventor(s)

Schofield

Date of Patent

October 16, 2018


Driver assistance system for a vehicle



ABSTRACT

A driver assistance system for a vehicle includes a vision system, a sensing system and a control. The vision system includes a camera and the sensing system includes a radar sensor. Image data captured by the camera is provided to the control and is processed by an image processor of the control. Responsive to image processing of captured image data, lane markers on the road being traveled along by the equipped vehicle are detected and the control determines a lane being traveled by the equipped vehicle. Radar data generated by the radar sensor is provided to the control, which receives vehicle data relating to the equipped vehicle via a vehicle bus of the equipped vehicle. Responsive at least in part to processing of generated radar data and captured image data, the control detects another vehicle present on the road being traveled along by the equipped vehicle.


Inventors:

Kenneth Schofield (Holland, MI)

Assignee:

MAGNA ELECTRONICS INC. (Auburn Hills, MI, US)

Applicant:

MAGNA ELECTRONICS INC. (Auburn Hills, MI)

Family ID

26903905

Application No.:

15/289,341

Filed:

October 10, 2016

Prior Publication Data

Document Identifier | Publication Date
US 20170028916 A1 | Feb 2, 2017

Related U.S. Patent Documents


Application Number | Filing Date | Patent Number | Issue Date
14997831 | Jan 18, 2016 | 9463744 | Oct 2016
13919483 | Jun 17, 2013 | 9245448 | Jan 26, 2016
12483996 | Jun 12, 2009 | 8466806 | Jun 18, 2013
12058155 | Mar 28, 2008 | 7551103 | Jun 23, 2009
11735782 | Apr 16, 2007 | 7355524 | Apr 8, 2008
11108474 | Apr 18, 2005 | 7205904 | Apr 17, 2007
10209173 | Jul 31, 2002 | 6882287 | Apr 19, 2005
60309022 | Jul 31, 2001 | — | —

U.S. Class:

1/1

CPC Class:

B60Q 1/346 (20130101); B62D 15/029 (20130101); G01S 13/931 (20130101); G08G 1/163 (20130101); G08G 1/167 (20130101); B60R 1/00 (20130101); B60R 11/04 (20130101); G06K 9/00798 (20130101); G06K 9/00805 (20130101); G06K 9/6267 (20130101); H04N 5/247 (20130101); B60Q 9/008 (20130101); B60R 2300/804 (20130101); B60T 2201/08 (20130101); B60T 2201/089 (20130101); G01S 2013/9325 (20130101); G01S 2013/9332 (20130101); G01S 2013/9353 (20130101); G01S 2013/936 (20130101); G01S 2013/9364 (20130101); G01S 2013/9367 (20130101); G01S 2013/9385 (20130101)

International Patent Classification (IPC):

G08G 1/16 (20060101); B62D 15/02 (20060101); B60Q 1/34 (20060101); B60Q 9/00 (20060101); G01S 13/93 (20060101); B60R 1/00 (20060101); B60R 11/04 (20060101); G06K 9/00 (20060101); G06K 9/62 (20060101); H04N 5/247 (20060101)

References Cited


U.S. Patent Documents

4200361April 1980Malvano
4214266July 1980Myers
4218698August 1980Bart et al.
4236099November 1980Rosenblum
4247870January 1981Gabel et al.
4249160February 1981Chilvers
4254931March 1981Aikens
4266856May 1981Wainwright
4277804July 1981Robison
4281898August 1981Ochiai
4288814September 1981Talley et al.
4348652September 1982Barnes et al.
4355271October 1982Noack
4357558November 1982Massoni et al.
4381888May 1983Momiyama
4420238December 1983Felix
4431896February 1984Lodetti
4443057April 1984Bauer
4460831July 1984Oettinger et al.
4481450November 1984Watanabe et al.
4491390January 1985Tong-Shen
4512637April 1985Ballmer
4521804June 1985Bendell
4529275July 1985Ballmer
4529873July 1985Ballmer
4532550July 1985Bendell et al.
4546551October 1985Franks
4549208October 1985Kamejima et al.
4571082February 1986Downs
4572619February 1986Reininger
4580875April 1986Bechtel
4600913July 1986Caine
4603946August 1986Kato
4614415September 1986Hyatt
4620141October 1986McCumber et al.
4623222November 1986Itoh
4626850December 1986Chey
4629941December 1986Ellis
4630109December 1986Barton
4632509December 1986Ohmi
4638287January 1987Umebayashi et al.
4645975February 1987Meitzler et al.
4647161March 1987Muller
4653316March 1987Fukuhara
4669825June 1987Itoh
4669826June 1987Itoh
4671615June 1987Fukada
4672457June 1987Hyatt
4676601June 1987Itoh
4690508September 1987Jacob
4692798September 1987Seko et al.
4697883October 1987Suzuki
4701022October 1987Jacob
4713685December 1987Nishimura et al.
4717830January 1988Botts
4727290February 1988Smith
4731669March 1988Hayashi et al.
4741603May 1988Miyagi
4758883July 1988Kawahara et al.
4768135August 1988Kretschmer et al.
4772942September 1988Tuck
4789904December 1988Peterson
4793690December 1988Gahan
4817948April 1989Simonelli
4820933April 1989Hong
4825232April 1989Howdle
4838650June 1989Stewart
4847772July 1989Michalopoulos et al.
4855822August 1989Narendra et al.
4862037August 1989Farber et al.
4867561September 1989Fujii et al.
4871917October 1989O'Farrell et al.
4872051October 1989Dye
4881019November 1989Shiraishi et al.
4882565November 1989Gallmeyer
4886960December 1989Molyneux
4891559January 1990Matsumoto et al.
4892345January 1990Rachael, III
4895790January 1990Swanson et al.
4896030January 1990Miyaji
4907870March 1990Brucker
4910591March 1990Petrossian et al.
4916374April 1990Schierbeek
4917477April 1990Bechtel et al.
4937796June 1990Tendler
4953305September 1990Van Lente et al.
4956591September 1990Schierbeek
4961625October 1990Wood et al.
4967319October 1990Seko
4970653November 1990Kenue
4971430November 1990Lynas
4974078November 1990Tsai
4975703December 1990Delisle et al.
4987357January 1991Masaki
4991054February 1991Walters
5001558March 1991Burley et al.
5003288March 1991Wilhelm
5012082April 1991Watanabe
5016977May 1991Baude et al.
5027001June 1991Torbert
5027200June 1991Petrossian et al.
5044706September 1991Chen
5055668October 1991French
5059877October 1991Teder
5064274November 1991Alten
5072154December 1991Chen
5075768December 1991Wirtz et al.
5086253February 1992Lawler
5096287March 1992Kakinami et al.
5097362March 1992Lynas
5121200June 1992Choi
5124549June 1992Michaels et al.
5130709July 1992Toyama et al.
5148014September 1992Lynam
5166681November 1992Bottesch et al.
5168378December 1992Black
5170374December 1992Shimohigashi et al.
5172235December 1992Wilm et al.
5177606January 1993Koshizawa
5177685January 1993Davis et al.
5182502January 1993Slotkowski et al.
5184956February 1993Langlais et al.
5189561February 1993Hong
5193000March 1993Lipton et al.
5193029March 1993Schofield
5204778April 1993Bechtel
5208701May 1993Maeda
5225827July 1993Persson
5245422September 1993Borcherts et al.
5253109October 1993O'Farrell
5276389January 1994Levers
5285060February 1994Larson et al.
5289182February 1994Brillard et al.
5289321February 1994Secor
5305012April 1994Faris
5307136April 1994Saneyoshi
5309137May 1994Kajiwara
5313072May 1994Vachss
5325096June 1994Pakett
5325386June 1994Jewell et al.
5329206July 1994Slotkowski et al.
5331312July 1994Kudoh
5336980August 1994Levers
5339075August 1994Abst et al.
5341437August 1994Nakayama
5351044September 1994Mathur et al.
5355118October 1994Fukuhara
5374852December 1994Parkes
5386285January 1995Asayama
5394333February 1995Kao
5406395April 1995Wilson et al.
5410346April 1995Saneyoshi et al.
5414257May 1995Stanton
5414461May 1995Kishi et al.
5416313May 1995Larson et al.
5416318May 1995Hegyi
5416478May 1995Morinaga
5424952June 1995Asayama
5426294June 1995Kobayashi et al.
5430431July 1995Nelson
5434407July 1995Bauer et al.
5434927July 1995Brady et al.
5440428August 1995Hegg et al.
5444478August 1995Lelong et al.
5451822September 1995Bechtel et al.
5457493October 1995Leddy et al.
5461357October 1995Yoshioka et al.
5461361October 1995Moore
5467284November 1995Yoshioka et al.
5469298November 1995Suman et al.
5471515November 1995Fossum et al.
5475494December 1995Nishida et al.
5487116January 1996Nakano et al.
5498866March 1996Bendicks et al.
5500766March 1996Stonecypher
5510983April 1996Iino
5515448May 1996Nishitani
5521633May 1996Nakajima et al.
5528698June 1996Kamei et al.
5529138June 1996Shaw et al.
5530240June 1996Larson et al.
5530420June 1996Tsuchiya et al.
5535314July 1996Alves et al.
5537003July 1996Bechtel et al.
5539397July 1996Asanuma et al.
5541590July 1996Nishio
5550677August 1996Schofield et al.
5555312September 1996Shima et al.
5555555September 1996Sato et al.
5568027October 1996Teder
5574443November 1996Hsieh
5581464December 1996Woll et al.
5594222January 1997Caldwell
5612883March 1997Shaffer et al.
5614788March 1997Mullins
5619370April 1997Guinosso
5634709June 1997Iwama
5642299June 1997Hardin et al.
5646612July 1997Byon
5648835July 1997Uzawa
5650944July 1997Kise
5660454August 1997Mori et al.
5661303August 1997Teder
5666028September 1997Bechtel et al.
5668663September 1997Varaprasad et al.
5670935September 1997Schofield et al.
5673019September 1997Dantoni
5675489October 1997Pomerleau
5677851October 1997Kingdon et al.
5680123October 1997Lee
5699044December 1997Van Lente et al.
5699057December 1997Ikeda et al.
5715093February 1998Schierbeek et al.
5724187March 1998Varaprasad et al.
5724316March 1998Brunts
5737226April 1998Olson et al.
5757949May 1998Kinoshita et al.
5760826June 1998Nayer
5760828June 1998Cortes
5760931June 1998Saburi et al.
5760962June 1998Schofield et al.
5761094June 1998Olson et al.
5765116June 1998Wilson-Jones et al.
5781437July 1998Wiemer et al.
5786772July 1998Schofield et al.
5790403August 1998Nakayama
5790973August 1998Blaker et al.
5793308August 1998Rosinski et al.
5793420August 1998Schmidt
5796094August 1998Schofield et al.
5798575August 1998O'Farrell et al.
5835255November 1998Miles
5837994November 1998Stam et al.
5844505December 1998Van Ryzin
5844682December 1998Kiyomoto et al.
5845000December 1998Breed et al.
5848802December 1998Breed et al.
5850176December 1998Kinoshita et al.
5850254December 1998Takano et al.
5867591February 1999Onda
5877707March 1999Kowalick
5877897March 1999Schofield et al.
5878370March 1999Olson
5883739March 1999Ashihara et al.
5884212March 1999Lion
5890021March 1999Onoda
5896085April 1999Mori et al.
5899956May 1999Chan
5914815June 1999Bos
5923027July 1999Stam et al.
5929786July 1999Schofield et al.
5940120August 1999Frankhouse et al.
5949331September 1999Schofield et al.
5956181September 1999Lin
5959367September 1999O'Farrell et al.
5959555September 1999Furuta
5963247October 1999Banitt
5964822October 1999Alland et al.
5971552October 1999O'Farrell et al.
5986796November 1999Miles
5990469November 1999Bechtel et al.
5990649November 1999Nagao et al.
6001486December 1999Varaprasad et al.
6009336December 1999Harris et al.
6020704February 2000Buschur
6031484February 2000Bullinger et al.
6037860March 2000Zander et al.
6037975March 2000Aoyama
6049171April 2000Stam et al.
6057754May 2000Kinoshita et al.
6066933May 2000Ponziana
6084519July 2000Coulling et al.
6087953July 2000DeLine et al.
6097023August 2000Schofield et al.
6097024August 2000Stam et al.
6107939August 2000Sorden
6116743September 2000Hoek
6124647September 2000Marcus et al.
6124886September 2000DeLine et al.
6139172October 2000Bos et al.
6144022November 2000Tenenbaum et al.
6151539November 2000Bergholz et al.
6172613January 2001DeLine et al.
6175164January 2001O'Farrell et al.
6175300January 2001Kendrick
6198409March 2001Schofield et al.
6201642March 2001Bos
6222447April 2001Schofield et al.
6222460April 2001DeLine et al.
6226592May 2001Luckscheiter et al.
6243003June 2001DeLine et al.
6250148June 2001Lynam
6259412July 2001Duroux
6266082July 2001Yonezawa et al.
6266442July 2001Laumeyer et al.
6278377August 2001DeLine et al.
6281806August 2001Smith et al.
6285393September 2001Shimoura et al.
6291906September 2001Marcus et al.
6292752September 2001Franke et al.
6294989September 2001Schofield et al.
6297781October 2001Turnbull et al.
6302545October 2001Schofield et al.
6310611October 2001Caldwell
6311119October 2001Sawamoto et al.
6313454November 2001Bos et al.
6317057November 2001Lee
6320176November 2001Schofield et al.
6320282November 2001Caldwell
6324450November 2001Iwama
6326613December 2001Heslin et al.
6329925December 2001Skiver et al.
6333759December 2001Mazzilli
6341523January 2002Lynam
6353392March 2002Schofield et al.
6360170March 2002Ishikawa et al.
6362729March 2002Hellmann et al.
6363326March 2002Scully
6366213April 2002DeLine et al.
6366236April 2002Farmer et al.
6370329April 2002Teuchert
6388565May 2002Bernhard et al.
6388580May 2002Graham et al.
6396397May 2002Bos et al.
6411204June 2002Bloomfield et al.
6411328June 2002Franke et al.
6420975July 2002DeLine et al.
6424273July 2002Gutta et al.
6428172August 2002Hutzel et al.
6430303August 2002Naoi et al.
6433676August 2002DeLine et al.
6433817August 2002Guerra
6441748August 2002Takagi et al.
6442465August 2002Breed et al.
6477464November 2002McCarthy et al.
6485155November 2002Duroux et al.
6497503December 2002Dassanayake et al.
6498620December 2002Schofield et al.
6513252February 2003Schierbeek et al.
6516664February 2003Lynam
6523964February 2003Schofield et al.
6534884March 2003Marcus et al.
6539306March 2003Turnbull
6547133April 2003DeVries, Jr. et al.
6553130April 2003Lemelson et al.
6559435May 2003Schofield et al.
6574033June 2003Chui et al.
6578017June 2003Ebersole et al.
6587573July 2003Stam et al.
6589625July 2003Kothari et al.
6593565July 2003Heslin et al.
6594583July 2003Ogura et al.
6611202August 2003Schofield et al.
6611610August 2003Stam et al.
6627918September 2003Getz et al.
6631316October 2003Stam et al.
6631994October 2003Suzuki et al.
6636258October 2003Strumolo
6648477November 2003Hutzel et al.
6650233November 2003DeLine et al.
6650455November 2003Miles
6672731January 2004Schnell et al.
6674562January 2004Miles
6678056January 2004Downs
6678614January 2004McCarthy et al.
6680792January 2004Miles
6683969January 2004Nishigaki et al.
6690268February 2004Schofield et al.
6700605March 2004Toyoda et al.
6703925March 2004Steffel
6704621March 2004Stein et al.
6710908March 2004Miles et al.
6711474March 2004Treyz et al.
6714331March 2004Lewis et al.
6717610April 2004Bos et al.
6728623April 2004Takenaga et al.
6735506May 2004Breed et al.
6741377May 2004Miles
6744353June 2004Sjonell
6757109June 2004Bos
6762867July 2004Lippert et al.
6784828August 2004Delcheccolo et al.
6794119September 2004Miles
6795221September 2004Urey
6802617October 2004Schofield et al.
6806452October 2004Bos et al.
6813370November 2004Arai
6822563November 2004Bos et al.
6823241November 2004Shirato et al.
6824281November 2004Schofield
6831261December 2004Schofield et al.
6847487January 2005Burgner
6873253March 2005Veriris
6882287April 2005Schofield
6888447May 2005Hori et al.
6889161May 2005Winner et al.
6891563May 2005Schofield et al.
6906639June 2005Lemelson et al.
6909753June 2005Meehan et al.
6946978September 2005Schofield
6953253October 2005Schofield et al.
6968736November 2005Lynam
6975775December 2005Rykowski et al.
7004593February 2006Weller et al.
7004606February 2006Schofield
7005974February 2006McMahon et al.
7038577May 2006Pawlicki et al.
7046448May 2006Burgner
7062300June 2006Kim
7065432June 2006Moisel et al.
7085637August 2006Breed et al.
7092548August 2006Laumeyer et al.
7116246October 2006Winter et al.
7123168October 2006Schofield
7133661November 2006Hatae et al.
7149613December 2006Stam et al.
7167796January 2007Taylor et al.
7195381March 2007Lynam et al.
7202776April 2007Breed
7205904April 2007Schofield
7224324May 2007Quist et al.
7227459June 2007Bos et al.
7227611June 2007Hull et al.
7249860July 2007Kulas et al.
7253723August 2007Lindahl et al.
7255451August 2007McCabe et al.
7311406December 2007Schofield et al.
7325934February 2008Schofield et al.
7325935February 2008Schofield et al.
7338177March 2008Lynam
7339149March 2008Schofield et al.
7344261March 2008Schofield et al.
7355524April 2008Schofield
7360932April 2008Uken et al.
7370983May 2008DeWind et al.
7375803May 2008Bamji
7380948June 2008Schofield et al.
7388182June 2008Schofield et al.
7402786July 2008Schofield et al.
7423248September 2008Schofield et al.
7423821September 2008Bechtel et al.
7425076September 2008Schofield et al.
7459664December 2008Schofield et al.
7526103April 2009Schofield et al.
7541743June 2009Salmeen et al.
7551103June 2009Schofield
7561181July 2009Schofield et al.
7565006July 2009Stam et al.
7616781November 2009Schofield et al.
7619508November 2009Lynam et al.
7633383December 2009Dunsmoir et al.
7639149December 2009Katoh
7655894February 2010Schofield et al.
7676087March 2010Dhua et al.
7720580May 2010Higgins-Luthman
7792329September 2010Schofield et al.
7843451November 2010Lafon
7855778December 2010Yung et al.
7859565December 2010Schofield et al.
7877175January 2011Higgins-Luthman
7881496February 2011Camilleri
7914187March 2011Higgins-Luthman et al.
7930160April 2011Hosagrahara et al.
7991522August 2011Higgins-Luthman
7994462August 2011Schofield et al.
8017898September 2011Lu et al.
8095310January 2012Taylor et al.
8098142January 2012Schofield et al.
8203440June 2012Schofield et al.
8222588July 2012Schofield et al.
8224031July 2012Saito
8314689November 2012Schofield et al.
8324552December 2012Schofield et al.
8386114February 2013Higgins-Luthman et al.
8466806June 2013Schofield
9245448January 2016Schofield
9463744October 2016Schofield
2001/0031068October 2001Ohta
2001/0034575October 2001Takenaga et al.
2001/0056326December 2001Kirmura
2002/0005778January 2002Breed
2002/0113873August 2002Williams
2002/0116126August 2002Lin
2002/0159270October 2002Lynam et al.
2003/0016143January 2003Ghazarian
2003/0025597February 2003Schofield
2003/0137586July 2003Lewellen
2003/0222982December 2003Hamdan et al.
2004/0016870January 2004Pawlicki et al.
2004/0164228August 2004Fogg et al.
2005/0046978March 2005Schofield et al.
2005/0219852October 2005Stam et al.
2005/0237385October 2005Kosaka et al.
2006/0018511January 2006Stam et al.
2006/0018512January 2006Stam et al.
2006/0050018March 2006Hutzel et al.
2006/0091813May 2006Stam et al.
2006/0103727May 2006Tseng
2006/0250501November 2006Wildmann et al.
2007/0104476May 2007Yasutomi et al.
2007/0109406May 2007Schofield et al.
2007/0120657May 2007Schofield et al.
2007/0242339October 2007Bradley
2008/0147321June 2008Howard et al.
2008/0192132August 2008Bechtel et al.
2009/0113509April 2009Tseng et al.
2009/0160987June 2009Bechtel et al.
2009/0190015July 2009Bechtel et al.
2009/0256938October 2009Bechtel et al.
2012/0045112February 2012Lundblad et al.

Foreign Patent Documents

0353200Jan 1990EP
0426503May 1991EP
0492591Jul 1992EP
0640903Mar 1995EP
0788947Aug 1997EP
1074430Feb 2001EP
59114139Jul 1984JP
6079889May 1985JP
6080953May 1985JP
6272245May 1987JP
S62131837Jun 1987JP
6414700Jan 1989JP
03099952Apr 1991JP
4114587Apr 1992JP
H04127280Apr 1992JP
0577657Mar 1993JP
05050883Mar 1993JP
5213113Aug 1993JP
6227318Aug 1994JP
06267304Sep 1994JP
06276524Sep 1994JP
06295601Oct 1994JP
07004170Jan 1995JP
0732936Feb 1995JP
0747878Feb 1995JP
07052706Feb 1995JP
0769125Mar 1995JP
07105496Apr 1995JP
2630604Jul 1997JP
200274339Mar 2002JP
2003083742Mar 2003JP
20041658Jan 2004JP
WO1994019212Feb 1994WO
WO1996038319Dec 1996WO

Other References


Achler et al., "Vehicle Wheel Detector using 2D Filter Banks," IEEE Intelligent Vehicles Symposium of Jun. 2004. cited by applicant .
Borenstein et al., "Where am I? Sensors and Method for Mobile Robot Positioning", University of Michigan, Apr. 1996, pp. 2, 125-128. cited by applicant .
Bow, Sing T., "Pattern Recognition and Image Preprocessing (Signal Processing and Communications)", CRC Press, Jan. 15, 2002, pp. 557-559. cited by applicant .
Broggi et al., "Automatic Vehicle Guidance: The Experience of the ARGO Vehicle", World Scientific Publishing Co., 1999. cited by applicant .
Broggi et al., "Multi-Resolution Vehicle Detection using Artificial Vision," IEEE Intelligent Vehicles Symposium of Jun. 2004. cited by applicant .
Kastrinaki et al., "A survey of video processing techniques for traffic applications". cited by applicant .
Mei Chen et al., AURORA: A Vision-Based Roadway Departure Warning System, The Robotics Institute, Carnegie Mellon University, published Aug. 9, 1995. cited by applicant .
Parker (ed.), McGraw-Hill Dictionary of Scientific and Technical Terms Fifth Edition (1993). cited by applicant .
Philomin et al., "Pedestrian Tracking from a Moving Vehicle". cited by applicant .
Pratt, "Digital Image Processing, Passage--ED.3", John Wiley & Sons, US, Jan. 1, 2001, pp. 657-659, XP002529771. cited by applicant .
Sun et al., "On-road vehicle detection using optical sensors: a review". cited by applicant .
Tokimaru et al., "CMOS Rear-View TV System with CCD Camera", National Technical Report vol. 34, No. 3, pp. 329-336, Jun. 1988 (Japan). cited by applicant .
Van Leeuwen et al., "Motion Estimation with a Mobile Camera for Traffic Applications", IEEE, US, vol. 1, Oct. 3, 2000, pp. 58-63. cited by applicant .
Van Leeuwen et al., "Motion Interpretation for In-Car Vision Systems", IEEE, US, vol. 1, Sep. 30, 2002, p. 135-140. cited by applicant .
Van Leeuwen et al., "Real-Time Vehicle Tracking in Image Sequences", IEEE, US, vol. 3, May 21, 2001, pp. 2049-2054, XP010547308. cited by applicant .
Van Leeuwen et al., "Requirements for Motion Estimation in Image Sequences for Traffic Applications", IEEE, US, vol. 1, May 24, 1999, pp. 145-150, XP010340272. cited by applicant .
Vellacott, Oliver, "CMOS in Camera," IEE Review, pp. 111-114 (May 1994). cited by applicant .
Vlacic et al., (Eds), "Intelligent Vehicle Technologies, Theory and Applications", Society of Automotive Engineers Inc., edited by SAE International, 2001. cited by applicant .
Wang et al., CMOS Video Cameras, article, 1991, 4 pages, University of Edinburgh, UK. cited by applicant .
Zheng et al., "An Adaptive System for Traffic Sign Recognition," IEEE Proceedings of the Intelligent Vehicles '94 Symposium, pp. 165-170 (Oct. 1994). cited by applicant.

Primary Examiner: Swarthout; Brent
Attorney, Agent or Firm: Honigman Miller Schwartz and Cohn, LLP

Parent Case Text




CROSS-REFERENCE TO RELATED APPLICATIONS



This application is a continuation of U.S. patent application Ser. No. 14/997,831, filed Jan. 18, 2016, now U.S. Pat. No. 9,463,744, which is a continuation of U.S. patent application Ser. No. 13/919,483, filed Jun. 17, 2013, now U.S. Pat. No. 9,245,448, which is a continuation of U.S. patent application Ser. No. 12/483,996, filed Jun. 12, 2009, now U.S. Pat. No. 8,466,806, which is a continuation of U.S. patent application Ser. No. 12/058,155, filed Mar. 28, 2008, now U.S. Pat. No. 7,551,103, which is a continuation of U.S. patent application Ser. No. 11/735,782, filed Apr. 16, 2007, now U.S. Pat. No. 7,355,524, which is a continuation of U.S. patent application Ser. No. 11/108,474, filed Apr. 18, 2005, now U.S. Pat. No. 7,205,904, which is a continuation of U.S. patent application Ser. No. 10/209,173, filed on Jul. 31, 2002, now U.S. Pat. No. 6,882,287, which claims priority from U.S. provisional application Ser. No. 60/309,022, filed on Jul. 31, 2001, the disclosures of which are hereby incorporated herein by reference in their entireties.

CLAIMS



The invention claimed is:

1. A driver assistance system for a vehicle, said driver assistance system comprising: a vision system comprising a first camera disposed at a vehicle equipped with said driver assistance system, said first camera having a field of view forward of the equipped vehicle that encompasses a road being traveled along by the equipped vehicle; wherein the road has at least three lanes comprising (i) a first lane along which the vehicle is traveling, (ii) a second lane immediately adjacent the first lane, and (iii) a third lane immediately adjacent the second lane at the opposite side of the second lane from the first lane; a control disposed at or in the equipped vehicle and comprising a data processor; wherein said data processor comprises an image processor; wherein image data captured by said first camera is provided to said control and is processed by said image processor; wherein said data processor processes data at a processing speed of at least 30 MIPS; wherein, responsive to processing by said image processor of image data captured by said first camera, lane markers on the road being traveled along by the equipped vehicle are detected; wherein, responsive to processing by said image processor of image data captured by said first camera, said control determines the first, second and third lanes of the road being traveled by the equipped vehicle; a sensing system operable to detect other vehicles present exterior the equipped vehicle; wherein said sensing system comprises a second camera disposed at the equipped vehicle and having a field of view at least rearward and sideward of the equipped vehicle as the equipped vehicle travels along the road; wherein said sensing system further comprises a radar sensor disposed at the equipped vehicle and having a field of sensing at least rearward and sideward of the equipped vehicle as the equipped vehicle travels along the road; wherein image data captured by said second camera is provided to said control; wherein radar data generated by said radar sensor is provided to said control; wherein said control receives vehicle data relating to the equipped vehicle via a vehicle bus of the equipped vehicle; wherein, responsive at least in part to processing at said data processor of radar data generated by said radar sensor and of image data captured by said second camera, said control detects another vehicle present on the road being traveled along by the equipped vehicle and approaching the equipped vehicle from its rear; wherein, with the equipped vehicle traveling along the first lane, and responsive at least in part to processing at said data processor of radar data generated by said radar sensor and of image data captured by said second camera, said control determines that the other vehicle is travelling in the second lane or in the third lane of the road along which the equipped vehicle is travelling; wherein, responsive to determination that the other vehicle is approaching the equipped vehicle and is traveling in the second lane, and responsive to said control detecting a lane change maneuver of the equipped vehicle toward the second lane, said control generates an alert that a lane change by the equipped vehicle from the first lane to the second lane is not safe; and wherein, responsive to determination that the other vehicle is approaching the equipped vehicle and is traveling in the third lane, and responsive to said control detecting a lane change maneuver of the equipped vehicle toward the second lane, said
control determines that a lane change from the first lane to the second lane can proceed without hazard of impact with the other vehicle detected travelling in the third lane.

2. The driver assistance system of claim 1, wherein said control is operable to detect a lane change maneuver of the equipped vehicle to a lane adjacent to the lane the equipped vehicle is travelling along.

3. The driver assistance system of claim 2, wherein said radar sensor comprises a Doppler radar sensor.

4. The driver assistance system of claim 2, wherein, responsive at least in part to processing at said data processor of image data captured by said second camera, said control develops a vehicle path history of the other vehicle.

5. The driver assistance system of claim 4, wherein determination by said control that the lane change can proceed without hazard of impact with the other vehicle detected travelling in the third lane utilizes said vehicle path history.

6. The driver assistance system of claim 1, wherein said radar sensor has a field of sensing to a side of and rearward of the equipped vehicle as the equipped vehicle travels along the road.

7. The driver assistance system of claim 6, wherein said control is operable to detect a lane change maneuver of the equipped vehicle from the first lane to the second lane.

8. The driver assistance system of claim 7, wherein detection of the lane change maneuver by said control is based at least in part on vehicle steering data received at said control via said bus.

9. The driver assistance system of claim 7, wherein, responsive to said control detecting the lane change maneuver of the equipped vehicle, said control determines that the lane change can proceed without hazard of impact with the other vehicle detected approaching from the rear in the third lane.

10. The driver assistance system of claim 9, wherein said vehicle bus system comprises at least one of (a) a CAN bus of the vehicle and (b) a LIN bus of the vehicle.

11. The driver assistance system of claim 10, wherein the other vehicle detected in the third lane is overtaking the equipped vehicle.

12. The driver assistance system of claim 11, wherein said control determines a position history of the other vehicle.

13. The driver assistance system of claim 7, wherein detection of the lane change maneuver by said control is irrespective of activation by a driver of the equipped vehicle of a turn signal indicator of the equipped vehicle.

14. The driver assistance system of claim 1, wherein said vision system further comprises at least one other camera disposed at the equipped vehicle and having a field of view exterior the equipped vehicle, and wherein said sensing system further comprises at least one other radar sensor disposed at the equipped vehicle and having a field of sensing exterior the equipped vehicle, and wherein image data captured by said at least one other camera is provided to and processed at said control, and wherein radar data generated by said at least one other radar sensor is provided to and processed at said control.

15. The driver assistance system of claim 1, wherein said camera comprises a forward-viewing camera disposed in an interior cabin of the equipped vehicle at and behind a windshield of the equipped vehicle and viewing through the windshield to capture image data at least forward of the equipped vehicle.

16. The driver assistance system of claim 15, wherein, responsive at least in part to processing of radar data generated by said radar sensor and of image data captured by said camera, said control determines a hazard of collision of the equipped vehicle with another vehicle.

17. The driver assistance system of claim 16, wherein, responsive at least in part to said control determining said hazard of collision with the other vehicle, said control controls a system of the equipped vehicle to mitigate potential collision with the other vehicle that is in hazard of collision with the equipped vehicle.

18. The driver assistance system of claim 17, wherein, responsive at least in part to said control determining said hazard of collision, said control controls activation of at least one of (i) a horn system of the equipped vehicle to mitigate potential collision with the other vehicle that is in hazard of collision with the equipped vehicle and (ii) a lighting system of the equipped vehicle to mitigate potential collision with the other vehicle that is in hazard of collision with the equipped vehicle.

19. The driver assistance system of claim 17, wherein, responsive at least in part to said control determining said hazard of collision, said control controls a wireless transmission system of the equipped vehicle to transmit a safety warning to the other vehicle that is in hazard of collision with the equipped vehicle.

20. A driver assistance system for a vehicle, said driver assistance system comprising: a vision system comprising a first camera disposed at a vehicle equipped with said driver assistance system, said first camera having a field of view forward of the equipped vehicle that encompasses a road being traveled along by the equipped vehicle; wherein the road has at least three lanes comprising (i) a first lane along which the vehicle is traveling, (ii) a second lane immediately adjacent the first lane, and (iii) a third lane immediately adjacent the second lane at the opposite side of the second lane from the first lane; wherein said first camera comprises a forward-viewing camera disposed in an interior cabin of the equipped vehicle at and behind a windshield of the equipped vehicle and viewing through the windshield to capture image data at least forward of the equipped vehicle; a control disposed at or in the equipped vehicle and comprising a data processor; wherein said data processor comprises an image processor; wherein image data captured by said first camera is provided to said control and is processed by said image processor; wherein said data processor processes data at a processing speed of at least 30 MIPS; wherein, responsive to processing by said image processor of image data captured by said first camera, lane markers on the road being traveled along by the equipped vehicle are detected; wherein, responsive to processing by said image processor of image data captured by said first camera, said control determines the first, second and third lanes of the road being traveled by the equipped vehicle; a sensing system operable to detect other vehicles present exterior the equipped vehicle; wherein said sensing system comprises a first radar sensor disposed at the equipped vehicle and having a field of sensing exterior of the equipped vehicle as the equipped vehicle travels along the road; wherein radar data generated by said first radar sensor is provided to said control; wherein said control receives vehicle data relating to the equipped vehicle via a vehicle bus of the equipped vehicle; wherein said vision system further comprises a second camera disposed at the equipped vehicle and having a field of view at least rearward and sideward of the equipped vehicle as the equipped vehicle travels along the road, and wherein said sensing system further comprises a second radar sensor disposed at the equipped vehicle and having a field of sensing at least rearward and sideward the equipped vehicle as the equipped vehicle travels along the road, and wherein image data captured by said second camera is provided to and processed at said control, and wherein radar data generated by said second radar sensor is provided to and processed at said control; wherein, responsive at least in part to processing at said data processor of radar data generated by at least one of said first and second radar sensors of said sensing system and of image data captured by at least one of said first and second cameras of said vision system, said control determines a hazard of collision of the equipped vehicle with another vehicle; wherein, with the equipped vehicle traveling along the first lane, and responsive at least in part to processing at said data processor of radar data generated by said radar sensor and of image data captured by said second camera, said control determines that the other vehicle is travelling in the second lane or in the third lane of the road along which the equipped vehicle is 
travelling; wherein, responsive to determination that the other vehicle is approaching the equipped vehicle and is traveling in the second lane, and responsive to said control detecting a lane change maneuver of the equipped vehicle toward the second lane, said control generates an alert that a lane change by the equipped vehicle from the first lane to the second lane is not safe; wherein, responsive to determination that the other vehicle is approaching the equipped vehicle and is traveling in the third lane, and responsive to said control detecting a lane change maneuver of the equipped vehicle toward the second lane, said control determines that a lane change from the first lane to the second lane can proceed without hazard of impact with the other vehicle detected travelling in the third lane; and wherein, responsive at least in part to said control determining said hazard of collision with the other vehicle, said control controls a system of the equipped vehicle to mitigate potential collision with the other vehicle.

21. The driver assistance system of claim 20, wherein, responsive at least in part to said control determining said hazard of collision, said control controls activation of at least one of (i) a horn system of the equipped vehicle to mitigate potential collision with the other vehicle that is in hazard of collision with the equipped vehicle and (ii) a lighting system of the equipped vehicle to mitigate potential collision with the other vehicle.

22. The driver assistance system of claim 20, wherein, responsive at least in part to said control determining said hazard of collision, said control controls a wireless transmission system of the equipped vehicle to transmit a safety warning to the other vehicle.

23. The driver assistance system of claim 20, wherein the other vehicle is approaching the equipped vehicle from rearward of the equipped vehicle, and wherein said control is operable to detect a lane change maneuver of the equipped vehicle to the second lane the equipped vehicle is travelling along, and wherein detection of the lane change maneuver by said control is based at least in part on vehicle steering data received at said control via said bus.

24. The driver assistance system of claim 23, wherein radar data generated by at least one of said first and second radar sensors of said sensing system and image data captured by at least one of said first and second cameras of said vision system are fused at said control.

25. The driver assistance system of claim 23, wherein radar data generated by at least one of said first and second radar sensors of said sensing system and image data captured by said first camera of said vision system are fused at said control.

26. The driver assistance system of claim 23, wherein said first radar sensor has a field of sensing to a side of and rearward of the equipped vehicle as the equipped vehicle travels along the road, and wherein radar data generated by said first radar sensor of said sensing system and image data captured by at least one of said first and second cameras of said vision system are fused at said control.

27. The driver assistance system of claim 23, wherein, responsive at least in part to processing at said data processor of at least one of (i) radar data generated by at least one of said first and second radar sensors of said sensing system and (ii) image data captured by at least one of said first and second cameras of said vision system, said control determines that the other vehicle is travelling in the third lane.

28. A driver assistance system for a vehicle, said driver assistance system comprising: a vision system comprising a first camera disposed at a vehicle equipped with said driver assistance system, said first camera having a field of view that encompasses a road being traveled along by the equipped vehicle; wherein said first camera comprises a forward-viewing camera disposed in an interior cabin of the equipped vehicle at and behind a windshield of the equipped vehicle and viewing through the windshield to capture image data at least forward of the equipped vehicle; wherein the road has at least three lanes comprising (i) a first lane along which the vehicle is traveling, (ii) a second lane immediately adjacent the first lane, and (iii) a third lane immediately adjacent the second lane at the opposite side of the second lane from the first lane; a control disposed at or in the equipped vehicle and comprising a data processor; wherein said data processor comprises an image processor; wherein image data captured by said first camera is provided to said control and is processed by said image processor; wherein said data processor processes data at a processing speed of at least 30 MIPS; wherein, responsive to processing by said image processor of image data captured by said first camera, lane markers on the road being traveled along by the equipped vehicle are detected; wherein, responsive to processing by said image processor of image data captured by said first camera, said control determines the first, second and third lanes of the road being traveled by the equipped vehicle; a sensing system operable to detect other vehicles present exterior the equipped vehicle; wherein said sensing system comprises a first radar sensor disposed at the equipped vehicle and having a field of sensing exterior of the equipped vehicle as the equipped vehicle travels along the road; wherein radar data generated by said first radar sensor is provided to said control; wherein said control receives vehicle data relating to the equipped vehicle via a vehicle bus of the equipped vehicle; wherein said first radar sensor has a field of sensing at least to a side of and rearward of the equipped vehicle as the equipped vehicle travels along the road; wherein said sensing system further comprises a second camera disposed at the equipped vehicle and having a field of view at least rearward and sideward of the equipped vehicle as the equipped vehicle travels along the road; wherein image data captured by said second camera is provided to said control; wherein, responsive at least in part to processing at said data processor of radar data generated by said first radar sensor and of image data captured by said second camera, said control detects another vehicle present on the road being traveled along by the equipped vehicle that is approaching the equipped vehicle from rearward of the equipped vehicle; wherein said control is operable to detect a lane change maneuver of the equipped vehicle toward the second lane; wherein detection of the lane change maneuver by said control is based at least in part on vehicle steering data received at said control via said bus; wherein, responsive to said control detecting a lane change maneuver of the equipped vehicle, said control determines that the lane change can proceed without hazard of impact with the other vehicle approaching the equipped vehicle from rearward of the equipped vehicle; wherein, with the equipped vehicle traveling along the first lane, and responsive at least in 
part to processing at said data processor of radar data generated by said first radar sensor and of image data captured by said second camera, said control determines that the other vehicle is travelling in the second lane or in the third lane of the road along which the equipped vehicle is travelling; wherein, responsive to determination that the other vehicle is approaching the equipped vehicle and is traveling in the second lane, and responsive to said control detecting a lane change maneuver of the equipped vehicle toward the second lane, said control generates an alert that a lane change by the equipped vehicle from the first lane to the second lane is not safe; and wherein, responsive to determination that the other vehicle is approaching the equipped vehicle and is traveling in the third lane, and responsive to said control detecting a lane change maneuver of the equipped vehicle toward the second lane, said control determines that a lane change from the first lane to the second lane can proceed without hazard of impact with the other vehicle detected travelling in the third lane.

29. The driver assistance system of claim 28, wherein said vehicle bus system comprises a CAN bus of the equipped vehicle.

30. The driver assistance system of claim 28, wherein said control determines a position history of the other vehicle approaching the equipped vehicle from rearward of the equipped vehicle.

31. The driver assistance system of claim 28, wherein detection of the lane change maneuver by said control is irrespective of activation by a driver of the equipped vehicle of a turn signal indicator of the equipped vehicle.

32. The driver assistance system of claim 28, wherein said sensing system further comprises a second radar sensor disposed at the equipped vehicle and having a field of sensing exterior the equipped vehicle, and wherein image data captured by said second camera is provided to and processed at said control, and wherein radar data generated by said second radar sensor is provided to and processed at said control.

33. The driver assistance system of claim 32, wherein, responsive at least in part to processing of radar data generated by at least one of said first and second radar sensors of said sensing system and of image data captured by at least one of said first and second cameras of said vision system, said control determines a hazard of collision of the equipped vehicle with the other vehicle approaching the equipped vehicle from rearward of the equipped vehicle.

34. The driver assistance system of claim 33, wherein, responsive at least in part to said control determining said hazard of collision with the other vehicle approaching the equipped vehicle from rearward of the equipped vehicle, said control controls a system of the equipped vehicle to mitigate potential collision with the other vehicle approaching the equipped vehicle from rearward of the equipped vehicle.

35. The driver assistance system of claim 34, wherein radar data generated by at least one of said first and second radar sensors of said sensing system and image data captured by at least one of said first and second cameras of said vision system are fused at said control.

36. The driver assistance system of claim 34, wherein radar data generated by at least one of said first and second radar sensors of said sensing system and image data captured by said first camera of said vision system are fused at said control.

37. The driver assistance system of claim 34, wherein radar data generated by said first radar sensor of said sensing system and image data captured by at least one of said first and second cameras of said vision system are fused at said control.


DESCRIPTION




FIELD OF THE INVENTION



This invention relates to object detection adjacent a motor vehicle as it travels along a highway, and more particularly relates to imaging systems that view the blind spot adjacent a vehicle and/or that view the lane adjacent the side of a vehicle and/or view the lane behind or forward of the vehicle as it travels down a highway.


BACKGROUND OF THE INVENTION



Camera-based systems have been proposed, such as in commonly assigned patent application Ser. No. 09/372,915, filed Aug. 12, 1999, now U.S. Pat. No. 6,396,397, the disclosure of which is hereby incorporated herein by reference, that detect and display the presence, position of, distance to and rate of approach of vehicles, motorcycles, bicyclists, and the like, approaching a vehicle, such as from behind to overtake in a side lane. The image captured by such vehicular image capture systems can be displayed as a real-time image or by icons on a video screen, with distances, rates of approach and object identifiers displayed by indicia and/or overlays, such as is disclosed in U.S. Pat. Nos. 5,670,935; 5,949,331 and 6,222,447, the disclosures of which are hereby incorporated herein by reference. Such prior art systems work well. However, it is desirable for a vehicle driver to have visual access to the full 360 degrees surrounding the vehicle. It is not uncommon, however, for a vehicle driver to experience blind spots due to the design of the vehicle bodywork, windows and the rearview mirror system. A blind spot commonly exists between the field of view available to the driver through the exterior rearview mirror and the driver's peripheral limit of sight. Blind Spot Detection Systems (BSDS), in which a specified zone, or set of zones in the proximity of the vehicle, is monitored for the presence of other road users or hazardous objects, have been developed. A typical BSDS may monitor at least one zone approximately one traffic lane wide on the left- or right-hand side of the vehicle, and generally from the driver's position to approximately 10 m rearward. The objective of these systems is to provide the driver an indication of the presence of other road users located in the targeted blind spot.

Imaging systems have been developed in the prior art, such as discussed above, to perform this function, providing a visual, audio or tactile warning to the driver should a lane change or merge maneuver be attempted when another road user or hazard is detected within the monitored zone or zones. These systems are typically used in combination with a system of rearview mirrors in order to determine if a traffic condition suitable for a safe lane change maneuver exists. They are particularly effective when the detected object is moving at a low relative velocity with reference to the detecting vehicle, since the detected object may spend long periods of time in the blind spot and the driver may lose track of surrounding objects. However, prior art systems are inadequate in many driving conditions.

Known lane departure warning systems typically rely on visually detecting markers on the road on both sides of the vehicle for lane center determination. These markers must be fairly continuous or frequently occurring and generally must exist on both sides of the vehicle for the lane center position to be determined. Failure to detect a marker usually means failure of the departure-warning algorithm to adequately recognize a lane change event.


SUMMARY OF THE INVENTION



The present invention provides a Lane Change Aid (LCA) system wherein the driver of a motor vehicle traveling along a highway is warned if any unsafe lane change or merge maneuver is attempted, regardless of information available through the vehicle's rearview mirror system. The Lane Change Aid (LCA) system of the present invention extends the detection capability of the blind spot detection systems of the prior art.

A vehicle lane change aid system, according to an aspect of the invention, includes a detector that is operative to detect the presence of another vehicle adjacent the vehicle, an indicator for providing an indication that a lane change maneuver of the equipped vehicle may affect the other vehicle and a control receiving movement information of the equipped vehicle. The control develops a position history of the equipped vehicle at least as a function of the movement information. The control compares the detected presence of the other vehicle with the position history and provides the indication when a lane change maneuver may affect the other vehicle.
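To make that comparison concrete, the following is a minimal sketch of how such a control could accumulate the equipped vehicle's position history and test a detection against it. The class, the method names, the common road-aligned (x, y) frame and the 3.7 m lane width are illustrative assumptions, not details taken from the patent.

```python
from collections import deque

class LaneChangeAid:
    """Minimal sketch of the control described above (hypothetical API)."""

    def __init__(self, history_len=500, lane_width_m=3.7):
        # Recent positions of the equipped vehicle, in a road-aligned frame:
        # x along the direction of travel, y lateral.
        self.path = deque(maxlen=history_len)
        self.lane_width_m = lane_width_m

    def on_movement(self, x, y):
        """Accumulate the position history from vehicle movement information."""
        self.path.append((x, y))

    def lane_change_may_affect(self, other_x, other_y):
        """True when the detected vehicle sits roughly one lane width to the
        side of the path the equipped vehicle has recently traveled."""
        for px, py in self.path:
            near_longitudinally = abs(other_x - px) < 2.0
            lateral = abs(other_y - py)
            if near_longitudinally and 0.5 * self.lane_width_m < lateral < 1.5 * self.lane_width_m:
                return True  # adjacent lane occupied: provide the indication
        return False

aid = LaneChangeAid()
for x in range(20):
    aid.on_movement(float(x), 0.0)
print(aid.lane_change_may_affect(10.0, 3.7))  # vehicle in the adjacent lane -> True
```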

A vehicle lane change aid system, according to an aspect of the invention, includes an imaging device for capturing lane edge images and a control that is responsive to an output of the imaging device to recognize lane edge positions. The control is operable to distinguish between certain types of lane markers. The control may distinguish between dashed lane markers and non-dashed lane markers.
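As an illustration of how a control might make that distinction, the sketch below classifies a detected marker by the duty cycle of painted segments along the travel direction; the sampling representation and the 0.8 threshold are assumptions for illustration, not values from the patent.

```python
def classify_lane_marker(segment_mask, solid_duty_threshold=0.8):
    """Classify a lane marker as 'solid' or 'dashed'.

    segment_mask: list of booleans, one per longitudinal sample along the
    detected marker, True where paint was detected in that sample. A solid
    line is painted over nearly its whole length; a dashed line alternates
    paint and gaps, so its duty cycle is markedly lower.
    """
    if not segment_mask:
        return "unknown"
    duty_cycle = sum(segment_mask) / len(segment_mask)
    return "solid" if duty_cycle >= solid_duty_threshold else "dashed"

# Example: a 3 m dash / 9 m gap pattern sampled every meter.
print(classify_lane_marker([True] * 3 + [False] * 9))  # -> 'dashed'
print(classify_lane_marker([True] * 12))               # -> 'solid'
```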

A vehicle lane change aid system, according to an aspect of the invention, includes an imaging device for capturing lane edge images and a control that is responsive to an output of the imaging device to recognize lane edge positions. The control is operative to determine that the vehicle has departed a lane. The control may notify the driver that a lane has been departed. The control may further include oncoming vehicle monitoring and side object detection.

A vehicle lane change aid system, according to an aspect of the invention, includes a forward-facing imaging device for capturing images of other vehicles and a control that is responsive to an output of the imaging device to determine an imminent collision with another vehicle. The control may include a wireless transmission channel to transmit a safety warning to the other vehicle. The control may also activate a horn or headlights of the equipped vehicle to warn of an imminent collision.
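One common way to decide that a collision is imminent is a time-to-collision (TTC) test on the tracked vehicle. The sketch below is a generic illustration of that idea; the 2 s threshold and the actuation hook names are assumptions, not values or interfaces from the patent.

```python
def time_to_collision(range_m, closing_speed_mps):
    """Seconds until impact if the closing speed stays constant."""
    if closing_speed_mps <= 0:  # opening or holding range: no hazard
        return float("inf")
    return range_m / closing_speed_mps

def on_forward_track(range_m, closing_speed_mps, ttc_threshold_s=2.0):
    """React to a forward-tracked vehicle as the summary describes."""
    if time_to_collision(range_m, closing_speed_mps) < ttc_threshold_s:
        # Hypothetical hooks into the vehicle systems named in the text.
        return ["sound_horn", "flash_headlights", "transmit_wireless_warning"]
    return []

print(on_forward_track(range_m=30.0, closing_speed_mps=20.0))  # TTC = 1.5 s -> warn
```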

These and other objects, advantages and features of this invention will become apparent upon review of the following specification in conjunction with the drawings.


BRIEF DESCRIPTION OF THE DRAWINGS



FIGS. 1A-1C are top plan views illustrating a vehicle equipped with a lane change aid system, according to the invention, traveling a straight section of road;

FIG. 2 is a block diagram of a lane change aid system, according to the invention; and

FIG. 3 is a top plan view illustrating a vehicle equipped with a lane change aid system traveling a curved section of road.

DESCRIPTION OF THE PREFERRED EMBODIMENT

Referring to the drawings and the illustrative embodiments depicted therein, a Lane Change Aid (LCA) system 12 of the present invention as illustrated with a vehicle 10 includes a control 18 and an indicator and/or display system 16 that warns a vehicle operator if an intended, or attempted, lane change maneuver could cause an approaching rearward vehicle to brake and decelerate at an unsafe rate, or would otherwise constitute a highway hazard. In Lane Change Aid (LCA) system 12, the dimension, in the direction of travel, of a zone 20 to be monitored may be calculated based on an assumed maximum relative velocity between a detecting vehicle and an approaching rearward vehicle, and a safe braking and deceleration assumption. Depending on the assumptions made, the required detection zone may vary in length, such as extending rearward from 50 to 100 m, or more. At 100 m, the road curvature behind the vehicle may have a significant impact on the position of the lane of the detected vehicle, relative to the detecting vehicle. Since it is important to know which lane an approaching rearward vehicle is in, relative to the detecting vehicle, in order to provide the driver an appropriate warning, and to avoid many false warnings, the Lane Change Aid (LCA) system 12 develops and maintains a lane position history 20 for the space rearward of the detecting vehicle.
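The zone-length calculation follows from elementary kinematics: to avoid forcing an approaching vehicle to decelerate harder than some safe rate a, it must be detected at least v_rel·t_react + v_rel²/(2a) behind the equipped vehicle. A short worked example, with the closing speed, deceleration and reaction time as illustrative assumptions:

```python
def required_zone_length(v_rel_max_mps, a_safe_mps2, reaction_time_s=1.0):
    """Rearward detection range so an approaching vehicle never needs to
    brake harder than a_safe: reaction distance plus braking distance."""
    return v_rel_max_mps * reaction_time_s + v_rel_max_mps ** 2 / (2 * a_safe_mps2)

# E.g., 30 m/s (~108 km/h) maximum closing speed, 5 m/s^2 comfortable braking:
print(round(required_zone_length(30.0, 5.0)))  # -> 120 m, on the order of 100 m
```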

By combining distance traveled with steering angle, the detecting vehicle path may be plotted. Details of the last approximately 100 m traveled are of value for lane change aids and may be stored by the Lane Change Aid (LCA) system. Data may be stored by several methods including the method described below.

Vehicle speed information in the Lane Change Aid (LCA) system 12 is typically derived from a wheel rotation sensor signal 24, which consists of a number of pulses, n, per revolution of the road wheel, and is available on a vehicle data bus 26, such as a CAN or LIN bus, or the like. Sensing and signal detail may vary depending on vehicle design, but for any particular design, a distance, d, traveled between pulses can be established. Also, as each pulse is detected, the current value of the steering angle, ±α, determined by a steering angle encoder 22 may be read from vehicle data bus 26. Again, the sensing and signal detail may vary depending on vehicle design, but, for any particular vehicle design, an effective turning radius, r, for the vehicle can be established.
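For a wheel of rolling circumference C producing n pulses per revolution, the distance per pulse is simply d = C/n. The sketch below shows that arithmetic together with a per-pulse heading change estimated from the steering angle via a bicycle-model turning radius r = wheelbase/tan(α); the wheel size, pulse count and wheelbase are illustrative assumptions.

```python
import math

WHEEL_CIRCUMFERENCE_M = 2.0  # rolling circumference C (assumed)
PULSES_PER_REV = 48          # n pulses per wheel revolution (assumed)
WHEELBASE_M = 2.8            # for the turning-radius estimate (assumed)

D_PER_PULSE_M = WHEEL_CIRCUMFERENCE_M / PULSES_PER_REV  # d, distance per pulse

def heading_change_per_pulse(steer_angle_rad):
    """Heading change over one pulse from a simple bicycle model:
    turning radius r = wheelbase / tan(alpha), arc angle = d / r."""
    if abs(steer_angle_rad) < 1e-6:
        return 0.0
    r = WHEELBASE_M / math.tan(steer_angle_rad)
    return D_PER_PULSE_M / r

print(D_PER_PULSE_M)                              # ~0.042 m per pulse
print(heading_change_per_pulse(math.radians(5)))  # small per-pulse turn angle
```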

Image-based blind spot detection devices and lane change aids, generally shown at 14, are but two of a variety of sensing technologies and devices suitable for the purpose of monitoring the local environment in which a vehicle operates. Radar, infrared, sonar, and laser devices are all capable of interrogating the local environment for the presence of other road users or obstacles to be avoided. GPS systems can accurately determine the vehicle position on the earth's surface, and map data can provide detailed information of a mobile local environment. Other wireless communication systems 28, such as short-range wireless communication protocols like BLUETOOTH, can provide information such as the position of road works, lane restrictions, or other hazards, which can be translated by on-board vehicle electronics into position data relative to the vehicle position. Lane Change Aid (LCA) system 12 may integrate all the available information from a multiplicity of sensors, including non-image-based detectors 14b, such as a radar sensor (for example, a Doppler radar sensor), and at least one image-based detector 14a, such as a CMOS video camera imaging sensor, and convert the various sensor outputs into a single database with a common format, so that data from various sources, such as a Doppler radar source and a video camera source, may be easily compared, combined and maintained.
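One way to realize such a single database with a common format is a record type that every detector output is converted into, regardless of source. The fields below are a plausible minimal set chosen for illustration, not a format specified by the patent.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Common-format entry for the fused local-environment database.
    Position is polar in the vehicle frame defined below (range in meters,
    bearing in degrees), matching the 'sphere of awareness' convention."""
    source: str            # 'camera', 'doppler_radar', 'gps_map', ...
    range_m: float
    bearing_deg: float
    range_rate_mps: float  # closing speed; radar measures this directly
    timestamp_s: float

# A radar return and a camera track of the same object become comparable:
radar = Detection("doppler_radar", 62.0, 175.0, -8.5, 1234.00)
camera = Detection("camera", 60.5, 176.0, -8.0, 1234.02)
```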

Consider a spherical space of radius R, and center (x, y, z) = (0, 0, 0) in Cartesian coordinates or (r, θ, β) = (0, 0, 0) in polar coordinates. It is convenient to describe the space in both coordinate systems since several operations will be used to fill the data space and to maintain it and a choice of systems allows for efficient computation methods. Let the center of this space (0, 0, 0) be at the center of the vehicle's rear axle, or nominal rear axle described by the line which passes through the center of the two rear non-steering wheels. Let the horizontal centerline of the vehicle, in the primary direction of travel, lie on (x, 0, 0), such that positive x values describe the space forward of the center of the vehicle's rear axle. Let the rear axle coincide with (0, y, 0), such that positive values of y describe the space to the right of the vehicle centerline when looking forward. (R, 90, 0) describes the positive y axis. Let positive z values describe the space above the centerline of the rear axle. (R, 0, 90) describes the positive z axis. This "sphere of awareness" 20 moves with the vehicle as it moves through space and provides a common frame of reference for all sensed or otherwise derived data concerning the vehicle's local environment.

For the purpose of storing vehicle path data, which may be used to improve the performance of lane change aid 12, the discussion may be simplified by considering only the horizontal plane. The use of polar coordinates simplifies operations used in this application. The first data point, as the vehicle starts with no history, is at point (0, 0). The steering angle is read from the data bus and stored as α₀. When wheel rotation pulse p₁ is detected, steering angle α₁ is recorded. Since the distance traveled between wheel pulses is known to be d, a new position for the previous data point can be calculated as (√(2(1−cos α₀)), 180+α₀). This point is stored and recorded as historical vehicle path data. When pulse p₂ is detected, the above calculation is repeated to yield (√(2(1−cos α₁)), 180+α₁) as the new position for the previous data point. This requires the repositioning of the original data point to (√(2(1−cos α₀)) + √(2(1−cos α₁)), (180+α₀)+α₁). This process is continued until the distance from the vehicle, R, reaches the maximum required value, such as 100 m in the case of a lane change aid; data beyond this point is discarded. Thus, a continuous record of the vehicle path for the last 100 m, or whatever distance is used, may be maintained. By maintaining a running record of the path traveled, rearward approaching vehicles detected by a lane change aid image analysis system may be positioned relative to that path, as can be seen by comparing the other vehicle 40 in FIGS. 1B and 1C. In FIG. 1B, other vehicle 40 is overlapping zone 20, so an indication of potential conflict may be delayed or discarded. In FIG. 1C, other vehicle 40 is moving outside of zone 20 and in a blind spot of vehicle 10, so an indication of potential conflict would be given to the driver with indicator 16. Thus, a determination may be made whether the approaching vehicle is in the same, adjacent or next-but-one lane, etc. By this means, the number of inappropriate or unnecessary warnings may be reduced.
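A runnable sketch of this bookkeeping is given below. Note that it uses a standard planar dead-reckoning update (shift the stored points back by d, then rotate them by the per-pulse heading change from a bicycle model) rather than the exact incremental expressions above; the wheelbase value and the sign conventions are assumptions.

    # Path-history maintenance sketch: points are kept in the current
    # vehicle frame (x forward, y right) and re-expressed at each wheel
    # pulse; points farther than R_MAX are discarded.
    import math
    from collections import deque

    R_MAX_M = 100.0    # history horizon for a lane change aid
    WHEELBASE_M = 2.8  # assumed; relates steering angle to turning radius

    def update_history(history: deque, d_m: float, steer_deg: float) -> None:
        # Per-pulse heading change from a bicycle model (sign depends on
        # the steering encoder's polarity).
        dpsi = d_m * math.tan(math.radians(steer_deg)) / WHEELBASE_M
        c, s = math.cos(dpsi), math.sin(dpsi)
        moved = []
        for x, y in history:
            xs, ys = x - d_m, y            # shift back by the distance traveled
            moved.append((c * xs + s * ys,  # rotate into the new vehicle frame
                          -s * xs + c * ys))
        history.clear()
        history.extend(p for p in moved if math.hypot(*p) <= R_MAX_M)
        history.append((0.0, 0.0))          # the vehicle's position at this pulse

Each rearward target reported by the image analysis system can then be compared against the stored points to decide whether it lies on the recently traveled path (same lane) or laterally offset from it (adjacent or next-but-one lane).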

Lane change aid system 12 may include a controller, such as a microprocessor including a digital signal processor microcomputer with a CPU speed of at least about 5 MIPS, more preferably at least about 12 MIPS, and most preferably at least about 30 MIPS, that processes inputs from multiple cameras 14a and other sensors 14b. The controller includes a vehicle path history function whereby, for example, an object, such as a rear-approaching car, motorcycle, truck or the like, is selected and its presence highlighted to the driver's attention, such as by icons on a dashboard or interior mirror-mounted display, based on the recent history of the side and rear lanes in which the host vehicle equipped with the controller of this invention has recently traveled, for example over a previous interval of about 60 seconds or less, or over a longer period such as about 3 minutes or more. The vehicle path history function works to determine the lane positioning of an approaching other vehicle, and whether the host vehicle is traveling on, or has recently traveled on, a straight road as illustrated in FIGS. 1A, 1B and 1C, or a curved road portion as illustrated in FIG. 3.

Control 18 may comprise a central video processor module such as is disclosed in commonly assigned provisional patent application Ser. No. 60/309,023, filed Jul. 31, 2001, and utility patent application filed concurrently herewith, now U.S. patent application Ser. No. 10/209,181, filed Jul. 31, 2002, and published Feb. 6, 2003 as U.S. Publication No. US 2003/0025793, the disclosures of which are hereby incorporated herein by reference. Such a video processor module operates to receive multiple image outputs from vehicle-mounted cameras, such as disclosed in commonly assigned patent application Ser. No. 09/793,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, the disclosure of which is hereby incorporated herein by reference, and integrates these in a central processing module to allow reaction to the local vehicle environment. Optionally, when bandwidth limitations limit the ability to send raw image data, particularly high-resolution images, from a remote camera to a central processing unit, and robust transmission means, such as a fiber-optic cable or a high-density wireless link, are not available, distributed processing can occur, at least locally to some of the image capture sensors. In such an at least partially distributed processing environment, the local processors are adapted to preprocess images captured by the local camera or cameras, and data from any other device, such as a Doppler radar sensor viewing a blind spot in an adjacent side lane, to format this preprocessed data into a standard format, and to transmit it. The data can be transmitted via a wired network or a wireless network, or over a vehicle bus system, such as a CAN bus and/or a LIN bus or the like, to the central processor for effective, centralized mapping and combination of the total local environment around the vehicle. This provides the driver with a display of what is happening in both the right and the left side lanes, and in the lane in which the host vehicle itself is traveling.
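As a sketch of that standard-format hand-off (the packing layout below is an illustrative assumption, not a defined bus specification), a local node might serialize its preprocessed detections like this:

    # A local node serializes preprocessed detections into one fixed-layout
    # record for the central processor; the layout is an assumption.
    import struct

    def pack_detection(node_id: int, r_m: float, theta_deg: float, kind: int) -> bytes:
        """Record: source node id, range, bearing, coarse object class."""
        return struct.pack("<BffB", node_id, r_m, theta_deg, kind)

    def unpack_detection(payload: bytes):
        return struct.unpack("<BffB", payload)

    # e.g. a side-mirror node reporting an overtaking car at 12 m, 110 degrees:
    msg = pack_detection(node_id=3, r_m=12.0, theta_deg=110.0, kind=1)
    assert unpack_detection(msg)[1] == 12.0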

In this regard, the vehicle can be provided with a dedicated bus and central processor, as described above, for providing a vehicle environment awareness. This awareness can be internal, such as might be provided by interior cabin or trunk monitors/sensors that determine occupant presence, head position and/or movement, eye movement, air bag deployment, microphone aiming, seat positioning, air conditioning and/or heating targeting, audio controls and the like, or can be external to the vehicle, such as blind spot detecting or lane change detecting. The present invention includes provision of an automatic environment awareness function that comprises automatic collection of sensor-derived data and its transmission in a standard format via a vehicle bus network. The data includes data relating to the vehicle environment, such as the exterior environment, for example, the presence of rear-approaching traffic in side and rear lanes to the host vehicle, as captured by rear-facing CMOS or CCD cameras on the side of the host vehicle, such as included in a side view mirror assembly on either or both sides of the host vehicle, and/or as detected by a rear lane/side lane-viewing Doppler radar sensor, and the function preferably includes processing in a central video processing unit.

The information relating to the external environment can be relayed/displayed to the driver in a variety of ways. For example, a blind-spot vehicle-presence indication can be displayed adjacent the exterior mirror assembly, such as inside the vehicle cabin local to where the exterior mirror assembly is attached to the vehicle door, so that the indicator display used, typically a flashing LED light source or the like, is visible to the driver but not visible to traffic/drivers exterior to the vehicle, and is cognitively associated with the side of the vehicle to which that particular exterior mirror is attached, as disclosed in commonly assigned U.S. Pat. Nos. 5,786,772; 5,929,786 and 6,198,409, the disclosures of which are hereby incorporated herein by reference. Optionally, a vibration transducer can be included in the steering wheel that trembles or otherwise vibrates to tactilely warn the driver of the presence of an overtaking vehicle in a side lane into which the driver is steering, where an overtaking or following vehicle may constitute a collision hazard. Hazard warnings can also be communicated to the driver by voice commands and/or audible warnings, and/or by heads-up displays. The coordinate scheme for data collection of the present invention enables an improved blind spot and/or lane change detection system for vehicles, particularly in busy traffic on a winding, curved road.

The present invention includes the fusion of outputs from video and non-video sensors, such as, for example, a CMOS video camera sensor and a Doppler radar sensor, to allow all-weather and visibility side object detection. The present invention can be utilized in a variety of applications such as disclosed in commonly assigned U.S. Pat. Nos. 5,670,935; 5,949,331; 6,222,447; 6,201,642; 6,097,023; 5,715,093; 5,796,094 and 5,877,897 and commonly assigned patent application Ser. No. 09/793,002 filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268, Ser. No. 09/372,915, filed Aug. 12, 1999, now U.S. Pat. No. 6,396,397, Ser. No. 09/767,939, filed Jan. 23, 2001, now U.S. Pat. No. 6,590,719, Ser. No. 09/776,625, filed Feb. 5, 2001, now U.S. Pat. No. 6,611,202, Ser. No. 09/799,993, filed Mar. 6, 2001, now U.S. Pat. No. 6,538,827, Ser. No. 09/493,522, filed Jan. 28, 2000, now U.S. Pat. No. 6,426,492, Ser. No. 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610, Ser. No. 08/952,026, filed Nov. 19, 1997, now U.S. Pat. No. 6,498,620, Ser. No. 09/227,344, filed Jan. 8, 1999, now U.S. Pat. No. 6,302,545, International Publication No. WO 96/38319, published Dec. 5, 1996, and International Publication No. WO 99/23828, published May 14, 1999, the disclosures of which are collectively incorporated herein by reference.

Lane change aid system 12 may include a lane marker type recognition algorithm, or capability 32. Lane marker type recognition capability 32 classifies lane markers as one of several specific types in order to interpret the original purpose of the lane marker and to issue reliable and meaningful warnings based on this interpretation. For example, a double line on the left side of a left-hand drive vehicle typically indicates a no-encroachment or no-passing zone. A solid line with an adjacent dashed line indicates either an ability to pass safely, if the dashed line is on the near side of the solid line, or a do-not-encroach zone, if the dashed line is on the far side of the solid line. Road edges can be distinctly recognized and classified as no-encroachment zones. Conversely, dashed lines may have no significance to lane departure warning algorithms, since they merely indicate lane edge positions; recognizing dashed lines as such gives the ability to avoid nuisance warnings. The recognition algorithm can further be enhanced by recognizing road features when lane markers are too weak or missing. Features such as curbs, road seams, grease or rubber slicks, road signs, and vehicles in the same, neighboring and/or opposing lanes, when recognized, can be used to interpret lane-vehicle positioning and issue intelligent warning alerts to the driver. Fewer false or nuisance-type warnings, with improved real warning functionality and speed, can be realized with this improvement, and operation under difficult lighting and environmental conditions can be extended.
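For illustration, the marker-type-to-warning mapping described above might be expressed as a rule table such as the following sketch; the type names are mine, not terms from this disclosure.

    # Rule table mirroring the marker examples above; names are illustrative.
    from enum import Enum, auto

    class MarkerType(Enum):
        DOUBLE_LINE = auto()           # no-encroachment / no-passing zone
        SOLID_WITH_NEAR_DASH = auto()  # dashed line on near side: passing allowed
        SOLID_WITH_FAR_DASH = auto()   # dashed line on far side: do not encroach
        ROAD_EDGE = auto()             # classified as a no-encroachment zone
        DASHED = auto()                # lane edge only; no departure warning

    def should_warn_on_crossing(marker: MarkerType) -> bool:
        """True when crossing the marker warrants a lane departure warning."""
        return marker in (MarkerType.DOUBLE_LINE,
                          MarkerType.SOLID_WITH_FAR_DASH,
                          MarkerType.ROAD_EDGE)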

Note that collision avoidance functionality 34 can optionally be achieved using a forward-facing camera 14a in the present invention. For example, should the forward-looking camera detect an oncoming car likely to collide with the vehicle equipped with the present invention, or should another vehicle try to pull in front of it, the system of the present invention can issue a warning (visual and/or audible) to one or both drivers involved. Such warning can include flashing the headlights and/or sounding the car horn. Similarly, the system can detect that the driver of the vehicle equipped with the present invention is failing to recognize a stop sign and/or a signal light, or some other warning sign, and the driver can be warned visually, such as with a warning light at the interior mirror in the vehicle cabin, or audibly, such as via a warning beeper, or tactilely, such as via a rumble/vibration transducer that vibrates the steering wheel to alert the driver of a potential hazard.

System 12 may also include a lane departure warning algorithm, or system 36. For example, when a left-hand drive vehicle 10 equipped with system 12 is making a left-hand turn generally across a line on the road, system 36 can monitor for a lane crossing and combine that detection with detection of an oncoming vehicle. System 12 may also calculate the closing speed of approaching vehicles in order to warn of a potential impact.
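The closing-speed calculation can be as simple as differencing successive range measurements, as in the following sketch (the function names are illustrative):

    # Closing speed from successive range measurements (radar or image based).
    def closing_speed_mps(range_now_m, range_prev_m, dt_s):
        """Positive when the other vehicle is closing on the host vehicle."""
        return (range_prev_m - range_now_m) / dt_s

    def time_to_impact_s(range_m, closing_mps):
        return range_m / closing_mps if closing_mps > 0 else float("inf")

    # e.g. 50 m shrinking to 48.5 m over 0.1 s: closing at 15 m/s,
    # with impact in roughly 48.5 / 15 = 3.2 s.
    assert abs(closing_speed_mps(48.5, 50.0, 0.1) - 15.0) < 1e-9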

Also, the vehicle can be provided on its front fender, or elsewhere at the front of the vehicle, with a side-looking camera as an image-based detector 14a operable to warn the driver when he/she is making a left turn across lanes of traffic coming from his/her left (left-side warning) and then again when he/she is about to enter traffic lanes with traffic coming from his/her right (right-side warning). While the turn is being executed, the system of the present invention may utilize the detection of the lane markers when the driver's car is about to enter the specific lane, combined with oncoming vehicle detection, as a means of predictive warning before the vehicle actually enters the danger zone.

System 12 is also capable of performing one or more vehicle functions 30. For example, should the lane departure warning system 36 detect that the vehicle equipped with the system is intending to make or is making a lane change and the driver has neglected to turn on the appropriate turn signal indicators, then the system performs a vehicle function 30 of automatically turning on the turn signals on the appropriate side of the vehicle.

The lane departure warning system 36 of the present invention is operable to differentiate between solid lines, dashed lines and double lines on the road being traveled. Also, should the vehicle be equipped with a side object detection (SOD) system, such as a Doppler radar unit or a camera vision side object detection system, that detects the presence of overtaking vehicles in the adjacent side lane, then the SOD system can work in conjunction with the lane departure warning system such that, when the lane departure system detects that the driver is making a lane change into a side lane and the SOD system detects an overtaking vehicle in that same side lane, the driver is alerted and warned of the possible hazard, such as by a visual, audible and/or tactile alert.
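A minimal sketch of that combination, assuming each subsystem reduces its output to a simple per-side flag (the interfaces are hypothetical):

    # Combined lane-departure / side-object-detection alert decision.
    def lane_change_hazard(departing_into, sod_vehicle_in_lane):
        """departing_into: 'left', 'right' or None, from system 36;
        sod_vehicle_in_lane: side -> bool, from the SOD system."""
        return bool(departing_into) and sod_vehicle_in_lane.get(departing_into, False)

    assert lane_change_hazard("left", {"left": True, "right": False})
    assert not lane_change_hazard(None, {"left": True})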

As indicated above, the forward-facing camera can include stoplight or sign detection, and the system can further broadcast, via wireless communication system 28, on a safety warning band when the forward-facing camera detects a stoplight or sign and determines that the vehicle is not going to stop based on current speed and deceleration. This would warn crossing drivers of the unsafe condition. Such alerts can dynamically vary depending on road surface conditions (wet, snow, ice, etc.) as visually detected and determined by the forward-facing, road-monitoring camera. For example, wet or snowy roads would change the distance and/or speed at which the system warns, based on camera vision recognition of stoplights and/or stop signs. If a stoplight changes as the vehicle approaches, or the vehicle does not slow down for the light after the driver has been warned, the system can sound the horn and/or flash the lights to warn vehicles at the stoplight of the oncoming vehicle. The car may also broadcast one of the safety alerts that radar detectors pick up.
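As a worked example of such surface-dependent thresholds (the friction values and reaction time are illustrative assumptions, not figures from this disclosure), a warning distance might be computed as reaction distance plus the braking distance v²/(2μg):

    # Surface-dependent warning distance: reaction distance plus braking
    # distance v^2 / (2 * mu * g); friction values are assumed.
    G_MPS2 = 9.81
    MU = {"dry": 0.8, "wet": 0.5, "snow": 0.25, "ice": 0.1}
    REACTION_TIME_S = 1.5

    def warn_distance_m(speed_mps: float, surface: str) -> float:
        braking = speed_mps ** 2 / (2 * MU[surface] * G_MPS2)
        return speed_mps * REACTION_TIME_S + braking

    # At 20 m/s (about 72 km/h): ~55 m on a dry road versus ~112 m on snow,
    # so the stoplight warning fires much earlier on a slippery surface.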

Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.

* * * * *

