

United States Patent No.

10118618

Inventor(s)

Pawlicki et al.

Date of Patent

November 6, 2018


Vehicular control system using cameras and radar sensor



ABSTRACT

A control system for a vehicle includes a plurality of cameras, at least one radar sensor, and a control having at least one processor. Captured image data and sensed radar data are provided to the control. The control processes captured image data to detect objects present exteriorly of the vehicle and is operable to determine whether a detected edge constitutes a portion of a vehicle. The control processes sensed radar data to detect objects present exteriorly of the vehicle. The control, based at least in part on processing of (i) captured image data and/or (ii) sensed radar data, detects another vehicle and determines distance from the equipped vehicle to the detected other vehicle. The control, based at least in part on determination of distance from the equipped vehicle to the detected other vehicle, may control a steering system operable to adjust a steering direction of the equipped vehicle.
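For illustration only, the following is a minimal sketch (Python, with hypothetical names such as EdgeSegment, RadarReturn, fuse_distance, and steering_adjustment) of the control flow the abstract describes: camera-derived edges classified as portions of a vehicle, radar returns supplying range, and a steering adjustment gated on the fused distance. It is a simplified reading of the abstract, not the patented implementation.

```python
# Minimal illustrative sketch of the camera/radar control flow described in
# the abstract. All names, thresholds, and the fixed steering nudge are
# hypothetical placeholders, not the patented implementation.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class EdgeSegment:
    x: float           # lateral position of the edge in the image (pixels)
    length_px: float   # edge length in pixels


@dataclass
class RadarReturn:
    range_m: float     # measured distance to the reflecting object (meters)
    azimuth_deg: float # bearing of the return relative to the vehicle axis


def edge_is_vehicle(edge: EdgeSegment, min_length_px: float = 40.0) -> bool:
    # Placeholder test for "whether a detected edge constitutes a portion of
    # a vehicle": treat sufficiently long edges as vehicle candidates.
    return edge.length_px >= min_length_px


def fuse_distance(image_edges: List[EdgeSegment],
                  radar_returns: List[RadarReturn]) -> Optional[float]:
    # Only report a distance when the camera confirms a vehicle ahead and the
    # radar provides at least one return; use the closest return as the
    # distance from the equipped vehicle to the detected other vehicle.
    if not any(edge_is_vehicle(e) for e in image_edges):
        return None
    if not radar_returns:
        return None
    return min(r.range_m for r in radar_returns)


def steering_adjustment(distance_m: Optional[float],
                        safe_distance_m: float = 30.0) -> float:
    # Return a small steering-direction adjustment (radians) only when a
    # detected vehicle is closer than the chosen safe following distance.
    if distance_m is None or distance_m >= safe_distance_m:
        return 0.0
    return 0.05  # placeholder nudge toward an open adjacent lane


if __name__ == "__main__":
    edges = [EdgeSegment(x=120.0, length_px=55.0)]
    radar = [RadarReturn(range_m=22.5, azimuth_deg=1.2)]
    d = fuse_distance(edges, radar)
    print("distance (m):", d, "steering adjustment (rad):", steering_adjustment(d))
```

The 40-pixel and 30-meter thresholds and the 0.05 rad nudge are arbitrary values chosen only to make the sketch executable.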


Inventors:

John A. Pawlicki (Warren, MI), Martha A. McMahon (Ann Arbor, MI), Steven G. Chinn (Rochester Hills, MI), Joel S. Gibson (Linden, MI)

Assignee:

Name | City | State | Country | Type
Magna Electronics Inc. | Auburn Hills | MI | US |

Applicant:

MAGNA ELECTRONICS INC. (Auburn Hills, MI)

Family ID

29406802

Appl. No.:

15/830,114

Filed:

December 4, 2017

Prior Publication Data

Document Identifier | Publication Date
US 20180105176 A1 | Apr 19, 2018

Related U.S. Patent Documents


Application Number | Filing Date | Patent Number | Issue Date
15413462 | Jan 24, 2017 | 9834216 |
15155350 | Jan 31, 2017 | 9555803 |
14922640 | May 9, 2017 | 9643605 |
14195137 | Oct 27, 2015 | 9171217 |
13651659 | Mar 4, 2014 | 8665079 |
12559856 | Oct 16, 2012 | 8289142 |
12329029 | Mar 16, 2010 | 7679498 |
11408776 | Dec 9, 2008 | 7463138 |
10427051 | May 2, 2006 | 7038577 |
60433700 | Dec 16, 2002 | |
60377524 | May 3, 2002 | |

Current U.S. Class:

1/1

Current CPC Class:

B60W 30/18 (20130101); B60Q 1/525 (20130101); B60Q 9/008 (20130101); B60T 7/22 (20130101); G01S 11/12 (20130101); G06K 9/00791 (20130101); G06K 9/00798 (20130101); G08G 1/16 (20130101); G06K 9/00825 (20130101); G08G 1/167 (20130101); G06K 9/52 (20130101); G06T 7/60 (20130101); G06K 9/00818 (20130101); G06K 9/00805 (20130101); B60W 30/16 (20130101); B60W 30/143 (20130101); B60W 30/09 (20130101); B60W 10/20 (20130101); B60W 10/04 (20130101); B60R 11/04 (20130101); G06K 9/4604 (20130101); B60W 30/12 (20130101); B62D 15/025 (20130101); H04N 5/247 (20130101); G06T 7/73 (20170101); G06T 7/13 (20170101); B60W 50/14 (20130101); G06T 7/174 (20170101); G01S 13/931 (20130101); G01S 13/867 (20130101); B60K 31/0008 (20130101); B60K 31/00 (20130101); B60T 2201/08 (20130101); B60T 2201/082 (20130101); B60T 2201/089 (20130101); B60T 2210/34 (20130101); B60W 2550/12 (20130101); B60W 2550/306 (20130101); G06T 2207/30244 (20130101); G06T 2207/30256 (20130101); B60W 2720/10 (20130101); B60W 2710/20 (20130101); B60W 2550/308 (20130101); B60W 2420/42 (20130101); B60W 2550/406 (20130101); B60W 2550/30 (20130101); B60W 2420/52 (20130101); B60W 2050/143 (20130101); G06T 2207/30261 (20130101); B60R 2011/004 (20130101); G06K 2209/27 (20130101)

Current International Class (IPC):

B60Q 1/00 (20060101); G08G 1/16 (20060101); B60R 11/04 (20060101); B60W 10/04 (20060101); B60W 10/20 (20060101); B60W 30/09 (20120101); B60W 30/14 (20060101); B60W 30/16 (20120101); G06K 9/46 (20060101); G06K 9/52 (20060101); G06T 7/60 (20170101); B60W 30/12 (20060101); B62D 15/02 (20060101); H04N 5/247 (20060101); G06T 7/73 (20170101); G06T 7/13 (20170101); G06T 7/174 (20170101); B60W 50/14 (20120101); G01S 13/93 (20060101); B60Q 9/00 (20060101); B60T 7/22 (20060101); G06K 9/00 (20060101); B60Q 1/52 (20060101); B60K 31/00 (20060101); B60W 30/18 (20120101); B60R 11/00 (20060101)

Field of Search:

;340/435,425.5,901,461,438,937,935,938,436 ;359/601,604,603,265 ;348/148,151

References Cited


U.S. Patent Documents

1472509October 1923Bitter
2074251March 1937Braun
2148119February 1939Grist
2240843May 1941Gillespie
2317400April 1943Paulus et al.
2331144October 1943Sitter
2339291January 1944Paulus et al.
2424288July 1947Severy
2598420May 1952Onksen, Jr. et al.
2632040March 1953Rabinow
2827594March 1953Rabinow
2750583June 1956McCullough
2762932September 1956Falge
2907920October 1956McIlvane
2855523October 1958Berger
2856146October 1958Lehder
2863064December 1958Rabinow
2892094June 1959Lehovec
2912593November 1959Deuth
2934676April 1960Miller
2959709November 1960Vanaman et al.
3008532November 1961Reed
3011580December 1961Reid
3069654December 1962Hough
3085646April 1963Paufve
3158835November 1964Hipkins
3172496March 1965Rabinow et al.
3179845April 1965Kulwiec
3201750August 1965Morin
3208070September 1965Boicey
3249761May 1966Baumanns
3271577September 1966Miller et al.
3325680June 1967Amacher
3367616February 1968Bausch et al.
3411843November 1968Moller
3486066December 1969Jones et al.
3515472June 1970Schwitzgebel
3572428March 1971Monaco
3623671November 1971Hargroves
3673560June 1972Barsh et al.
3680951August 1972Jordan et al.
3689695September 1972Rosenfield et al.
3708668January 1973Tilley
3751711August 1973Schick
3845572November 1974McCanney
3876940April 1975Wickord et al.
3971065July 1976Bayer
3985424October 1976Steinacher
3986022October 1976Hyatt
4003445January 1977De Bruine
4037134July 1977Loper
4044853August 1977Melke
4049961September 1977Marcy
4058796November 1977Oishi et al.
4093364June 1978Miller
4127778November 1978Leitz
4139801February 1979Linares
4143264March 1979Gilbert et al.
4176728December 1979Otteblad et al.
4200361April 1980Malvano et al.
4209853June 1980Hyatt
4214266July 1980Myers
4218698August 1980Bart et al.
4236099November 1980Rosenblum
4238778December 1980Ohsumi
4243196January 1981Toda et al.
4247870January 1981Gabel et al.
4249160February 1981Chilvers
4254931March 1981Aikens
4257703March 1981Goodrich
4266856May 1981Wainwright
4277804July 1981Robison
4278142July 1981Kono
4281898August 1981Ochiai
4288814September 1981Talley et al.
RE30835December 1981Giglia
4348652September 1982Barnes et al.
4348653September 1982Tsuzuki et al.
4355271October 1982Noack
4357558November 1982Massoni et al.
4357594November 1982Ehrlich et al.
4381888May 1983Momiyama
4389537June 1983Tsunoda et al.
4389639June 1983Torii et al.
4390742June 1983Wideman
4390895June 1983Sato et al.
4401181August 1983Schwarz
4403208September 1983Hodgson et al.
4420238December 1983Felix
4431896February 1984Lodetti
4441125April 1984Parkinson
4443057April 1984Bauer et al.
4460831July 1984Oettinger et al.
4464789August 1984Sternberg
4471228September 1984Nishizawa et al.
4481450November 1984Watanabe et al.
4483011November 1984Brown
4485402November 1984Searby
4491390January 1985Tong-Shen
4495589January 1985Hirzel
4512637April 1985Ballmer
4521804June 1985Bendell
4529275July 1985Ballmer
4529873July 1985Ballmer et al.
4532550July 1985Bendell et al.
4538181August 1985Taylor
4546551October 1985Franks
4549208October 1985Kamejima et al.
4564833January 1986Seko et al.
4566032January 1986Hirooka et al.
4571082February 1986Downs
4572619February 1986Reininger et al.
4580875April 1986Bechtel et al.
4587522May 1986Warren
4588041May 1986Tsuchuhashi
4599544July 1986Martin
4600913July 1986Caine
4601053July 1986Grumet
4603946August 1986Kato et al.
4614415September 1986Hyatt
4620141October 1986McCumber
4623222November 1986Ito et al.
4625329November 1986Ishikawa et al.
4626850December 1986Chey
4629941December 1986Ellis et al.
4630109December 1986Barton
4632509December 1986Ohmi et al.
4638287January 1987Umebayashi et al.
4645320February 1987Muelling et al.
4645975February 1987Meitzler et al.
4647161March 1987Muller
4647975March 1987Alston et al.
4653316March 1987Fukuhara
4665321May 1987Chang et al.
4669825June 1987Itoh et al.
4671614June 1987Catalano
4671615June 1987Fukada et al.
4672457June 1987Hyatt
4676601June 1987Itoh et al.
4679077July 1987Yuasa et al.
4681431July 1987Sims et al.
4688085August 1987Imaide
4690508September 1987Jacob
4692798September 1987Seko et al.
4693788September 1987Berg et al.
4697883October 1987Suzuki et al.
4699484October 1987Howell et al.
4701022October 1987Jacob
4701613October 1987Watanbe et al.
4713685December 1987Nishimura et al.
4717830January 1988Botts
4727290February 1988Smith et al.
4728804March 1988Norsworthy
4731669March 1988Hayashi et al.
4731769March 1988Schaefer et al.
4741603May 1988Miyagi et al.
4755664July 1988Holmes et al.
4758883July 1988Kawahara et al.
4768135August 1988Kretschmer et al.
4772942September 1988Tuck
4779095October 1988Guerreri
4785280November 1988Fubini et al.
4789904December 1988Peterson
4793690December 1988Gahan et al.
4799267January 1989Kamejima et al.
4805015February 1989Copeland
4816828March 1989Feher
4817948April 1989Simonelli
4820933April 1989Hong
4825232April 1989Howdle
4833469May 1989David
4833534May 1989Paff et al.
4838650June 1989Stewart et al.
4839749June 1989Franklin
4841348June 1989Shizukuishi et al.
4843463June 1989Michetti
4847489July 1989Dietrich
4847772July 1989Michalopoulos et al.
4849731July 1989Melocik
4855822August 1989Narendra et al.
4859031August 1989Berman et al.
4862037August 1989Farber et al.
4863130September 1989Marks, Jr.
4867561September 1989Makino et al.
4870264September 1989Beha
4871917October 1989O'Farrell et al.
4872051October 1989Dye
4881019November 1989Shiraishi et al.
4882466November 1989Friel
4882565November 1989Gallmeyer
4883349November 1989Mittelhauser
4884055November 1989Memmola
4886960December 1989Molyneux et al.
4891559January 1990Matsumoto et al.
4892345January 1990Rachael
4895790January 1990Swanson et al.
4896030January 1990Miyaji
4899296February 1990Khattak
4900133February 1990Berman
4905151February 1990Weiman et al.
4906940March 1990Green et al.
4907870March 1990Brucker
4910591March 1990Petrossian et al.
4916374April 1990Schierbeek et al.
4917477April 1990Bechtel et al.
4926346May 1990Yokoyama
4930742June 1990Schofield et al.
4931937June 1990Kakinami et al.
4936533June 1990Adams et al.
4937796June 1990Tendler
4948246August 1990Shigematsu
4949186August 1990Peterson
4953305September 1990Van Lente et al.
4954962September 1990Evans et al.
4956591September 1990Schierbeek et al.
4961625October 1990Wood et al.
4963788October 1990King et al.
4966441October 1990Conner
4967319October 1990Seko
4970509November 1990Kissinger
4970589November 1990Hanson
4970653November 1990Kenue
4971405November 1990Hwang
4971430November 1990Lynas
4974078November 1990Tsai
4975703December 1990Delisle et al.
4985847January 1991Shioya et al.
4987357January 1991Masaki
4987410January 1991Berman et al.
4991054February 1991Walters
5001558March 1991Burley et al.
5003288March 1991Wilhelm
5003339March 1991Kikuchi et al.
5008739April 1991D'Luna et al.
5008946April 1991Ando
5012082April 1991Watanabe
5012092April 1991Kobayashi
5012335April 1991Cohodar
5016977May 1991Baude et al.
5020114May 1991Fujioka et al.
5027001June 1991Torbert
5027104June 1991Reid
5027200June 1991Petrossian et al.
5031101July 1991Kamimura et al.
5036437July 1991Macks
5044706September 1991Chen
5044956September 1991Behensky et al.
5050966September 1991Berman
5051906September 1991Evans, Jr. et al.
5055668October 1991French
5059877October 1991Teder
5059947October 1991Chen
5063603November 1991Burt
5064274November 1991Alten
5072154December 1991Chen
5075768December 1991Wirtz et al.
5080207January 1992Horneffer
5080309January 1992Ivins
5081585January 1992Kurami et al.
5086253February 1992Lawler
5086510February 1992Guenther et al.
5087969February 1992Kamada et al.
5096287March 1992Kakinami et al.
5097362March 1992Lynas
5100093March 1992Rawlinson
5101351March 1992Hattori
5111289May 1992Lucas et al.
5113721May 1992Polly
5115398May 1992De Jong
5121200June 1992Choi
5122957June 1992Hattori
5124549June 1992Michaels et al.
5128769July 1992Ari
5130709July 1992Toyama et al.
5133605July 1992Nakamura
5137238August 1992Hutten
5139327August 1992Tanaka
5144685September 1992Nasar et al.
5146340September 1992Dickerson
5148014September 1992Lynam
5153760October 1992Ahmed
5155426October 1992Kurami
5155775October 1992Brown
5159557October 1992Ogawa
5160780November 1992Ono et al.
5160971November 1992Koshizawa et al.
5161632November 1992Asayama et al.
5162841November 1992Terashita
5162861November 1992Tamburino
5163002November 1992Kurami
5165108November 1992Asayama
5166681November 1992Bottesch et al.
5168355December 1992Asayama
5168378December 1992Black et al.
5170374December 1992Shimohigashi et al.
5172235December 1992Wilm et al.
5172317December 1992Asanuma et al.
5173881December 1992Sindle
5177462January 1993Kajiwara
5177606January 1993Koshizawa
5177685January 1993Davis et al.
5182502January 1993Slotkowski et al.
5184956February 1993Langlais et al.
5185812February 1993Yamashita et al.
5187383February 1993Taccetta et al.
5189561February 1993Hong
5193000March 1993Lipton et al.
5193029March 1993Schofield et al.
5193894March 1993Lietar et al.
5204536April 1993Vardi
5204778April 1993Bechtel
5208701May 1993Maeda
5208750May 1993Kurami et al.
5212468May 1993Adell
5214408May 1993Asayama
5216408June 1993Shirakawa
5218414June 1993Kajiwara et al.
5220508June 1993Ninomiya et al.
5223814June 1993Suman
5223907June 1993Asayama
5225827July 1993Persson
5229941July 1993Hattori
5230400July 1993Kakinami et al.
5231379July 1993Wood et al.
5233527August 1993Shinnosuke
5234070August 1993Noah et al.
5235178August 1993Hegyi
5237249August 1993Levers
5243524September 1993Ishida et al.
5245422September 1993Borcherts et al.
5246193September 1993Faidley
5249126September 1993Hattori
5249128September 1993Markandey et al.
5249157September 1993Taylor
5251680October 1993Miezawa et al.
5253050October 1993Karasudani
5253109October 1993O'Farrell et al.
5265172November 1993Markandey et al.
5266873November 1993Arditi et al.
5267160November 1993Ito et al.
5276389January 1994Levers
5285060February 1994Larson et al.
5289182February 1994Brillard et al.
5289321February 1994Secor
5291424March 1994Asayama et al.
5293162March 1994Bachalo
5298732March 1994Chen
5301115April 1994Nouso et al.
5302956April 1994Asbury et al.
5304980April 1994Maekawa
5305012April 1994Faris
5307136April 1994Saneyoshi
5307419April 1994Tsujino et al.
5309137May 1994Kajiwara
5313072May 1994Vachss
5318143June 1994Parker et al.
5321556June 1994Joe
5325096June 1994Pakett
5325386June 1994Jewell et al.
5327288July 1994Wellington et al.
5329206July 1994Slotkowski et al.
5331312July 1994Kudoh
5336980August 1994Levers
5341437August 1994Nakayama
5343206August 1994Ansaldi et al.
5345266September 1994Denyer
5347456September 1994Zhang et al.
5351044September 1994Mathur et al.
D351370October 1994Lawlor et al.
5355118October 1994Fukuhara
5359666October 1994Nakayama et al.
5367457November 1994Ishida et al.
5369590November 1994Karasudani
5371535December 1994Takizawa
5373911December 1994Yasui
5374852December 1994Parkes
5379196January 1995Kobayashi et al.
5379353January 1995Hasegawa et al.
5381338January 1995Wysocki et al.
5386285January 1995Asayama
5388048February 1995Yavnayi et al.
5394333February 1995Kao
5398041March 1995Hyatt
5406395April 1995Wilson et al.
5406414April 1995O'Farrell et al.
5408330April 1995Squicciarini
5408346April 1995Trissel et al.
5410346April 1995Saneyoshi et al.
5414257May 1995Stanton
5414439May 1995Groves et al.
5414461May 1995Kishi et al.
5414625May 1995Hattori
5416313May 1995Larson et al.
5416318May 1995Hegyi
5416478May 1995Morinaga
5416711May 1995Gran et al.
5424952June 1995Asayama
5426294June 1995Kobayashi et al.
5430431July 1995Nelson
5430450July 1995Holmes
5434407July 1995Bauer et al.
5434927July 1995Brady et al.
5436839July 1995Dausch et al.
5440428August 1995Hegg et al.
5444478August 1995Lelong et al.
5448180September 1995Kienzler et al.
5450057September 1995Watanabe
5451822September 1995Bechtel et al.
5457493October 1995Leddy et al.
5459660October 1995Berra
5461357October 1995Yoshioka et al.
5461361October 1995Moore
5465079November 1995Bouchard et al.
5467284November 1995Yoshioka et al.
5469298November 1995Suman et al.
5471515November 1995Fossum et al.
5473515December 1995Liu
5475366December 1995Van Lente et al.
5475494December 1995Nishida et al.
5481257January 1996Brubaker et al.
5482133January 1996Iwata et al.
5483060January 1996Sugiura et al.
5483168January 1996Reid
5483453January 1996Uemura et al.
5487116January 1996Nakano et al.
5488496January 1996Pine
5493269February 1996Durley et al.
5493392February 1996Blackmon et al.
5498866March 1996Bendicks et al.
5500766March 1996Stonecypher
5508592April 1996Lapatovich et al.
5510983April 1996Iino
5515448May 1996Nishitani
5521633May 1996Nakajima et al.
5528698June 1996Kamei et al.
5529138June 1996Shaw et al.
5530240June 1996Larson et al.
5530330June 1996Baiden et al.
5530420June 1996Tsuchiya et al.
5530771June 1996Maekawa
5535144July 1996Kise
5535314July 1996Alves et al.
5537003July 1996Bechtel et al.
5539397July 1996Asanuma et al.
5541590July 1996Nishio
5545960August 1996Ishikawa
5550677August 1996Schofield et al.
5555136September 1996Waldmann et al.
5555312September 1996Shima et al.
5555503September 1996Kyrtsos et al.
5555555September 1996Sato et al.
5558123September 1996Castel et al.
5559695September 1996Daily
5562336October 1996Gotou et al.
5566224October 1996ul Azam et al.
5568027October 1996Teder
5568316October 1996Schrenck et al.
5572315November 1996Krell
5574443November 1996Hsieh
5576687November 1996Blank et al.
5581464December 1996Woll et al.
5582383December 1996Mertens et al.
5588123December 1996Loibl
5594222January 1997Caldwell
5596319January 1997Spry et al.
5596382January 1997Bamford
5598164January 1997Reppas et al.
5602457February 1997Anderson et al.
5612686March 1997Takano et al.
5612883March 1997Shaffer et al.
5614788March 1997Mullins
5614885March 1997Van Lente et al.
5615857April 1997Hook
5619370April 1997Guinosso
5627586May 1997Yamasaki
5633944May 1997Guibert et al.
5634709June 1997Iwama
5638116June 1997Shimoura et al.
5642299June 1997Hardin et al.
5646612July 1997Byon
5648835July 1997Uzawa
5650944July 1997Kise
5660454August 1997Mori et al.
5661303August 1997Teder
5666028September 1997Bechtel et al.
5667896September 1997Carter et al.
5668663September 1997Varaprasad et al.
5670935September 1997Schofield et al.
5673019September 1997Dantoni
5675489October 1997Pomerleau
5676484October 1997Chamberlin et al.
5677851October 1997Kingdon et al.
5677979October 1997Squicciarini et al.
5680263October 1997Zimmermann et al.
D388107December 1997Huckins
5699044December 1997Van Lente et al.
5699057December 1997Ikeda et al.
5699149December 1997Kuroda et al.
5706355January 1998Raboisson et al.
5707129January 1998Kobayashi
5708410January 1998Blank et al.
5710633January 1998Klappenbach et al.
5715093February 1998Schierbeek et al.
5719551February 1998Flick
5724187March 1998Varaprasad et al.
5724316March 1998Brunts
5737226April 1998Olson et al.
5757949May 1998Kinoshita et al.
5760826June 1998Nayer
5760828June 1998Cortes
5760931June 1998Saburi et al.
5760962June 1998Schofield et al.
5761094June 1998Olson et al.
5764139June 1998Nojima et al.
5765116June 1998Wilson-Jones et al.
5765940June 1998Levy et al.
5781105July 1998Bitar et al.
5781437July 1998Wiemer et al.
5786772July 1998Schofield et al.
5790403August 1998Nakayama
5790973August 1998Blaker et al.
5793308August 1998Rosinski et al.
5793420August 1998Schmidt
5796094August 1998Schofield et al.
5798575August 1998O'Farrell et al.
5804719September 1998Didelot et al.
5808589September 1998Fergason
5811888September 1998Hsieh
5820097October 1998Spooner
5835255November 1998Miles
5835613November 1998Breed et al.
5835614November 1998Aoyama et al.
5837994November 1998Stam et al.
5841126November 1998Fossum et al.
5844505December 1998Van Ryzin
5844682December 1998Kiyomoto et al.
5845000December 1998Breed et al.
5847755December 1998Wixson et al.
5848802December 1998Breed et al.
5850176December 1998Kinoshita et al.
5850254December 1998Takano et al.
5867591February 1999Onda
5877707March 1999Kowalick
5877897March 1999Schofield et al.
5878370March 1999Olson
5883193March 1999Karim
5883684March 1999Millikan et al.
5883739March 1999Ashihara et al.
5884212March 1999Lion
5890021March 1999Onoda
5890083March 1999Franke et al.
5896085April 1999Mori et al.
5899956May 1999Chan
5904725May 1999Iisaka et al.
5905457May 1999Rashid
5912534June 1999Benedict
5914815June 1999Bos
5920367July 1999Kajimoto et al.
5922036July 1999Yasui
5923027July 1999Stam et al.
5929784July 1999Kawaziri et al.
5929786July 1999Schofield et al.
5938320August 1999Crandall
5938810August 1999DeVries, Jr. et al.
5940120August 1999Frankhouse et al.
5942853August 1999Piscart
5949331September 1999Schofield et al.
5955941September 1999Pruksch et al.
5956181September 1999Lin
5959367September 1999O'Farrell et al.
5959555September 1999Furuta
5961571October 1999Gorr
5963247October 1999Banitt
5964822October 1999Alland et al.
5971552October 1999O'Farrell et al.
5982288November 1999Sawatari et al.
5986796November 1999Miles
5990469November 1999Bechtel et al.
5990649November 1999Nagao et al.
5991427November 1999Kakinami et al.
6001486December 1999Varaprasad et al.
6009336December 1999Harris et al.
6020704February 2000Buschur
6028537February 2000Suman et al.
6031484February 2000Bullinger
6037860March 2000Zander et al.
6037975March 2000Aoyama
6049171April 2000Stam et al.
6052124April 2000Stein et al.
6057754May 2000Kinoshita et al.
6066933May 2000Ponziana
6084519July 2000Coulling et al.
6087953July 2000DeLine et al.
6091833July 2000Yasui et al.
6094198July 2000Shashua
6097023August 2000Schofield et al.
6097024August 2000Stam et al.
6100811August 2000Hsu et al.
6107939August 2000Sorden
6116743September 2000Hoek
6122597September 2000Saneyoshi et al.
6124647September 2000Marcus et al.
6124886September 2000DeLine et al.
6139172October 2000Bos et al.
6140980October 2000Spitzer et al.
6144022November 2000Tenenbaum et al.
6144158November 2000Beam
6150014November 2000Chu et al.
6150930November 2000Cooper
6151065November 2000Steed et al.
6151539November 2000Bergholz et al.
6158655December 2000DeVries, Jr. et al.
6166628December 2000Andreas
6170955January 2001Campbell et al.
6172613January 2001DeLine et al.
6175164January 2001O'Farrell et al.
6175300January 2001Kendrick
6176590January 2001Prevost et al.
6188939February 2001Morgan et al.
6198409March 2001Schofield et al.
6201642March 2001Bos
6211907April 2001Scaman et al.
6218934April 2001Regan
6219444April 2001Shashua et al.
6222447April 2001Schofield et al.
6222460April 2001DeLine et al.
6226061May 2001Tagusa
6229319May 2001Johnson
6243003June 2001DeLine et al.
6247819June 2001Turnbull et al.
6250148June 2001Lynam
6259412July 2001Duroux
6259423July 2001Tokito et al.
6266082July 2001Yonezawa et al.
6266442July 2001Laumeyer et al.
6278377August 2001DeLine et al.
6281804August 2001Haller et al.
6285393September 2001Shimoura et al.
6285778September 2001Nakajima et al.
6291905September 2001Drummond et al.
6291906September 2001Marcus et al.
6292752September 2001Franke et al.
6294989September 2001Schofield et al.
6297781October 2001Turnbull et al.
6302545October 2001Schofield et al.
6310611October 2001Caldwell
6311119October 2001Sawamoto et al.
6313454November 2001Bos et al.
6315421November 2001Apfelbeck et al.
6317057November 2001Lee
6318870November 2001Spooner et al.
6320176November 2001Schofield et al.
6320282November 2001Caldwell
6324450November 2001Iwama
6326613December 2001Heslin et al.
6329925December 2001Skiver et al.
6333759December 2001Mazzilli
6341523January 2002Lynam
6353392March 2002Schofield et al.
6359392March 2002He
6362729March 2002Hellmann et al.
6366213April 2002DeLine et al.
6366236April 2002Farmer et al.
6370329April 2002Teuchert
6388565May 2002Bernhard et al.
6388580May 2002Graham
6389340May 2002Rayner
6392218May 2002Kuehnle
6396397May 2002Bos et al.
6396408May 2002Drummond et al.
6411204June 2002Bloomfield et al.
6411328June 2002Franke et al.
6420975July 2002DeLine et al.
6424273July 2002Gutta et al.
6428172August 2002Hutzel et al.
6429594August 2002Stam et al.
6430303August 2002Naoi et al.
6433676August 2002DeLine et al.
6433817August 2002Guerra
6441748August 2002Takagi et al.
6442465August 2002Breed et al.
6445287September 2002Schofield et al.
6445809September 2002Sasaki et al.
6449540September 2002Raynar
6452148September 2002Bendicks et al.
6466136October 2002DeLine et al.
6466684October 2002Sasaki et al.
6469739October 2002Bechtel et al.
6472977October 2002Poechmueller
6472979October 2002Schofield et al.
6477260November 2002Shimomura
6477464November 2002McCarthy et al.
6483438November 2002DeLine et al.
6485155November 2002Duroux et al.
6497503December 2002Dassanayake et al.
6498620December 2002Schofield et al.
6509832January 2003Bauer et al.
6513252February 2003Schierbeek et al.
6515378February 2003Drummond et al.
6516272February 2003Lin
6516664February 2003Lynam
6523964February 2003Schofield et al.
6534884March 2003Marcus et al.
6535242March 2003Strumolo et al.
6539306March 2003Turnbull
6540193April 2003DeLine
6547133April 2003DeVries, Jr. et al.
6553130April 2003Lemelson et al.
6559435May 2003Schofield et al.
6570998May 2003Ohtsuka et al.
6574033June 2003Chui et al.
6577334June 2003Kawai et al.
6578017June 2003Ebersole et al.
6587573July 2003Stam et al.
6587968July 2003Leyva
6589625July 2003Kothari et al.
6593011July 2003Liu et al.
6593565July 2003Heslin et al.
6593698July 2003Stam et al.
6593960July 2003Sugimoto et al.
6594583July 2003Ogura et al.
6611202August 2003Schofield et al.
6611610August 2003Stam et al.
6614579September 2003Roberts et al.
6617564September 2003Ockerse et al.
6627918September 2003Getz et al.
6631316October 2003Stam et al.
6631994October 2003Suzuki et al.
6636258October 2003Strumolo
6648477November 2003Hutzel et al.
6650233November 2003Deline et al.
6650455November 2003Miles
6653614November 2003Stam et al.
6672731January 2004Schnell et al.
6674562January 2004Miles
6674878January 2004Retterath et al.
6678056January 2004Downs
6678590January 2004Burchfiel
6678614January 2004McCarthy et al.
6680792January 2004Miles
6681163January 2004Stam et al.
6690268February 2004Schofield et al.
6700605March 2004Toyoda et al.
6703925March 2004Steffel
6704621March 2004Stein et al.
6710908March 2004Miles et al.
6711474March 2004Treyz et al.
6714331March 2004Lewis et al.
6717524April 2004DeLine et al.
6717610April 2004Bos et al.
6728393April 2004Stam et al.
6728623April 2004Takenaga et al.
6735506May 2004Breed et al.
6738088May 2004Uskolovsky et al.
6741186May 2004Ross
6741377May 2004Miles
6744353June 2004Sjonell
6754367June 2004Ito et al.
6757109June 2004Bos
6762867July 2004Lippert et al.
6764210July 2004Akiyama
6765480July 2004Tseng
6774988August 2004Stam et al.
6784828August 2004Delcheccolo et al.
6794119September 2004Miles
6795221September 2004Urey
6801127October 2004Mizusawa
6801244October 2004Takeda et al.
6802617October 2004Schofield et al.
6806452October 2004Bos et al.
6807287October 2004Hermans
6811330November 2004Tozawa
6812463November 2004Okada
6813545November 2004Stromme
6819231November 2004Berberich et al.
6819779November 2004Nichani
6822563November 2004Bos et al.
6823241November 2004Shirato et al.
6823261November 2004Sekiguchi
6824281November 2004Schofield et al.
6831261December 2004Schofield et al.
6838980January 2005Gloger et al.
6842189January 2005Park
6847487January 2005Burgner
6850629February 2005Jeon
6853738February 2005Nishigaki et al.
6859148February 2005Miller et al.
6861809March 2005Stam
6864930March 2005Matsushita et al.
6873253March 2005Veziris
6882287April 2005Schofield
6888447May 2005Hori et al.
6889161May 2005Winner et al.
6891563May 2005Schofield et al.
6898518May 2005Padmanabhan
6906620June 2005Nakai et al.
6906639June 2005Lemelson et al.
6909753June 2005Meehan et al.
6914521July 2005Rothkop
6928180August 2005Stam et al.
6932669August 2005Lee et al.
6933837August 2005Gunderson et al.
6940423September 2005Takagi et al.
6946978September 2005Schofield
6950035September 2005Tanaka et al.
6953253October 2005Schofield et al.
6956469October 2005Hirvonen et al.
6959994November 2005Fujikawa et al.
6961178November 2005Sugino et al.
6961661November 2005Sekiguchi
6963661November 2005Hattori et al.
6967569November 2005Weber et al.
6968736November 2005Lynam
6975775December 2005Rykowski et al.
6980092December 2005Turnbull et al.
6989736January 2006Berberich et al.
6990397January 2006Albou et al.
6995687February 2006Lang et al.
7004593February 2006Weller et al.
7004606February 2006Schofield
7005974February 2006McMahon et al.
7012507March 2006DeLine et al.
7012727March 2006Hutzel et al.
7023331April 2006Kodama
7027387April 2006Reinold et al.
7027615April 2006Chen
7030738April 2006Ishii
7030775April 2006Sekiguchi
7030778April 2006Ra
7038577May 2006Pawlicki et al.
7046448May 2006Burgner
7057505June 2006Iwamoto
7057681June 2006Hinata et al.
7062300June 2006Kim
7065432June 2006Moisel et al.
7068289June 2006Satoh et al.
7068844June 2006Javidi et al.
7085633August 2006Nishira et al.
7085637August 2006Breed et al.
7091837August 2006Nakai et al.
7092548August 2006Laumeyer et al.
7095432August 2006Nakayama et al.
7106213September 2006White
7110021September 2006Nobori et al.
7110156September 2006Lawlor et al.
7113867September 2006Stein
7116246October 2006Winter et al.
7121028October 2006Shoen et al.
7123168October 2006Schofield
7133661November 2006Hatae et al.
7149613December 2006Stam et al.
7151996December 2006Stein
7167796January 2007Taylor et al.
7171027January 2007Satoh
7184585February 2007Hamza et al.
7187498March 2007Bengoechea et al.
7188963March 2007Schofield et al.
7195381March 2007Lynam et al.
7202776April 2007Breed
7202987April 2007Varaprasad et al.
7205904April 2007Schofield
7221363May 2007Roberts et al.
7224324May 2007Quist et al.
7227459June 2007Bos et al.
7227611June 2007Hull et al.
7235918June 2007McCullough et al.
7248283July 2007Takagi et al.
7248344July 2007Morcom
7249860July 2007Kulas et al.
7253723August 2007Lindahl et al.
7255451August 2007McCabe et al.
7271951September 2007Weber et al.
7304661December 2007Ishikura
7311406December 2007Schofield et al.
7325934February 2008Schofield et al.
7325935February 2008Schofield et al.
7337055February 2008Matsumoto et al.
7338177March 2008Lynam
7339149March 2008Schofield et al.
7344261March 2008Schofield et al.
7355524April 2008Schofield
7360932April 2008Uken et al.
7362883April 2008Otsuka et al.
7370983May 2008DeWind et al.
7375803May 2008Bamji
7380948June 2008Schofield et al.
7388182June 2008Schofield et al.
7402786July 2008Schofield et al.
7403659July 2008Das et al.
7420756September 2008Lynam
7423248September 2008Schofield et al.
7423821September 2008Bechtel et al.
7425076September 2008Schofield et al.
7429998September 2008Kawauchi et al.
7432248October 2008Roberts et al.
7432967October 2008Bechtel et al.
7446924November 2008Schofield et al.
7459664December 2008Schofield et al.
7460007December 2008Schofield et al.
7463138December 2008Pawlicki et al.
7468652December 2008DeLine et al.
7474963January 2009Taylor et al.
7480149January 2009DeWard et al.
7489374February 2009Utsumi et al.
7495719February 2009Adachi et al.
7525604April 2009Xue
7526103April 2009Schofield et al.
7533998May 2009Schofield et al.
7541743June 2009Salmeen et al.
7543946June 2009Ockerse et al.
7545429June 2009Travis
7548291June 2009Lee et al.
7551103June 2009Schofield
7561181July 2009Schofield et al.
7565006July 2009Stam et al.
7566639July 2009Kohda
7566851July 2009Stein et al.
7567291July 2009Bechtel et al.
7605856October 2009Imoto
7613327November 2009Stam et al.
7616781November 2009Schofield et al.
7619508November 2009Lynam et al.
7629996December 2009Rademacher et al.
7633383December 2009Dunsmoir et al.
7639149December 2009Katoh
7650030January 2010Shan et al.
7653215January 2010Stam
7655894February 2010Schofield et al.
7663798February 2010Tonar et al.
7676087March 2010Dhua et al.
7679498March 2010Pawlicki et al.
7683326March 2010Stam et al.
7702133April 2010Muramatsu et al.
7719408May 2010DeWard et al.
7720580May 2010Higgins-Luthman
7724434May 2010Cross et al.
7731403June 2010Lynam et al.
7742864June 2010Sekiguchi
7786898August 2010Stein et al.
7791694September 2010Molsen et al.
7792329September 2010Schofield et al.
7825600November 2010Stam et al.
7842154November 2010Lynam
7843451November 2010Lafon
7854514December 2010Conner et al.
7855755December 2010Weller et al.
7855778December 2010Yung et al.
7859565December 2010Schofield et al.
7873187January 2011Schofield et al.
7877175January 2011Higgins-Luthman
7881496February 2011Camilleri et al.
7903324March 2011Kobayashi et al.
7903335March 2011Nieuwkerk et al.
7914187March 2011Higgins-Luthman et al.
7914188March 2011DeLine et al.
7930160April 2011Hosagrahara et al.
7949152May 2011Schofield et al.
7965357June 2011Van De Witte et al.
7972045July 2011Schofield
7991522August 2011Higgins-Luthman
7994462August 2011Schofield et al.
7995067August 2011Navon
8004392August 2011DeLine et al.
8017898September 2011Lu et al.
8027691September 2011Bernas et al.
8045760October 2011Stam et al.
8063759November 2011Bos et al.
8064643November 2011Stein et al.
8082101December 2011Stein et al.
8090153January 2012Schofield et al.
8094002January 2012Schofield et al.
8095310January 2012Taylor et al.
8098142January 2012Schofield et al.
8100568January 2012DeLine et al.
8116929February 2012Higgins-Luthman
8120652February 2012Bechtel et al.
8162518April 2012Schofield
8164628April 2012Stein et al.
8179437May 2012Schofield et al.
8184159May 2012Luo
8203440June 2012Schofield et al.
8203443June 2012Bos et al.
8222588July 2012Schofield et al.
8224031July 2012Saito
8233045July 2012Luo et al.
8254635August 2012Stein et al.
8288711October 2012Heslin et al.
8289142October 2012Pawlicki et al.
8289430October 2012Bechtel et al.
8300058October 2012Navon et al.
8305471November 2012Bechtel et al.
8308325November 2012Takayanazi et al.
8314689November 2012Schofield et al.
8324552December 2012Schofield et al.
8325028December 2012Schofield et al.
8325986December 2012Schofield et al.
8339526December 2012Minikey, Jr. et al.
8350683January 2013DeLine et al.
8362883January 2013Hale et al.
8378851February 2013Stein et al.
8386114February 2013Higgins-Luthman
8405726March 2013Schofield et al.
8414137April 2013Quinn et al.
8434919May 2013Schofield
8452055May 2013Stein et al.
8481910July 2013Schofield et al.
8481916July 2013Heslin et al.
8492698July 2013Schofield et al.
8508593August 2013Schofield et al.
8513590August 2013Heslin et al.
8531278September 2013DeWard et al.
8531279September 2013DeLine et al.
8534887September 2013DeLine et al.
8538205September 2013Sixsou et al.
8543330September 2013Taylor et al.
8553088October 2013Stein et al.
8593521November 2013Schofield et al.
8599001December 2013Schofield et al.
8629768January 2014Bos et al.
8636393January 2014Schofield
8637801January 2014Schofield et al.
8643724February 2014Schofield et al.
8656221February 2014Sixsou et al.
8665079March 2014Pawlicki et al.
8676491March 2014Taylor et al.
8686840April 2014Drummond et al.
8692659April 2014Schofield et al.
8818042August 2014Schofield et al.
9008369April 2015Schofield et al.
9171217October 2015Pawlicki et al.
9191634November 2015Schofield et al.
9428192August 2016Schofield et al.
9555803January 2017Pawlicki et al.
9834216December 2017Pawlicki et al.
2001/0002451May 2001Breed
2002/0003571January 2002Schofield et al.
2002/0005778January 2002Breed
2002/0011611January 2002Huang et al.
2002/0029103March 2002Breed et al.
2002/0060522May 2002Stam et al.
2002/0080235June 2002Jeon
2002/0113873August 2002Williams
2002/0116106August 2002Breed et al.
2002/0126002September 2002Patchell
2002/0126875September 2002Naoi et al.
2002/0135468September 2002Bos et al.
2003/0040864February 2003Stein
2003/0070741April 2003Rosenberg et al.
2003/0103142June 2003Hitomi et al.
2003/0122930July 2003Schofield
2003/0125855July 2003Breed et al.
2003/0128106July 2003Ross
2003/0137586July 2003Lewellen
2003/0191568October 2003Breed
2003/0202683October 2003Ma et al.
2003/0209893November 2003Breed et al.
2003/0222982December 2003Hamdan et al.
2004/0016870January 2004Pawlicki et al.
2004/0021947February 2004Schofield
2004/0022416February 2004Lemelson
2004/0086153May 2004Tsai et al.
2004/0096082May 2004Nakai et al.
2004/0146184July 2004Hamza et al.
2004/0148063July 2004Patchell
2004/0164228August 2004Fogg et al.
2004/0200948October 2004Bos et al.
2005/0036325February 2005Furusawa et al.
2005/0073853April 2005Stam
2005/0131607June 2005Breed
2005/0219852October 2005Stam et al.
2005/0226490October 2005Phillips et al.
2005/0237385October 2005Kosaka et al.
2006/0018511January 2006Stam et al.
2006/0018512January 2006Stam et al.
2006/0050018March 2006Hutzel et al.
2006/0091813May 2006Stam et al.
2006/0095175May 2006deWaal et al.
2006/0103727May 2006Tseng
2006/0250224November 2006Steffel et al.
2006/0250501November 2006Wildmann et al.
2007/0024724February 2007Stein et al.
2007/0104476May 2007Yasutomi et al.
2007/0109406May 2007Schofield et al.
2007/0115357May 2007Stein et al.
2007/0120657May 2007Schofield et al.
2007/0154063July 2007Breed
2007/0154068July 2007Stein et al.
2007/0193811August 2007Breed et al.
2007/0221822September 2007Stein et al.
2007/0229238October 2007Boyles et al.
2007/0230792October 2007Shashua et al.
2007/0242339October 2007Bradley
2008/0036576February 2008Stein et al.
2008/0043099February 2008Stein et al.
2008/0137908June 2008Stein
2008/0147321June 2008Howard et al.
2008/0231710September 2008Asari et al.
2008/0234899September 2008Breed et al.
2008/0239393October 2008Navon
2008/0266396October 2008Stein
2009/0052003February 2009Schofield et al.
2009/0066065March 2009Breed et al.
2009/0113509April 2009Tseng et al.
2009/0143986June 2009Stein et al.
2009/0182690July 2009Stein
2009/0190015July 2009Bechtel et al.
2009/0201137August 2009Weller et al.
2009/0243824October 2009Peterson et al.
2009/0256938October 2009Bechtel et al.
2009/0300629December 2009Navon et al.
2010/0125717May 2010Navon
2010/0172547July 2010Akutsu
2011/0018700January 2011Stein et al.
2011/0219217September 2011Sixsou et al.
2011/0280495November 2011Sixsou et al.
2011/0307684December 2011Krenin et al.
2012/0002053January 2012Stein et al.
2012/0045112February 2012Lundblad et al.
2012/0056735March 2012Stein et al.
2012/0069185March 2012Stein
2012/0105639May 2012Stein et al.
2012/0140076June 2012Rosenbaum et al.
2012/0200707August 2012Stein et al.
2012/0212593August 2012Na'aman et al.
2012/0233841September 2012Stein
2012/0314071December 2012Rosenbaum et al.
2013/0135444May 2013Stein et al.
2013/0141580June 2013Stein et al.
2013/0147957June 2013Stein
2013/0169536July 2013Wexler et al.
2013/0271584October 2013Wexler et al.
2013/0308828November 2013Stein et al.
2014/0015976January 2014DeLine et al.
2014/0033203January 2014Dogon et al.
2014/0049648February 2014Stein et al.
2014/0082307March 2014Kreinin et al.
2014/0093132April 2014Stein et al.
2014/0122551May 2014Dogon et al.
2014/0125799May 2014Bos et al.
2014/0156140June 2014Stein et al.
2014/0160244June 2014Berberian et al.
2014/0161323June 2014Livyatan et al.
2014/0198184July 2014Stein et al.

Foreign Patent Documents

519193Aug 2011AT
1008142Jan 1996BE
1101522May 1981CA
2392578May 2001CA
2392652May 2001CA
644315Jul 1984CH
2074262Apr 1991CN
2185701Dec 1994CN
1104741Jul 1995CN
2204254Aug 1995CN
1194056Sep 1998CN
1235913Nov 1999CN
1383032Dec 2002CN
102193852Sep 2011CN
102542256Jul 2012CN
1152627Aug 1963DE
1182971Dec 1964DE
1190413Apr 1965DE
1196598Jul 1965DE
1214174Apr 1966DE
2064839Jul 1972DE
3004247Aug 1981DE
3040555May 1982DE
3101855Aug 1982DE
3240498May 1984DE
3248511Jul 1984DE
3433671Mar 1985DE
3515116Oct 1986DE
3528220Feb 1987DE
3535588Apr 1987DE
3601388Jul 1987DE
3637165May 1988DE
3636946Jun 1988DE
3642196Jun 1988DE
3734066Apr 1989DE
3737395May 1989DE
3838365Jun 1989DE
3833022Apr 1990DE
3839512May 1990DE
3839513May 1990DE
3937576May 1990DE
3840425Jun 1990DE
3844364Jul 1990DE
9010196Oct 1990DE
4015927Nov 1990DE
3932216Apr 1991DE
4007646Sep 1991DE
4107965Sep 1991DE
4111993Oct 1991DE
4015959Nov 1991DE
4116255Dec 1991DE
4023952Feb 1992DE
4130010Mar 1992DE
4032927Apr 1992DE
4133882Apr 1992DE
4035956May 1992DE
4122531Jan 1993DE
4124654Jan 1993DE
4137551Mar 1993DE
4136427May 1993DE
4300941Jul 1993DE
4206142Sep 1993DE
4214223Nov 1993DE
4231137Feb 1994DE
4328304Mar 1994DE
4328902Mar 1994DE
4332612Apr 1994DE
4238599Jun 1994DE
4337756Jun 1994DE
4344485Jun 1994DE
4304005Aug 1994DE
4332836Sep 1994DE
4407082Sep 1994DE
4407757Sep 1994DE
4411179Oct 1994DE
4412669Oct 1994DE
4418122Dec 1994DE
4423966Jan 1995DE
4336288Mar 1995DE
4428069Mar 1995DE
4434698Mar 1995DE
4341409Jun 1995DE
4446452Jun 1995DE
69107283Jul 1995DE
4403937Aug 1995DE
19505487Sep 1995DE
19518978Nov 1995DE
4480341Mar 1996DE
069302975Dec 1996DE
29703084Jun 1997DE
29805142Jun 1998DE
19755008Jul 1999DE
19829162Jan 2000DE
10237554Mar 2004DE
000010251949May 2004DE
19530617Feb 2009DE
0048492Mar 1982EP
0049722Apr 1982EP
0072406Feb 1983EP
0176615Apr 1986EP
0202460Nov 1986EP
0169734Oct 1989EP
0340735Nov 1989EP
0341985Nov 1989EP
0348691Jan 1990EP
0353200Jan 1990EP
0354561Feb 1990EP
0360880Apr 1990EP
0361914Apr 1990EP
0387817Sep 1990EP
0527665Feb 1991EP
0426503May 1991EP
0433538Jun 1991EP
0450553Oct 1991EP
0454516Oct 1991EP
0455524Nov 1991EP
0459433Dec 1991EP
473866Mar 1992EP
0477986Apr 1992EP
0479271Apr 1992EP
0487100May 1992EP
0487465May 1992EP
0492591Jul 1992EP
0495508Jul 1992EP
0496411Jul 1992EP
0501345Sep 1992EP
0505237Sep 1992EP
0513476Nov 1992EP
0514343Nov 1992EP
529346Mar 1993EP
0532379Mar 1993EP
0533508Mar 1993EP
0550397Jul 1993EP
0558027Sep 1993EP
0564858Oct 1993EP
0567059Oct 1993EP
0582236Feb 1994EP
0586857Mar 1994EP
0588815Mar 1994EP
0590588Apr 1994EP
0591743Apr 1994EP
0602962Jun 1994EP
0605045Jul 1994EP
0606586Jul 1994EP
0617296Sep 1994EP
0626654Nov 1994EP
0640903Mar 1995EP
0642950Mar 1995EP
0654392May 1995EP
0667708Aug 1995EP
0677428Oct 1995EP
0686865Dec 1995EP
0687594Dec 1995EP
0697641Feb 1996EP
733252Sep 1996EP
0756968Feb 1997EP
0788947Aug 1997EP
0487332Oct 1997EP
0874331Oct 1998EP
0889801Jan 1999EP
0893308Jan 1999EP
0899157Mar 1999EP
0913751May 1999EP
0949818Oct 1999EP
1022903Jul 2000EP
1257971Nov 2000EP
1058220Dec 2000EP
1065642Jan 2001EP
1074430Feb 2001EP
1115250Jul 2001EP
0830267Dec 2001EP
1170173Jan 2002EP
1236126Sep 2002EP
0860325Nov 2002EP
1359557Nov 2003EP
1727089Nov 2006EP
1748644Jan 2007EP
1754179Feb 2007EP
1790541May 2007EP
1806595Jul 2007EP
1837803Sep 2007EP
1887492Feb 2008EP
1741079May 2008EP
1930863Jun 2008EP
1978484Oct 2008EP
2068269Jun 2009EP
2101258Sep 2009EP
2131278Dec 2009EP
2150437Feb 2010EP
2172873Apr 2010EP
2187316May 2010EP
2365441Sep 2011EP
2377094Oct 2011EP
2383679Nov 2011EP
2383713Nov 2011EP
2395472Dec 2011EP
2431917Mar 2012EP
2448251May 2012EP
2463843Jun 2012EP
2602741Jun 2013EP
2605185Jun 2013EP
2629242Aug 2013EP
2674323Dec 2013EP
2690548Jan 2014EP
2709020Mar 2014EP
2728462May 2014EP
2250218Apr 2006ES
2610401Aug 1988FR
2641237Jul 1990FR
2646383Nov 1990FR
2674201Sep 1992FR
2674354Sep 1992FR
2687000Aug 1993FR
2706211Dec 1994FR
2721872Jan 1996FR
914827Jan 1963GB
1000265Aug 1965GB
1008411Oct 1965GB
1054064Jan 1967GB
1098608Jan 1968GB
1098610Jan 1968GB
1106339Mar 1968GB
1178416Jan 1970GB
1197710Jul 1970GB
2210835Jun 1989GB
2233530Jan 1991GB
2255649Nov 1992GB
2261339May 1993GB
2262829Jun 1993GB
9310728Jul 1993GB
2267341Dec 1993GB
2271139Apr 1994GB
2275452Aug 1994GB
2280810Feb 1995GB
2289332Nov 1995GB
970014Jul 1998IE
S5539843Mar 1980JP
55156901Dec 1980JP
S5685110Jul 1981JP
S5871230Apr 1983JP
58110334Jun 1983JP
58122421Jul 1983JP
59114139Jul 1984JP
59127200Jul 1984JP
S6047737Mar 1985JP
6079889May 1985JP
6080953May 1985JP
S6078312May 1985JP
S60206746Oct 1985JP
60240545Nov 1985JP
S60219133Nov 1985JP
S60255537Dec 1985JP
S6141929Feb 1986JP
S6185238Apr 1986JP
S61105245May 1986JP
S61191937Aug 1986JP
61260217Nov 1986JP
S61285151Dec 1986JP
S61285152Dec 1986JP
62001652Jan 1987JP
S6221010Jan 1987JP
S6226141Feb 1987JP
62080143Apr 1987JP
S6216073Apr 1987JP
6272245May 1987JP
S62115600May 1987JP
62131837Jun 1987JP
S62253543Nov 1987JP
S62253546Nov 1987JP
S62287164Dec 1987JP
63011446Jan 1988JP
63258236Oct 1988JP
63258237Oct 1988JP
63192788Dec 1988JP
6414700Jan 1989JP
01123587May 1989JP
H1168538Jul 1989JP
01242917Sep 1989JP
H01233129Sep 1989JP
H01265400Oct 1989JP
H01275237Nov 1989JP
H0268237Mar 1990JP
02190978Jul 1990JP
H236417Aug 1990JP
H02212232Aug 1990JP
H2117935Sep 1990JP
H0314739Jan 1991JP
H0374231Mar 1991JP
03099952Apr 1991JP
03266739May 1991JP
H03246413Nov 1991JP
H05137144Nov 1991JP
03282707Dec 1991JP
03282709Dec 1991JP
03286399Dec 1991JP
H03273953Dec 1991JP
H042909Jan 1992JP
H0410200Jan 1992JP
04114587Apr 1992JP
04127280Apr 1992JP
04137014May 1992JP
H04137112May 1992JP
H04194827Jul 1992JP
04239400Aug 1992JP
04242391Aug 1992JP
H04238219Aug 1992JP
04250786Sep 1992JP
04291405Oct 1992JP
H04303047Oct 1992JP
H0516722Jan 1993JP
H0538977Feb 1993JP
H06229759Feb 1993JP
0577657Mar 1993JP
05050883Mar 1993JP
H06332370May 1993JP
H05155287Jun 1993JP
05189694Jul 1993JP
H05172638Jul 1993JP
05213113Aug 1993JP
H05201298Aug 1993JP
05244596Sep 1993JP
H05229383Sep 1993JP
05298594Nov 1993JP
05313736Nov 1993JP
H05297141Nov 1993JP
06048247Feb 1994JP
H0640286Feb 1994JP
06076200Mar 1994JP
H0672234Mar 1994JP
06107035Apr 1994JP
06113215Apr 1994JP
06117924Apr 1994JP
06150198May 1994JP
H06162398Jun 1994JP
H06174845Jun 1994JP
H06191344Jul 1994JP
06215291Aug 1994JP
6227318Aug 1994JP
06230115Aug 1994JP
H06229739Aug 1994JP
06247246Sep 1994JP
6266825Sep 1994JP
06267304Sep 1994JP
06270733Sep 1994JP
06274626Sep 1994JP
06276524Sep 1994JP
H06262963Sep 1994JP
H06267303Sep 1994JP
H06275104Sep 1994JP
06295601Oct 1994JP
H06289138Oct 1994JP
H06293236Oct 1994JP
05093981Nov 1994JP
06310740Nov 1994JP
06321007Nov 1994JP
H06321010Nov 1994JP
H06324144Nov 1994JP
06337938Dec 1994JP
06341821Dec 1994JP
07002021Jan 1995JP
07004170Jan 1995JP
07025286Jan 1995JP
H072022Jan 1995JP
732936Feb 1995JP
07032935Feb 1995JP
07047878Feb 1995JP
07052706Feb 1995JP
H0737180Feb 1995JP
H0740782Feb 1995JP
H0746460Feb 1995JP
07069125Mar 1995JP
07078240Mar 1995JP
H0764632Mar 1995JP
H0771916Mar 1995JP
H07057200Mar 1995JP
H07078258Mar 1995JP
07105496Apr 1995JP
H07101291Apr 1995JP
H07105487Apr 1995JP
H07108873Apr 1995JP
H07108874Apr 1995JP
07125571May 1995JP
07137574May 1995JP
H07125570May 1995JP
H730149Jun 1995JP
H07141588Jun 1995JP
H07144577Jun 1995JP
07186818Jul 1995JP
07192192Jul 1995JP
06000927Aug 1995JP
07242147Sep 1995JP
H07239714Sep 1995JP
H07249128Sep 1995JP
H07280563Oct 1995JP
H07315122Dec 1995JP
H0840138Feb 1996JP
H0840140Feb 1996JP
H0843082Feb 1996JP
H0844999Feb 1996JP
H0850697Feb 1996JP
H08138036May 1996JP
08166221Jun 1996JP
08235484Sep 1996JP
H08320997Dec 1996JP
02630604Apr 1997JP
H0991596Apr 1997JP
09330415Dec 1997JP
10038562Feb 1998JP
10063985Mar 1998JP
H1090188Apr 1998JP
10134183May 1998JP
10171966Jun 1998JP
H10222792Aug 1998JP
10261189Sep 1998JP
11069211Mar 1999JP
11078737Mar 1999JP
H1178693Mar 1999JP
H1178717Mar 1999JP
H1123305Jul 1999JP
11250228Sep 1999JP
H11259634Sep 1999JP
11345392Dec 1999JP
2000016352Jan 2000JP
2000085474Mar 2000JP
2000113374Apr 2000JP
2000127849May 2000JP
2000207575Jul 2000JP
2000215299Aug 2000JP
2000305136Nov 2000JP
2000311289Nov 2000JP
2001001832Jan 2001JP
2001092970Apr 2001JP
2001180401Jul 2001JP
2001188988Jul 2001JP
2001297397Oct 2001JP
2001351107Dec 2001JP
2002022439Jan 2002JP
2002046506Feb 2002JP
2002074339Mar 2002JP
2002079895Mar 2002JP
2002084533Mar 2002JP
2002099908Apr 2002JP
2002109699Apr 2002JP
2002175534Jun 2002JP
2002211428Jul 2002JP
2002341432Nov 2002JP
2003030665Jan 2003JP
2003076987Mar 2003JP
2003083742Mar 2003JP
3395289Apr 2003JP
2003123058Apr 2003JP
2003150938May 2003JP
2003168197Jun 2003JP
2003178397Jun 2003JP
2003217099Jul 2003JP
2003248895Sep 2003JP
2003259361Sep 2003JP
2003281700Oct 2003JP
20041658Jan 2004JP
2004032460Jan 2004JP
2004146904May 2004JP
2004336613Nov 2004JP
2004355139Dec 2004JP
2005182158Jul 2005JP
2000883510000Mar 1995KR
1020010018981Oct 2002KR
1004124340000Mar 2004KR
336535Jul 1971SE
WO1986005147Sep 1986WO
WO1988009023Nov 1988WO
WO1990004528May 1990WO
WO1993000647Jan 1993WO
WO1993004556Mar 1993WO
WO1993010550May 1993WO
WO1993011631Jun 1993WO
WO1993021596Oct 1993WO
WO1994019212Sep 1994WO
WO1995018979Jul 1995WO
WO1995023082Aug 1995WO
WO1996002817Feb 1996WO
WO1996015921May 1996WO
WO1996018275Jun 1996WO
WO1996021581Jul 1996WO
WO1996034365Oct 1996WO
WO1996038319Dec 1996WO
WO1997001246Jan 1997WO
WO1997029926Aug 1997WO
WO1997035743Oct 1997WO
WO1997048134Dec 1997WO
WO1998010246Mar 1998WO
WO1998014974Apr 1998WO
WO1999043242Feb 1999WO
WO1999023828May 1999WO
WO1999059100Nov 1999WO
WO2000015462Mar 2000WO
WO2001026332Apr 2001WO
WO2001039018May 2001WO
WO2001039120May 2001WO
WO2001064481Sep 2001WO
WO2001070538Sep 2001WO
WO2001077763Oct 2001WO
WO2001080068Oct 2001WO
WO2001080353Oct 2001WO
WO2002071487Sep 2002WO
WO2003065084Aug 2003WO
WO2003093857Nov 2003WO
WO2004004320Jan 2004WO
WO2004005073Jan 2004WO
WO2005098751Oct 2005WO
WO2005098782Oct 2005WO
WO2008134715Nov 2008WO
WO2013121357Aug 2013WO

Other References


"All-seeing screens for tomorrow's cars", Southend Evening Echo, Oct. 4, 1991. cited by applicant .
"Final Report of the Working Group on Advanced Vehicle Control Systems (AVCS)" Mobility 2000, Mar. 1990. cited by applicant .
"Magic Eye on safety", Western Daily Press, Oct. 10, 1991. cited by applicant .
"On-screen technology aims at safer driving", Kent Evening Post Oct. 4, 1991. cited by applicant .
"Versatile LEDs Drive Machine vision in Automated Manufacture," http://www.digikey.ca/en/articles/techzone/2012/jan/versatileleds-drive-m- achine-vision-in-automated-manufacture. cited by applicant .
3M, "Automotive Rear View Mirror Button Repair System", Automotive Engineered Systems Division, Jun. 1996. cited by applicant .
Abshire et al., "Confession Session: Learning from Others Mistakes," 2011 IEEE International Symposium on Circuits and Systems (ISCAS), 2011. cited by applicant .
Achler et al., "Vehicle Wheel Detector using 2D Filter Banks," IEEE Intelligent Vehicles Symposium of Jun. 2004. cited by applicant .
Ackland et al., "Camera on a chip", Digest of Technical Papers of the 42nd Solid-State Circuits Conference (ISSCC), Paper TA 1.2, 1996. cited by applicant .
Alley, "Algorithms for automatic guided vehicle navigation and guidance based on Linear Image Array sensor data", Masters or PhD. Thesis, Dec. 31, 1988. cited by applicant .
Altan, "LaneTrak: a vision-based automatic vehicle steering system", Applications in Optical Science and Engineering. International Society for Optics and Photonics, 1993, Abstract. cited by applicant .
Amidi, "Integrated Mobile Robot Control", M.S. Thesis, Carnegie Mellon University, May 1990. cited by applicant .
An et al., "Aspects of Neural Networks in Intelligent Collision Avoidance Systems for Prometheus", JFIT 93, pp. 129-135, Mar. 1993. cited by applicant .
Arain et al., "Action planning for the collision avoidance system using neural networks", Intelligent Vehicle Symposium, Tokyo, Japan, Jul. 1993. cited by applicant .
Arain et al., "Application of Neural Networks for Traffic Scenario Identification", 4th Prometheus Workshop, University of Compiegne, Paris, France, pp. 102-111, Sep. 1990. cited by applicant .
Ashley, "Smart Cars and Automated Highways", Mechanical Engineering 120.5 (1998): 58, Abstract. cited by applicant .
Aufrere et al., "A model-driven approach for real-time road recognition", Machine Vision and Applications 13, 2001, pp. 95-107. cited by applicant .
Auty et al., "Image acquisition system for traffic monitoring applications" IS&T/SPIE's Symposium on Electronic Imaging: Science & Technology. International Society for Optics and Photonics, Mar. 14, 1995. cited by applicant .
Aw et al., "A 128 x 128 Pixel Standard-CMOS Image Sensor with Electronic Shutter," IEEE Journal of Solid-State Circuits, vol. 31, No. 12, Dec. 1996. cited by applicant .
Ballard et al., "Computer Vision", 1982, p. 88-89, sect. 3.4.1. cited by applicant .
Barron et al., "The role of electronic controls for future automotive mechatronic systems", IEEE/ASME Transactions on mechatronics 1.1, Mar. 1996, pp. 80-88. cited by applicant .
Batavia et al., "Overtaking vehicle detection using implicit optical flow", Proceedings of the IEEE Transportation Systems Conference, Nov. 1997, pp. 729-734. cited by applicant .
Batavia, "Driver-Adaptive Lane Departure Warning Systems", The Robotics Institute Carnegie Mellon University Pittsburgh, Pennsylvania, 15213, Sep. 20, 1999. cited by applicant .
Bederson, "A miniature Space-Variant Active Vision System: Cortex-I", Masters or Ph.D. Thesis, Jun. 10, 1992. cited by applicant .
Begault, "Head-Up Auditory Displays for Traffic Collision Avoidance System Advisories: A Preliminary Investigation", Human Factors, 35(4), Dec. 1993, pp. 707-717. cited by applicant .
Behringer et al., "Simultaneous Estimation of Pitch Angle and Lane Width from the Video Image of a Marked Road," pp. 966-973, Sep. 12-16, 1994. cited by applicant .
Behringer, "Road recognition from multifocal vision", Intelligent Vehicles' 94 Symposium, Proceedings of the. IEEE, 1994, Abstract. cited by applicant .
Belt et al., "See-Through Turret Visualization Program", No. NATICK/TR-02/005. Honeywell Inc., Minn, MN Sensors and Guidance Products, 2002. cited by applicant .
Bensrhair et al., "A cooperative approach to vision-based vehicle detection" Intelligent Transportation Systems, IEEE, 2001. cited by applicant .
Bertozzi et al., "Obstacle and lane detection on ARGO", IEEE Transactions on Image Processing, 7(1):62-81, Jan. 1998, pp. 62-81. cited by applicant .
Bertozzi et al., "Performance analysis of a low-cost solution to vision-based obstacle detection", Intelligent Transportation Systems, 1999. Proc., Oct. 5-8, 1999, pp. 350-355. cited by applicant .
Bertozzi et al., "Vision-based intelligent vehicles: State of the art and perspectives" Robotics and Autonomous Systems, 32, 2000 pp. 1-16. cited by applicant .
Bertozzi et al., "Gold: a parallel real-time stereo vision system for generic obstacle and lane detection", IEEE transactions on image processing 7.1 (1998): 62-81. cited by applicant .
Betke et al., "Real-time multiple vehicle detection and tracking from a moving vehicle", Machine Vision and Applications, 2000. cited by applicant .
Beucher et al., "Road Segmentation and Obstacle Detection by a Fast Watershed Transformation", Intelligent Vehicles' 94 Symposium, Proceedings of the. IEEE, 1994. cited by applicant .
Blomberg et al., "NightRider Thermal Imaging Camera and HUD Development Program for Collision Avoidance Applications", Raytheon Commercial Infrared and ELCAN-Texas Optical Technologies, 2000, Abstract. cited by applicant .
Borenstein et al., "Where am I? Sensors and Method for Mobile Robot Positioning", University of Michigan, Apr. 1996, pp. 2, 125-128. cited by applicant .
Bosch, "Can Specification", Version 2.0, Sep. 1991. cited by applicant .
Bow, "Pattern Recognition and Image Preprocessing (Signal Processing and Communications)", CRC Press, Jan. 15, 2002, pp. 557-559. cited by applicant .
Brackstone et al., "Dynamic Behavioral Data Collection Using an Instrumented Vehicle", Transportation Research Record: Journal of the Transportation Research Board, vol. 1689, Paper 99/2535, 1999. cited by applicant .
Brandt, "A CRT Display System for a Concept Vehicle", SAE Paper No. 890283, published Feb. 1, 1989. cited by applicant .
Brauckmann et al., "Towards all around automatic visual obstacle sensing for cars", Intelligent Vehicles' 94 Symposium, Proceedings of the. IEEE, 1994. cited by applicant .
Britell et al., "Collision avoidance through improved communication between tractor and trailer" Proceedings: International Technical Conference on the Enhanced Safety of Vehicles. vol. 1998. National Highway Traffic Safety Administration, 1998. cited by applicant .
Broggi et al., "ARGO and the MilleMiglia in Automatico Tour", IEEE Intelligent Systems, Jan.-Feb. 1999, pp. 55-64. cited by applicant .
Broggi et al., "Architectural Issues on Vision-based automatic vehicle guidance: the experience of the ARGO Project", Academic Press, 2000. cited by applicant .
Broggi et al., "Automatic Vehicle Guidance: The Experience of the ARGO Vehicle", World Scientific Publishing Co., 1999. cited by applicant .
Broggi et al., "Multi-Resolution Vehicle Detection using Artificial Vision," IEEE Intelligent Vehicles Symposium of Jun. 14-17, 2004. cited by applicant .
Broggi et al., "Vision-based Road Detection in Automotive Systems: A real-time expectation-driven approach", Journal of Artificial Intelligence Research, 1995. cited by applicant .
Broggi, "Robust Real-time Lane and Road Detection in Critical Shadow Conditions", International Symposium on Computer Vision, IEEE, 1995, pp. 21-23. cited by applicant .
Brown, "A Survey of Image Registration Techniques", vol. 24, ACM Computing Surveys, pp. 325-376, Dec. 4, 1992. cited by applicant .
Brown, "Scene Segmentation and Definition for Autonomous Robotic Navigation Using Structured Light Processing", Doctoral Dissertation, University of Delaware, Army Science Conference Proceedings, Jun. 22-25, 1992, vol. 1, Dec. 31, 1988, pp. 189-203, Abstract. cited by applicant .
Brunelli et al., "Template Matching: Matched Spatial Filters and Beyond," Pattern Recognition, vol. 30, No. 5, 1997. cited by applicant .
Bucher et al., "Image processing and behavior planning for intelligent vehicles", IEEE Transactions on Industrial electronics 50.1 (2003): 62-75. cited by applicant .
Burger et al., "Estimating 3-D Egomotion from Perspective Image Sequences", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, No. 11, pp. 1040-1058, Nov. 1990. cited by applicant .
Burt et al., "A Multiresolution Spline with Application to Image Mosaics", ACM Transactions on Graphics, vol. 2. No. 4, pp. 217-236, Oct. 1983. cited by applicant .
Cardiles, "Implementation de la commande d'un vehicule electrique autonome grace a un capteur de distance et d'angle base sur une camera lineaire" IUP de Mathematiques Appliquees et Industrielles, May 8, 1998. cited by applicant .
Carley et al., "Synthesis Tools for Mixed-Signal ICs: Progress on Frontend and Backend Strategies," Proceedings of the 33rd Design Automation Conference, 1996. cited by applicant .
Cartledge, "Jaguar gives cat more lives", Birmingham Post, Oct. 10, 1991. cited by applicant .
Cassiano et al., "Review of filtering methods in mobile vision from ground vehicles in low light conditions", Proc. SPIE 1613, Mobile Robots VI, 322, Feb. 14, 1992. cited by applicant .
Chapuis et al., "Road Detection and Vehicles Tracking by Vision for an On-Board ACC System in the VELAC Vehicle", 2000. cited by applicant .
Charkari et al., "A new approach for real time moving vehicle detection", Proceedings of the 1993 IEEE/RSJ International Conference on Intelligent Robots and Systems, Yokohama, JP, Jul. 26-30, 1993. cited by applicant .
Chern et al., "The lane recognition and vehicle detection at night for a camera-assisted car on highway", Robotics and Automation, 2003. Proceedings. ICRA'03. IEEE International Conference on. vol. 2. IEEE, 2003, Abstract. cited by applicant .
Chien et al., "Efficient moving object segmentation algorithm using background registration technique", IEEE Transactions on Circuits and Systems for Video Technology, vol. 12., No. 7, Jul. 2002. cited by applicant .
Clune et al., "Implementation and performance of a complex vision system on a systolic array machine", Carnegie Mellon University, Jun. 15, 1987. cited by applicant .
CMOS sensor page of University of Edinburgh, 2015. cited by applicant .
Coghill, "Digital Imaging Technology 101", Albert Theuwissen, Dalsa Corp, 2003. cited by applicant .
Coifman et al., "A real-time computer vision system for vehicle tracking and traffic surveillance", Transportation Research Part C 6, pp. 271-288, 1998. cited by applicant .
Corsi, "Reconfigurable Displays Used as Primary Automotive Instrumentation", SAE Paper No. 890282, published Feb. 1, 1989. cited by applicant .
Crisman et al., "Color Vision for Road Following", Robotics Institute at Carnegie Mellon University, Proceedings of SPIE Conference on Mobile Robots Nov. 11, 1988, pp. 1-10, Oct. 12, 1988. cited by applicant .
Crisman et al., "UNSCARF, A Color Vision System for the Detection of Unstructured Roads" IEEE Paper 1991. cited by applicant .
Crisman et al., "Vision and Navigation--The Carnegie Mellon Navlab" Carnegie Mellon University, edited by Charles E. Thorpe, 1990. cited by applicant .
Crisman, "SCARF: Color vision system that tracks roads and intersections", IEEE, 1993. cited by applicant .
Crossland, "Beyond Enforcement: In-Car Video Keeps Officers on the Streets", Traffic technology international. Annual review, 1998, Abstract. cited by applicant .
Cucchiara et al., "Vehicle Detection under Day and Night Illumination", Proceedings of 3rd International ICSC Symposium on Intelligent Industrial Automation (IIA 99), 1999. cited by applicant .
Cucchiara et al., "Detecting moving objects, ghosts, and shadows in video streams", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 25, No. 10, 2003. cited by applicant .
Cucchiara et al., "Improving Shadow Suppression in Moving Object Detection with HSV Color Information", Proceeding of IEEE International Conference on Intelligent Transportation Systems, 2001. cited by applicant .
Curry et al., "The Lancashire telemedicine ambulance", Journal of Telemedicine and telecare 4.4 (1998): 231-238, Dec. 1, 1998, Abstract. cited by applicant .
Dagan et al., "Forward collision warning with a single camera", IEEE Intelligent Vehicles Symposium, 2004. cited by applicant .
Dally et al., "Digital Systems Engineering", The University of Cambridge, United Kingdom, 1998. cited by applicant .
Davis et al., "Road Boundary Detection for Autonomous Vehicle Navigation", Optical Engineering, vol. 25, No. 3, Mar. 1986, pp. 409-414. cited by applicant .
Davis, "Vision-Based Navigation for Autonomous Ground Vehicles" Defense Advanced Research Projects Agency, Jul. 18, 1988. cited by applicant .
De la Escalera et al., "Neural traffic sign recognition for autonomous vehicles" IEEE, 1994. cited by applicant .
De la Escalera et al., "Traffic sign recognition and analysis for intelligent vehicles", Division of Systems Engineering and Automation, Madrid, Spain, 2003. cited by applicant .
Decision--Motions--Bd. R. 125(a), issued Aug. 29, 2006 in connection with Interference No. 105,325, which involved U.S. patent application U.S. Appl. No. 09/441,341, filed Nov. 16, 1999 by Schofield et al. and U.S. Pat. No. 5,837,994, issued to Stam et al. cited by applicant .
DeFauw, "A System for Small Target Detection, Tracking, and Classification, Intelligent Transportation System", Intelligent Transportation Systems, 1999. Proceedings. 1999 IEEE/IEEJ/JSAI International Conference on. IEEE, 1999, Abstract. cited by applicant .
Denes et al., "Assessment of driver vision enhancement technologies," Proceedings of SPIE: Collusion Avoidance and Automated Traffic Management Sensors, vol. 2592, Oct. 1995. cited by applicant .
DeNuto et al., "LIN Bus and its Potential for use in Distributed Multiplex Applications", SAE Technical Paper 2001-01-0072, Mar. 5-8, 2001. cited by applicant .
Denyer et al., "On-Chip CMOS Sensors for VLSI Imaging Systems", Dept. of Elect. Engineering, University of Edinburgh, pp. 4b1.1-4b1.5, 1991. cited by applicant .
Derutin et al., "Real-time collision avoidance at road-crossings on board the Prometheus-ProLab 2 vehicle", Intelligent Vehicles' 94 Symposium, Proceedings of the. IEEE, 1994, Abstract. cited by applicant .
Devlin, "The Eyellipse and Considerations in the Driver's Forward Field of View," Society of Automotive Engineers, Inc., Detroit, MI, Jan. 8-12, 1968. cited by applicant .
Dickinson et al., "CMOS Digital Camera with Parallel Analog-to-Digital Conversion Architecture", Apr. 1995. cited by applicant .
Dickmanns et al., "A Curvature-based Scheme for Improving Road Vehicle Guidance by Computer Vision," University of Bundeswehr Munchen, 1986. cited by applicant .
Dickmanns et al., "Recursive 3-D road and relative ego-state recognition," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, No. 2, Feb. 1992. cited by applicant .
Dickmanns et al., "An integrated spatio-temporal approach to automatic visual guidance of autonomous vehicles," IEEE Transactions on Systems, Man, and Cybernetics, vol. 20, No. 6, Nov./Dec. 1990. cited by applicant .
Dickmanns, "Vehicles Capable of Dynamic Vision", Aug. 23, 1997. cited by applicant .
Dickmanns, "4-D dynamic vision for intelligent motion control", Universitat der Bundeswehr Munich, 1991. cited by applicant .
Dickmanns et al., "The seeing passenger car `VaMoRs-P`", Oct. 24, 1994. cited by applicant .
Dingus et al., "TRAVTEK Evaluation Task C3--Camera Car Study" Final Report/ 9-92 to 5-94. Jun. 1995. cited by applicant .
Donnelly Panoramic Vision™ on Renault Talisman Concept Car at Frankfurt Motor Show, PR Newswire, Frankfurt, Germany, Sep. 10, 2001. cited by applicant .
Doudoumopoulos et al., "CMOS Active Pixel Sensor Technology for High Performance Machine Vision Applications," SME Applied Machine Vision '96--Emerging Smart Vision Sensors, Jun. 1996. cited by applicant .
Draves, "A Video Graphics Controller for Reconfigurable Automotive Displays", No. 970193. SAE Technical Paper Feb. 24, 1997, Abstract. cited by applicant .
Dubrovin et al., "Application of real-time lighting simulation for intelligent front-lighting studies", 2000, pp. 333-343. cited by applicant .
Dubuisson-Jolly, "Vehicle segmentation and classification using deformable templates", IEEE Transactions on Pattern Analysis and Machine Intelligence, Mar. 1996. cited by applicant .
Easton, "Jaguar Adapts Pilot's Night Sights for safer driving", The Times, Sep. 28, 1991. cited by applicant .
Eaton, "Video Incident Capture System", Technical Memorandum, OIC General Enforcement Branch, Sep. 1991. cited by applicant .
Eaton, "An RS-170 Camera for the Military Environment", Proc. SPIE 0979, Airborne Reconnaissance XII, Feb. 23, 1989, Abstract. cited by applicant .
Eid et al., "A 256 × 256 CMOS Active Pixel Image Sensor," Proceedings of SPIE: Charge-Coupled Devices and Solid State Optical Sensors V, vol. 2415, 1995. cited by applicant .
Elwell et al., "Near Infrared Spectroscopy," accessed at http://www.ucl.ac.uk/medphys/research/borl/intro/nirs, Jan. 6, 1999. cited by applicant .
Ernst et al., "Camera calibration for lane and obstacle detection", Intelligent Transportation Systems, 1999, pp. 356-361. cited by applicant .
Fancher et al. "Intelligent Cruise Control Field Operational Test (Final Report)", Final Report, vol. I: Technical Report, May 1998. cited by applicant .
Fancher et al., "Fostering Development, Evaluation, and Deployment of Forward Crash Avoidance Systems (FOCAS)" Annual Research Report DOT HS 808 437, May 1995. cited by applicant .
Ferryman et al., "Visual Surveillance for Moving Vehicles", Secure Project, 2000. cited by applicant .
Fletcher, "CMOS light-sensor process makes possible low-cost smart machine-vision systems" Penton Media, Inc. et al., 1993. cited by applicant .
Forsyth, "A System for Finding Changes in Colour", Oxford University, Jul. 23, 1987. cited by applicant .
Fossum, "Active Pixel Sensors: Are CCD's dinosaurs?" Proceedings of SPIE, Charge-Coupled Devices and Solid-State Optical Sensors III, vol. 1900, 1993. cited by applicant .
Fossum, "CMOS Active Pixel Sensor (APS) Technology for Multimedia Image Capture," 1997 Multimedia Technology & Applications Conference (MTAC97), 1997. cited by applicant .
Fossum, "Low power camera-on-a-chip using CMOS active pixel sensor technology", 1995 Symposium on Low Power Electronics, San Jose, CA, Oct. 9-10, 1995. cited by applicant .
Fowler et al., "A CMOS Area Image Sensor With Pixel-Level A/D Conversion," Digest of Technical Papers of the 41st Solid-State Circuits Conference (ISSCC), 2001. cited by applicant .
Franke et al., "Autonomous driving approaches downtown", IEEE Intelligent Systems, vol. 13, Nr. 6, 1999. cited by applicant .
French et al., "A comparison of IVHS progress in the United States, Europe, and Japan", IVHA America, Dec. 31, 1993. cited by applicant .
Fujimori, "CMOS Passive Pixel Imager Design Techniques", Massachusetts Institute of Technology, Ph.D. Dissertation for Electrical Engineering and Computer Science, Feb. 2002. cited by applicant .
Fung et al., "Effective moving cast shadow detection for monocular color image sequences", The 11th International Conference on Image Analysis and Processing Proceedings, Palermo, Italy, Sep. 26-28, 2001, pp. 404-409. cited by applicant .
Gat et al., "A Monocular Vision Advance Warning System for the Automotive Aftermarket", Aftermarket SAE World Congress & Exhibition, No. 2005-1-1470. SAE Technical Paper, Jan. 1, 2005. cited by applicant .
Gavrila et al., "Real-Time Vision for Intelligent Vehicles" IEEE Instrumentation & Measurement Magazine, Jun. 2001, pp. 22-27. cited by applicant .
Gavrila, et al., "Real-time object detection for "smart" vehicles", 1999. cited by applicant .
Geary et al., "Passive Optical Lane Position Monitor" Idea Project Final Report Contract ITS-24, Jan. 15, 1996. cited by applicant .
Gehrig, "Design, simulation, and implementation of a vision-based vehicle- following system" Doctoral Dissertation, Jul. 31, 2000. cited by applicant .
GEM Muon Review Meeting--SSCL Abstract; GEM TN-03-433, Jun. 30, 1993. cited by applicant .
Goesch et al., "The First Head Up Display Introduced by General Motors", SAE Paper No. 890288, published Feb. 1, 1989. cited by applicant .
Goldbeck et al., "Lane detection and tracking by video sensors" Intelligent Transportation Systems, 1999. Proc., Oct. 5-8, 1999. cited by applicant .
Graefe et al., "Dynamic Vision for Precise Depth Measurement and Robot Control", Computer Vision for Industry, Jun. 1993. cited by applicant .
Graefe, "Vision for Intelligent Road Vehicles", Universitat de Bundeswehr Muchen, 1993, pp. 135-140. cited by applicant .
Greene et al., "Creating Raster Omnimax Images from Multiple Perspective Views Using the Elliptical Weighted Average Filter", IEEE Computer Graphics and Applications, vol. 6, No. 6, pp. 21-27, Jun. 1986. cited by applicant .
Gruss et al., "Integrated sensor and range-finding analog signal processor", IEEE Journal of Solid-State Circuits, vol. 26, No. 3, Mar. 1991. cited by applicant .
Gumkowski et al., "Reconfigurable Automotive Display System", SAE Paper No. 930456 to Gumkowski, published Mar. 1, 1993. cited by applicant .
Hall, "Why I Dislike auto-Dimming Rearview Mirrors," accessed at http://blog.consumerguide.com/why-i-dislike-autodimming-rearview-mirrors/-, Dec. 21, 2012. cited by applicant .
Hamit, "360-Degree Interactivity: New Video and Still Cameras Provide a Global Roaming Viewpoint", Advanced Imaging, Mar. 1997, p. 50. cited by applicant .
Haritaoglu et al., "W4: Real-Time Surveillance of People and Their Activities", IEEE Transactions Patter Analysis and Machine Intelligence, vol. 22, No. 8, Aug. 2000. cited by applicant .
Hebert et al., "3-D Vision Techniques for Autonomous Vehicles", Defense Advanced Research Projects Agency, Carnegie Mellon University, Feb. 1, 1988. cited by applicant .
Hebert et al., "Local Perception for Mobile Robot Navigation in Natural Terrain: Two Approaches", The Robotics Institute, Carnegie Mellon University, Abstract; Workshop on Computer Vision for Space Applications, Antibes, Sep. 22-24, 1993, pp. 24-31. cited by applicant .
Hebert, "Intelligent unmanned ground vehicles: autonomous navigation research", Carnegie Mellon (Kluwer Academic Publishers), Boston, 1997, Excerpt. cited by applicant .
Herbert et al., "3-D Vision Techniques for Autonomous Vehicles", Technical Report, Carnegie Mellon University, Aug. 1988. cited by applicant .
Hess et al., "A Control Theoretic Model of Driver Steering Behavior," IEEE Control Systems Magazine, vol. 10, No. 5, Aug. 1990, pp. 3-8. cited by applicant .
Hessburg et al., "An Experimental Study on Lateral Control of a Vehicle," California Partners for Advanced Transit and Highways (PATH), Jan. 1, 1991. cited by applicant .
Hillebrand et al., "High speed camera system using a CMOS image sensor", IEEE Intelligent Vehicles Symposium., Oct. 3-5, 1999, pp. 656-661, Abstract. cited by applicant .
Ho et al., "Automatic spacecraft docking using computer vision-based guidance and control techniques", Journal of Guidance, Control, and Dynamics, vol. 16, No. 2 Mar.-Apr. 1993. cited by applicant .
Hock et al., "Intelligent Navigation for Autonomous Robots Using Dynamic Vision", XVIIth ISPRS Congress, pp. 900-915, Aug. 14, 1992. cited by applicant .
Holst, "CCD Arrays, Cameras, and Displays", Second Edition, Bellingham, WA: SPIE Optical Engineering Press, 1998; pp. v-xxiii, 7-12, 45-101, and 176-179, excerpts. cited by applicant .
Honda Worldwide, "Honda Announces a Full Model Change for the Inspire." Jun. 18, 2003. cited by applicant .
Horprasert et al., "A Statistical Approach for Real-Time Robust Background Subtraction and Shadow Detection", Proceeding of IEEE International Conference on Computer vision Frame--Rate Workshop, 1999. cited by applicant .
Hsieh et al., "Shadow elimination for effective moving object detection by Gaussian shadow modeling", Image and Vision Computing, vol. 21, No. 6, 505-516, 2003. cited by applicant .
Hsieh et al., "A shadow elimination method for vehicle analysis", Proceeding of IEEE International Conference on Pattern Recognition, vol. 4, 2004. cited by applicant .
Hu et al., "Action-based Road Horizontal Shape Recognition", SBA Controle & Automacao, vol. 10, No. 2, May 1999. cited by applicant .
Huertgen et al., "Vehicle Environment Sensing by Video Sensors", No. 1999-01-0932. SAE Technical Paper, 1999, Abstract. cited by applicant .
Huijsing, "Integrated smart sensors", Sensors and Actuators A, vol. 30, Issues 1-2, pp. 167-174, Jan. 1992. cited by applicant .
Hutber et al., "Multi-sensor multi-target tracking strategies for events that become invisible" BMVC '95 Proc. of the 6th British conference on Machine vision, V2, 1995, pp. 463-472. cited by applicant .
IEEE 100--The Authoritative Dictionary of IEEE Standards Terms, 7th Ed. (2000). cited by applicant .
Ientilucci, "Synthetic Simulation and Modeling of Image Intensified CCDs (IICCD)", Master Thesis for Rochester Inst. of Tech., Mar. 31, 2000. cited by applicant .
Ishida et al., "Development of a Driver Assistance System", No. 2003-01-0279. SAE Technical Paper, 2002, Abstract. cited by applicant .
Ishihara et al., "Interline CCD Image Sensor with an Anti Blooming Structure," IEEE International Solid-State Circuits Conference, Session XIII: Optoelectronic Circuits, THPM 13.6, Feb. 11, 1982. cited by applicant .
Ishikawa et al., "Visual Navigation of an Autonomous Vehicle Using White Line Recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, 1988, Abst. cited by applicant .
Jaguar Press Releases Autumn 1991 "Jaguar Displays 21st Century Car Technologies", Jaguar Communications & Public Affairs Dept. cited by applicant .
Janssen et al., "Hybrid Approach for Traffic Sign Recognition", Program for a European Traffic with Highest Efficiency and Unprecedented Safety, Nov. 28, 1993. cited by applicant .
Japanese Article "Television Image Engineering Handbook", The Institute of Television Engineers of Japan, Jan. 17, 1981. cited by applicant .
Jochem et al., "PANS: a portable navigation platform", 1995 IEEE Symposium on Intelligent Vehicles, Detroit, MI, Sep. 25-26, 1995. cited by applicant .
Jochem et al., "Life in the Fast Lane", Al Magazine, vol. 17, No. 2, pp. 11-50, Summer 1996. cited by applicant .
Johannes, "A New Microchip Ushers in Cheaper Digital Cameras", The Wall Street Journal, Aug. 21, 1998, p. B1. cited by applicant .
Johnson, "Georgia State Patrol's In-Car Video System", Council of State Governments, 1992, Abstract. cited by applicant .
Juberts et al., "Development and Test Results for a Vision-Based Approach to AVCS." in Proceedings of the 26th International Symposium on Automotive Technology and Automation, Aachen, Germany, Sep. 1993, pp. 1-9. cited by applicant .
Kakinami et al., "Autonomous Vehicle Control System Using an Image Processing Sensor", No. 950470. SAE Technical Paper, Feb. 1, 1995, Abstract. cited by applicant .
Kan et al., "Model-based vehicle tracking from image sequences with an application to road surveillance," Purdue University, XP000630885, vol. 35, No. 6, Jun. 1996. cited by applicant .
Kang et al., "High Dynamic Range Video", ACM Transactions on Graphics, vol. 22, No. 3, 2003. cited by applicant .
Kassel, "Lunokhod-1 Soviet Lunar Surface Vehicle", Advanced Research Projects Agency, ARPA Order No. 189-1, Dec. 9, 1971. cited by applicant .
Kastrinaki et al., "A survey of video processing techniques for traffic applications", Image and Vision Computing 21, 2003. cited by applicant .
Kehtarnavaz et al., "Traffic sign recognition in noisy outdoor scenes", 1995. cited by applicant .
Kehtarnavaz, "Visual control of an autonomous vehicle (BART)-the vehicle-following problem", IEEE Transactions on Vehicular Technology, Aug. 31, 1991, Abstract. cited by applicant .
Kemeny et al., "Multiresolution Image Sensor," IEEE Transactions on Circuits and Systems for Video Technology, vol. 7, No. 4, Aug. 1997. cited by applicant .
Kenue et al., "LaneLok: Robust Line and Curve Fitting of Lane Boundaries", Applications in Optical Science and Engineering, International Society for Optics and Photonics, 1993, Abstract. cited by applicant .
Kenue, "Lanelok: Detection of Lane Boundaries and Vehicle Tracking Using Image-Processing Techniques," SPIE Conference on Mobile Robots IV, 1989. cited by applicant .
Kidd et al., "Speed Over Ground Measurement", SAE Technical Paper Series, No. 910272, pp. 29-36, Feb.-Mar. 1991. cited by applicant .
Kiencke et al., "Automotive Serial controller Area Network," SAE Technical Paper 860391, 1986, retrieved from http://papers.sae.org/860391/, accessed Mar. 20, 2015. cited by applicant .
Klassen et al., "Sensor Development for Agricultural Vehicle Guidance", No. 932427. SAE Technical Paper, 1993, Abstract. cited by applicant .
Kluge et al., "Representation and Recovery of Road Geometry in YARF," Carnegie Mellon University, Proceedings of the IEEE, pp. 114-119, 1992. cited by applicant .
Knipling, "IVHS Technologies Applied to Collision Avoidance: Perspectives on Six Target Crash Types and Countermeasures," Technical Paper presented at Safety & Human Factors session of 1993 IVHS America Annual Meeting, Apr. 14-17, 1993, pp. 1-22. cited by applicant .
Knipling et al., "Vehicle-Based Drowsy Driver Detection: Current Status and Future Prospects," IVHS America Fourth Annual Meeting, Atlanta, GA, Apr. 17-20, 1994, pp. 1-24. cited by applicant .
Koller et al., "Binocular Stereopsis and Lane Marker Flow for Vehicle Navigation: Lateral and Longitudinal Control," University of California, Mar. 24, 1994. cited by applicant .
Kowalick, "Proactive use of highway recorded data via an event data recorder (EDR) to achieve nationwide seat belt usage in the 90th percentile by 2002" "Seat belt event data recorder (SB-EDR)"Transportation Recording: 2000 and Beyond., May 3-5, 1999, pp. 173-198, 369. cited by applicant .
Kozlowski et al., "Comparison of Passive and Active Pixel Schemes for CMOS Visible Imagers," Proceedings of SPIE Conference on Infrared Readout Electronics IV, vol. 3360, Apr. 1998. cited by applicant .
Krotkov, "An agile stereo camera system for flexible image acquisition", IEEE Journal on Robotics and Automation, Feb. 18, 1988. cited by applicant .
Kuan et al., "Autonomous Robotic Vehicle Road Following", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 10, No. 5, Sep. 1988, pp. 648-658, Abstract. cited by applicant .
Kuehnle, "Symmetry-based recognition of vehicle rears", Pattern Recognition Letters 12, North-Holland, 1991. cited by applicant .
Kuhnert, "A vision system for real time road and object recognition for vehicle guidance," in Proc. SPIE Mobile Robot Conf., Cambridge, MA, Oct. 1986, pp. 267-272. cited by applicant .
Kweon et al., "Behavior-Based Intelligent Robot in Dynamic Indoor Environments", Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Jul. 7-10, 1992. cited by applicant .
Lasky et al., "Automated Highway Systems (AHS) Classification by Vehicle and Infrastructure", AHMT Research Report, Jan. 25, 1994. cited by applicant .
Leachtenauer, "Resolution requirements and the Johnson criteria revisited," Proceedings of SPIE, Infrared Imaging Systems: Design, Analysis, Modeling and Testing XIV, vol. 5076, 2003. cited by applicant .
LeBlanc et al., "CAPC: A Road-Departure Prevention System", IEEE, Dec. 1996, pp. 61-71. cited by applicant .
Lee et al., "Automatic recognition of a car license plate using color image processing", IEEE, Nov. 16, 1994. cited by applicant .
Lee, "How to Select a Heat Sink", Electronics Cooling Magazine, Jun. 1, 1995. cited by applicant .
Leen et al., "Digital networks in the automotive vehicle", Dec. 1999. cited by applicant .
Lezin, "Video Gear in Police Cruisers Gets Mixed Reviews Critics Say It Violates Privacy Rights and Inhibits Officers From Doing Their Jobs Well", Mar. 17, 1997. cited by applicant .
Linkwitz, "High Precision Navigation: Integration of Navigational and Geodetic Methods," Springer-Verlag, Jul. 5, 1989, Excerpt. cited by applicant .
Lisowski et al., "Specification of a small electric vehicle: modular and distributed approach," IEEE 1997, pp. 919-924. cited by applicant .
Litkouhi et al., "Estimator and Controller Design for LaneTrak, a Vision-Based Automatic Vehicle Steering System," Proceedings of the 32nd Conference on Decision and Control, San Antonio, Texas, Dec. 1993, pp. 1868-1873. cited by applicant .
Litwiller, "CCD vs. CMOS: Facts and Fiction," Photonics Spectra, Jan. 2001. cited by applicant .
Liu Xianghong, "Development of a vision-based object detection and recognition system for intelligent vehicle", 2000. cited by applicant .
Lockwood, "Design of an obstacle avoidance system for automated guided vehicles", Doctoral thesis, University of Huddersfield, Oct. 1991. cited by applicant .
Lowenau et al., "Adaptive light control a new light concept controlled by vehicle dynamics and navigation", SAE Technical Paper Series, Feb. 23-26, 1998. cited by applicant .
Lu et al., "On-chip Automatic Exposure Control Technique", Solid-State Circuits Conference, ESSCIRC '91, Proceedings--17th European (vol. 1), Abstract, Sep. 11-13, 1991. cited by applicant .
Lucas Demonstrates Intelligent Cruise Control, Detroit Feb. 27, 1995 available at http://www.thefreelibrary.com/Lucas+Demonstrates+Intelligent+Cuise+Control=a016602459. cited by applicant .
Luebbers et al., "Video-image-based neural network guidance system with adaptive view-angles for autonomous vehicles", Applications of Artificial Neural Networks II. International Society for Optics and Photonics, 1991, Abstract. cited by applicant .
Lumia, "Mobile system for measuring retroreflectance of traffic signs", Optics, Illumination, and Image Sensing for Machine Vision, Mar. 1, 1991, Abstract. cited by applicant .
Mackey et al., "Digital Eye-Witness Systems", Transportation Recording: 2000 and Beyond, May 3-5, 1999, 271-284. cited by applicant .
Malik et al., "A Machine Vision Based System for Guiding Lane-change Maneuvers", California Path Program, Institute of Transportation Studies, University of California, Berkeley, Sep. 1995. cited by applicant .
Manigel et al., "Computer control of an autonomous road vehicle by computer vision" Industrial Electronics, Control and Instrumentation, Proceedings. IECON '91, 1991 International Conference on, p. 19-24 vol. 1, 1991. cited by applicant .
Manigel et al., "Vehicle control by computer vision," Industrial Electronics, IEEE Transactions on, vol. 39, Issue 3, 181-188, Jun. 1992. cited by applicant .
Martel-Brisson et al., "Moving cast shadow detection from a Gaussian mixture shadow model", Proceeding of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 2, 2005. cited by applicant .
Masaki, "Vision-based vehicle guidance", Industrial Electronics, Control, Instrumentation, and Automation, 1992. Power Electronics and Motion Control, Proceedings of the 1992 International Conference on. IEEE, 1992. cited by applicant .
Mason et al., "The Golem Group I UCLA Autonomous Ground Vehicle in the DARPA Grand Challenge", Jun. 12, 2006. cited by applicant .
Matthews, "Visual Collision Avoidance," Oct. 1994, University of Southampton, PhD submission. cited by applicant .
Maurer et al., "VaMoRs-P: an advanced platform for visual autonomous road vehicle guidance", 1995. cited by applicant .
Maurer, "Flexible Automatisierung von StraBenfahrzeugen mit Rechnersehen" Universitat der Buneswehr Milnchen Dissertation, Jul. 27, 2000. cited by applicant .
MC68331 User's Manual, Freescale Semiconductor, Inc., 1994. cited by applicant .
McKenna et al., "Tracking Groups of People", Computer Vision and Image Understanding, vol. 80, p. 42-56, 2000. cited by applicant .
McTamaney, "Mobile Robots Real-Time Intelligent Control", FMC Corporation, Winter 1987. cited by applicant .
Mei Chen et al., "AURORA: A Vision-Based Roadway Departure Warning System", The Robotics Institute, Carnegie Mellon University, published Aug. 5-9, 1995. cited by applicant .
Mendis et al., "A 128.times.128 CMOS active pixel image sensor for highly integrated imaging systems", Dec. 8, 1993. cited by applicant .
Mendis et al., "CMOS Active Pixel Image Sensor," IEEE Transactions on Electron Devices, vol. 41, No. 3, Mar. 1994. cited by applicant .
Metzler, "Computer Vision Applied to Vehicle Operation", Paper from Society of Automotive Engineers, Inc., 1988. cited by applicant .
Mikic et al., "Moving shadow and object detection in traffic scenes", Proceeding of IEEE International Conference on Pattern Recognition, vol. 1, 2000. cited by applicant .
Miller, "Evaluation of vision systems for teleoperated land vehicles," IEEE Control Systems Magazine, Jun. 28, 1988. cited by applicant .
Mimuro et al., "Functions and Devices of Mitsubishi Active Safety ASV" Proceedings of the 1996 IEEE Intelligent Vehicles Symposium, Sep. 19-20, 1996, Abstract. cited by applicant .
Mironer et al., "Examination of Single Vehicle Roadway Departure Crashes and Potential IVHS Countermeasures," U.S. Department of Transportation, Aug. 1994. cited by applicant .
Miura et al., "Towards Vision-Based Intelligent Navigator: Its Concept and Prototype", IEEE Transactions on Intelligent Transportation Systems, Jun. 2002. cited by applicant .
Miura et al., "Towards intelligent navigator that can provide timely advice on safe and efficient driving" Intelligent Transportation Systems Proceedings, Oct. 5-8, 1999, pp. 981-986. cited by applicant .
Mobileye N.V. Introduces EyeQ™ Vision System-On-A-Chip High Performance, Low Cost Breakthrough for Driver Assistance Systems, Detroit, Michigan, Mar. 8, 2004. cited by applicant .
Moini, "Vision Chips or Seeing Silicon," Third Revision, Mar. 1997. cited by applicant .
Moravec, "Obstacle Avoidance and Navigation in the Real World by a Seeing Robot Rover", Computer Science Department, Stanford University, Ph.D. Thesis, Sep. 1980. cited by applicant .
Morgan et al., "Road edge tracking for robot road following: a real-time implementation," vol. 8, No. 3, Aug. 1990. cited by applicant .
Mori et al., "Shadow and Rhythm as Sign patterns of Obstacle Detection", Industrial Electronics, 1993. Conference Proceedings, ISIE'93-Budapest, IEEE International Symposium on. IEEE, 1993, Abstract. cited by applicant .
Morris, "E-Z-Pass and transmit using electronic toll tags for traffic monitoring" National Traffic Data Acquisition Conference, PDF pp. 54-63, 1996, 289- 298, Abstract. cited by applicant .
Muirhead, "Developments in CMOS Camera Technology," The Institution of Electrical Engineers, Dec. 5, 1994. cited by applicant .
Nadimi et al., "Physical models for moving shadow and object detection in video", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, No. 8, Aug. 2004. cited by applicant .
Najm, "Comparison of alternative crash-avoidance sensor technologies", Jan. 6, 1995, Abstract. cited by applicant .
Nashman et al., "Real-time Visual Processing for Autonomous Driving," in Proceedings of the IEEE Intelligent Vehicles, vol. 93, Jun. 1993, pp. 14-16. cited by applicant .
Nathan, "Digital Video Data Handling," NASA JPL Tech Report 32-877, Pasadena, CA, Jan. 5, 1966. cited by applicant .
Navon, "SoC IP Qualification & Emulation Environment", Dec. 8-9, 2004. cited by applicant .
Nguyen et al., "Obstacle detection using bi-spectrum CCD camera and image processing", Proceedings of the Intelligent Vehicles '92 Symposium, Jun. 29-Jul. 1, 1992, p. 42-50. cited by applicant .
Nixon et al., "128X128 CMOS Photodiode-Type Active Pixel Sensor With On-Chip Timing, Control and Signal Chain Electronics" 1995. cited by applicant .
Nixon et al., "256 .times. 256 CMOS Active Pixel Sensor Camera-on-a-Chip," IEEE Journal of Solid-State Circuits, vol. 31, No. 12, Paper FA 11.1, 1996. cited by applicant .
Nolan, "Survey of Electronic Displays", SAE Paper No. 750364, published Feb. 1, 1975. cited by applicant .
Oldenburg, "Comments on the Autronic Eye", 2002. cited by applicant .
Ortega et al., "An Interactive, Reconfigurable Display System for Automotive Instrumentation", SAE Paper No. 860173, published Mar. 1, 1986. cited by applicant .
Otsuka, "Flat Dot Matrix Display Module for Vehicle Instrumentation", SAE Paper No. 871288, published Nov. 8, 1987. cited by applicant .
Pacaud et al., "Ground Speed Sensing," Lucas International Symposium, Paris, France 1989. cited by applicant .
Paetzold, "Interpretation of visually sensed urban environment for a self-driving car" Ruhr-Universitat Bochum, Dissertation, Sep. 2000. cited by applicant .
Page et al., "Advanced technologies for collision avoidance," Eureka on Campus (Summer 1992). cited by applicant .
Paradiso et al., "Wide-Range Precision Alignment for the Gem Muon System," Oct. 1993. cited by applicant .
Paradiso, "Application of miniature cameras in video straightness monitor systems", Draper Laboratory, Jun. 1994. cited by applicant .
Paradiso, "Electronics for precision alignment of the Gem Muon System", Proceedings of the 1994 LeCroy Electronics for Future Colliders Conference, May 1994. cited by applicant .
Parent, "Automatic Driving for Small Public Urban Vehicles," Intelligent Vehicles Symposium, Tokyo, Jul. 14-16, 1993. cited by applicant .
Parker (ed.), McGraw-Hill Dictionary of Scientific and Technical Terms Fifth Edition. (1993). cited by applicant .
Parnell, "Reconfigurable Vehicle". No. 2002-01-0144. SAE Technical Paper, 2002. Xilinx WPI 53, Nov. 19, 2001. cited by applicant .
Pelco Fixed Focal Length Lenses Product Specification, Apr. 1996. cited by applicant .
Peng et al., "Experimental Automatic Lateral Control System for an Automobile," California Partners for Advanced Transit and Highways (PATH), Jan. 1, 1992. cited by applicant .
Peng, "Vehicle Lateral Control for Highway Automation," Ph.D. Thesis--University of California Berkeley, 1992. cited by applicant .
Philips Components, PCA82C200, Stand-alone CAN-controller, Jan. 22, 1991. cited by applicant .
Philomin et al., "Pedestrian Tracking from a Moving Vehicle", Proceedings of the IEEE, Intelligent Vehicles Symposium, IV, 2000. cited by applicant .
Piccioli et al., "Robust road sign detection and recognition from image sequences", 1994. cited by applicant .
Pollard, "Evaluation of the Vehicle Radar Safety Systems' Rashid Radar Safety Brake Collision Warning System", U.S. Dept. of Transportation, National Highway Traffic Safety Administration, Feb. 29, 1988. cited by applicant .
Pomerleau, "Alvinn: An Autonomous Land Vehicle in a Neural Network", Technical Report AIP-77 Department of Psychology, Carnegie Mellon University, Mar. 13, 1990. cited by applicant .
Pomerleau, "RALPH: Rapidly Adapting Lateral Position Handler", The Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, pp. 506-511., 1995. cited by applicant .
Pomerleau et al., "Run-Off-Road Collision Avoidance Countermeasures Using IVHS Countermeasures TASK 3-vol. 1", U.S. Dept. of Transportation, National Highway Traffic Safety Administration, Final Report, Aug. 23, 1995. cited by applicant .
Pomerleau et al., "Rapidly Adapting Machine Vision for Automated Vehicle Steering", pp. 19-27, Apr. 30, 1996. cited by applicant .
Pomerleau, "Run-Off-Road Collision Avoidance Using Ivhs Countermeasures", Robotics Institute, Task 6 Interim Report, Sep. 10, 1996. cited by applicant .
Porter et al., "Compositing Digital Images," Computer Graphics (Proc. Siggraph), vol. 18, No. 3, pp. 253-259, Jul. 1984. cited by applicant .
Prasad, "Performance of Selected Event Data Recorders", National Highway Traffic Safety Administration. Washington, DC, Sep. 2001. cited by applicant .
Prati et al., "Detecting moving shadows: algorithms and evaluation", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 25, Jul. 1, 2003. cited by applicant .
Pratt, "Digital Image Processing, Passage--ED.3", John Wiley & Sons, US, Jan. 1, 2001, pp. 657-659, XP002529771. cited by applicant .
Priese et al., "New Results on Traffic Sign Recognition", IEEE Proceedings of the Intelligent Vehicles 1994 Symposium. cited by applicant .
Priese et al., "Traffic Sign Recognition Based on Color Image", Universitat Koblenz-Landau, 1993, pp. 95-100. cited by applicant .
Proceedings of the 1992 International Conference on Industrial Electronics, Control, Instrumentation, and Automation, 1992. Power Electronics and Motion Control, Date of Conference Nov. 9-13, 1992. cited by applicant .
Proceedings of the Intelligent Vehicles Symposium, 1992-present. cited by applicant .
Proceedings of the Intelligent Vehicles Symposium, Tokyo, Jul. 14-16, 1993. cited by applicant .
Pynn et al., "Automatic identification of cracks in road surfaces" 7th International Conference on Image Processing and its Application, CP465, Jan. 1999, pp. 671-675, Abstract. cited by applicant .
Raboisson et al., "Obstacle Detection in Highway Environment by Colour CCD Camera and Image Processing Prototype Installed in a Vehicle", Proceedings of the IEEE Intelligent Vehicles Symposium 1994. cited by applicant .
Radatz, "The IEEE Standard Dictionary of Electrical and Electronics Terms," Sixth Edition, Standards Coordinating Committee 10, Terms and Definitions, 1996. cited by applicant .
Raglan Tribe Video-1994; 1994; Raglan Tribe; "Robot Car Raglan Tribe" http://www.youtube.com/watch?v=AILZhcnpXYI. cited by applicant .
Ramesh et al., "Real-Time Video Surveillance and Monitoring for Automotive Applications", SAE Technical Paper 2000-01-0347, Mar. 6, 2000, Abstract. cited by applicant .
Ran et al., "Development of Vision-based Vehicle Detection and Recognition System for Intelligent Vehicles", Department of Civil and Environmental Engineering, University of Wisconsin at Madison, 1999 TRB Annual Meeting, Nov. 16, 1998. cited by applicant .
Raphael et al., "Development of a Camera-Based Forward Collision Alert System", SAE International, Apr. 12, 2011. cited by applicant .
Rayner et al., "I-Witness Black Box Recorder" Intelligent Transportation Systems Program, Final Report for ITS-IDEA Project 84, Nov. 2001. cited by applicant .
Redmill, "The OSU Autonomous Vehicle", 1997. cited by applicant .
Regensburger et al., "Visual Recognition of Obstacles on Roads", Intelligent Robots and Systems, Elsevier, 1994. cited by applicant .
Reichardt, "Kontinuierliche Verhaltenssteuerung eines autonomen Fahrzeugs in dynamischer Umgebung" Universitat Kaisserslautern Dissertation, Transation: Continuous behavior control of an autonomous vehicle in a dynamic environment, Jan. 1996. cited by applicant .
Reid, "Vision-based guidance of an agriculture tractor", IEEE Control Systems Magazine, Apr. 30, 1987, Abstract. cited by applicant .
Reisman et al., "Crowd Detection in Video Sequences", IEEE, Intelligent Vehicles Symposium, Jan. 1, 2004. cited by applicant .
Ritter et al., "Traffic sign recognition using colour information", Math, Computing, Modelling, vol. 22, No. 4-7, pp. 149-161, Oct. 1995. cited by applicant .
Ritter, "Traffic Sign Recognition in Color Image Sequences", Institute for Information Technology, 1992, pp. 12-17. cited by applicant .
Roberts, "Attentive Visual Tracking and Trajectory Estimation for Dynamic Scene Segmentation", University of Southampton, PhD submission, Dec. 1994. cited by applicant .
Rombaut et al., "Dynamic data temporal multisensory fusion in the Prometheus ProLab2 demonstrator", IEEE Paper, 1994. cited by applicant .
Ross, "A Practical Stereo Vision System", The Robotics Institute, Carnegie Mellon University, Aug. 25, 1993. cited by applicant .
Rowell, "Applying Map Databases to Advanced Navigation and Driver Assistance Systems", The Journal of Navigation 54.03 (2001): 355-363. cited by applicant .
Sahli et al., "A Kalman Filter-Based Update Scheme for Road Following," IAPR Workshop on Machine Vision Applications, pp. 5-9, Nov. 12-14, 1996. cited by applicant .
Salvador et al., "Cast shadow segmentation using invariant color features", Computer Vision and Image Understanding, vol. 95, 2004. cited by applicant .
Sanders, "Speed Racers: Study to monitor driver behavior to determine the role of speed in crashes", Georgia Research Tech News, Aug. 2002. cited by applicant .
Sayer et al., "The Effect of Lead-Vehicle Size on Driver Following Behavior", University of Michigan Transportation Research Institute, 2000-15, Jun. 2000. cited by applicant .
Schneiderman et al., "Visual Processing for Autonomous Driving," IEEE Workshop on Applications of Computer Vision, Palm Springs, CA, Nov. 30-Dec. 2, 1992. cited by applicant .
Schonfeld et al., "Compact Hardware Realization for Hough Based Extraction of Line Segments in Image Sequences for Vehicle Guidance", IEEE Paper, 1993, Abstract. cited by applicant .
Schumann et al., "An Exploratory Simulator Study on the Use of Active Control Devices in Car Driving," No. IZF-1992-B-2. Institute for Perception RVO-TNO Soesterber (Netherlands), May 1992. cited by applicant .
Schwarzinger et al., "Vision-based car-following: detection, tracking, and identification", Jul. 1, 1992. cited by applicant .
Scott, "Video Image on a Chip", Popular Science, vol. 237, No. 3, Sep. 1991, pp. 50. cited by applicant .
Seelen et al., "Image Processing for Driver Assistance", 1998. cited by applicant .
Seger et al., "Vision Assistance in Scenes with Extreme Contrast," IEEE Micro, vol. 13, No. 1, Feb. 1993. cited by applicant .
Shafer, "Automation and Calibration for Robot Vision Systems", National Science Foundation, Carnegie Mellon University Research Showcase, May 12, 1988. cited by applicant .
Shashua et al., "Two-body Segmentation from Two Perspective Views", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Hawaii, pp. 263-270, Dec. 2001, Abstract. cited by applicant .
Shashua et al., "Direct Estimation of Motion and Extended Scene Structure from a Moving Stereo Rig", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 1998, pp. 211-218. cited by applicant .
Shashua et al., "Illumination and View Position in 3D Visual Recognition", Advances in Neural Information Processing Systems, Morgan Kauffman Publishers, CA 1992 (Proc. of NIPS '91), pp. 404-411. cited by applicant .
Shashua et al., "Image-Based View Synthesis by Combining Trilinear Tensors and Learning Techniques", ACM Conference on Virtual Reality and Systems (VRST), Sep. 1997, pp. 140-145. cited by applicant .
Shashua et al., "Novel View Synthesis by Cascading Trilinear Tensors", IEEE Transactions on Visualization and Computer Graphics. vol. 4, No. 4, Oct.-Dec. 1998. cited by applicant .
Shashua et al., "On Degeneracy of Linear Reconstruction from Three Views: Linear Line Complex and Applications", IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI), vol. 21 (3), 1999, pp. 244-251. cited by applicant .
Shashua et al., "3D Reconstruction from Tangent-of-Sight ", European Conference on Computer Vision (ECCV), Jun. 2000, Dublin, Ireland, pp. 220-234. cited by applicant .
Shashua et al., "A Geometric Invariant for Visual Recognition and 3D Reconstruction From Two Perspective/Orthographic Views", Proceedings of the IEEE 2nd Qualitative Vision Workshop, Jun. 1993, New York, NY, pp. 107-117. cited by applicant .
Shashua et al., "A Parallel Decomposition Solver for SVM: Distributed Dual Ascend using Fenchel Duality", Conf. on Computer Vision and Pattern Recognition (CVPR), Jun. 2008, Anchorage, Alaska. cited by applicant .
Shashua et al., "A Unifying Approach to Hard and Probabilistic Clustering", International Conference on Computer Vision (ICCV), Beijing, China, Oct. 2005. cited by applicant .
Shashua et al., "Affine 3-D Reconstruction from Two Projective Images of Independently Translating Planes", International Conference on Computer Vision (ICCV), Jul. 2001, Vancouver, Canada, pp. 238-244. cited by applicant .
Shashua et al., "Algebraic Functions for Recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI) vol. 17(8), Jan. 1994 pp. 779-789. cited by applicant .
Shashua et al., "Ambiguity from Reconstruction From Images of Six Points", International Conference on Computer Vision (ICCV), Jan. 1998, Bombay India, pp. 703-708. cited by applicant .
Shashua et al., "Convergent Message-Passing Algorithms for reference over General Graphs with Convex Free Energies", Conf. on Uncertainty in AI (UAI), Helsinki, Jul. 2008. cited by applicant .
Shashua et al., "Doubly Stochastic Normalization for Spectral Clustering", Advances in Neural Information Processing Systems (NIPS), Vancouver, Canada, Dec. 2006. cited by applicant .
Shashua et al., "Duality of multi-point and multi-frame geometry: Fundamental shape matrices and tensors", European Conference on Computer Vision (ECCV), Apr. 1996, Cambridge United Kingdom, pp. 217-227. cited by applicant .
Shashua et al., "Dynamic P.sup.n to P.sup.n Alignment", In Handbook of Computational Geometry for Pattern Recognition, Computer Vision. Neuro computing and Robotics. Eduardo Bayro-Corrochano (eds.), Springer-Verlag, 2004. cited by applicant .
Shashua et al., "Feature Selection for Unsupervised and Supervised Inference: the Emergence of Sparsity in a Weight-based Approach", Journal of Machine Learning Research (JMLR), 6(11):1885-1887, 2005, pp. 1885-1887. cited by applicant .
Shashua et al., "Grouping Contours by Iterated Pairing Network", Advances in Neural Information Processing Systems 3, (Proc. of NIPS '90), Morgan Kaufmann Publishers, CA, 1991, pp. 335-341. cited by applicant .
Shashua et al., "Nomography Tensors: On Algebraic Entities That Represent Three Views of Static or Moving Planar Points", European Conference on Computer Vision (ECCV), Jun. 2000, Dublin, Ireland, pp. 163-177. cited by applicant .
Shashua et al., "Join Tensors: on 3D-to-3D Alignment of Dynamic Sets", International Conference on Pattern Recognition (ICPR), Jan. 2000, Barcelona, Spain, pp. 99-102. cited by applicant .
Shashua et al., "Kernel Feature Selection with Side Data using a Spectral Approach", Proc. of the European Conference on Computer Vision (ECCV), May 2004, Prague, Czech Republic. cited by applicant .
Shashua et al., "Kernel Principal Angles for Classification Machines with Applications to Image Sequence Interpretation", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 2003, Madison. cited by applicant .
Shashua et al., "Latent Model Clustering and Applications to Visual Recognition", International Conference on Computer Vision (ICCV), Rio, Brazil, Oct. 2007. cited by applicant .
Shashua et al., "Learning over Sets using Kernel Principal Angles", Journal of Machine Learning Research, 2003, pp. 913-931. cited by applicant .
Shashua et al., "Linear Image Coding for Regression and Classification using the Tensor-rank Principle", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Dec. 2001, Hawaii, pp. 42-49, Abstract. cited by applicant .
Shashua et al., "Manifold Pursuit: A New Approach to Appearance Based Recognition", International Conference on Pattern Recognition (ICPR), Aug. 2002, Quebec, Canada. cited by applicant .
Shashua et al., "Multi-frame Infinitesimal Motion Model for the Reconstruction of (Dynamic) Scenes with Multiple Linearly Moving Objects", International Conference on Computer Vision (ICCV), Jul. 2001 ,, Vancouver, Canada, pp. 592-599. cited by applicant .
Shashua et al., "Multiple View Geometry of Non-planar Algebraic Curves", International Conference on Computer Vision (ICCV), Vancouver, Canada, Jul. 2001, pp. 181-186. cited by applicant .
Shashua et al., "Structural Saliency: the Detection of Globally Salient Structures Using a Locally Connected Network", International Conference on Computer Vision (ICCV), Tarpon Springs, Florida, pp. 321-327, Jul. 1988. cited by applicant .
Shashua et al., "The Study of 3D-from-2D using Elimination", International Conference on Computer Vision (ICCV), Jun. 1995, Boston, MA, pp. 473-479. cited by applicant .
Shashua et al., "Multiple-view Geometry and Photometry, In Recent Progress in Computer Vision", Springer-Verlag, LNCS series, Invited papers of ACCV'95, Singapore Dec. 1995, 225-240, Abstract. cited by applicant .
Shashua et al., "Multiple-view geometry of general algebraic curves", International Journal of Computer Vision (IJCV), 2004. cited by applicant .
Shashua et al., "Multi-way Clustering Using Super-symmetric Non-negative Tensor Factorization", Proc. of the European Conference on Computer Vision (ECCV), Graz, Austria, May 2006. cited by applicant .
Shashua et al., "Nonnegative Sparse PCA", Advances in Neural Information Processing Systems (NIPS), Vancouver, Canada, Dec. 2006. cited by applicant .
Shashua et al., "Non-Negative Tensor Factorization with Applications to Statistics and Computer Vision", International Conference on Machine Learning (ICML), Bonn, Germany, Aug. 2005. cited by applicant .
Shashua et al., "Norm-Product Belief Propagation: Primal-Dual Message-Passing for Approximate Inference", IEEE Trans. on Information Theory, Jun. 28, 2010. cited by applicant .
Shashua et al., "Novel View Synthesis in Tensor Space", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 1997, pp. 1034-1040. cited by applicant .
Shashua et al., "Off-road Path Following using Region Classification and Geometric Projection Constraints", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 2006, NY. cited by applicant .
Shashua et al., "Omni-Rig Sensors: What Can be Done With a Non-Rigid Vision Platform?", Workshop on Applications of Computer Vision (W ACV), pp. 174-179, Princeton, Oct. 1998, pp. 174-179. cited by applicant .
Shashua et al., "Omni-rig: Linear Self-recalibration of a Rig with Varying Internal and External Parameters," International Conference on Computer Vision (ICCV), Jul. 2001, Vancouver, Canada, pp. 135-141. cited by applicant .
Shashua et al., "On calibration and reconstruction from planar curves", European Conference on Computer Vision (ECCV), pp. 256-270, Jun. 2000, Dublin, Ireland, pp. 256-270. cited by applicant .
Shashua et al., "On Geometric and Algebraic Aspects of 3D Affine and Projective Structures from Perspective 2D Views", In Applications of Invariance in Computer Vision, Springer-Verlag LNCS No. 825, 1994, 127-143. cited by applicant .
Shashua et al., "On Photometric Issues in 3D Visual Recognition from a Single 2D Image", International Journal of Computer Vision (IJCV), 21(1/2), 1997 pp. 99-122. cited by applicant .
Shashua et al., "On Projection Matrices P.sup.k -P.sup.2, k=3, 6, and their Applications in Computer Vision", International Journal of Computer Vision (IJCV), 2002, pp. 53-67. cited by applicant .
Shashua et al., "On the Reprojection of 3D and 2D Scenes Without Explicit Model Selection", European Conference on Computer Vision (ECCV), Jun. 2000, Dublin, Ireland, pp. 468-482. cited by applicant .
Shashua et al., "On the Structure and Properties of the Quadrifocal Tensor", European Conference on Computer Vision (ECCV), Jun. 2000, Dublin, Ireland, pp. 354-368. cited by applicant .
Shashua et al., "On the Synthesis of Dynamic Scenes from Reference Views", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 2000, pp. 133-139. cited by applicant .
Shashua et al., "pLSA for Sparse Arrays With Tsallis Pseudo-Additive, Divergence: Noise Robustness and Algorithm", International Conference on Computer Vision (ICCV), Rio, Brazil, Oct. 2007. cited by applicant .
Shashua et al., "Principal Component Analysis Over Continuous Subspaces and Intersection of Half-spaces", European Conference on Computer Vision (ECCV), May 2002, Copenhagen, Denmark, pp. 133-147. cited by applicant .
Shashua et al., "Probabilistic Graph and Hypergraph Matching", Conf. on Computer Vision and Pattern Recognition (CVPR), Jun. 2008, Anchorage, Alaska. cited by applicant .
Shashua et al., "Projective Structure from Uncalibrated Images: Structure from Motion and Recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence (P AMI), (vol. 16(8), 1994, pp. 778-790. cited by applicant .
Shashua et al., "Q-warping: Direct Computation of Quadratic Reference Surfaces", IEEE Transactions on Pattern Analysis and Machine Intelligence (P AMI), vol. 23(8), 2001, pp. 920-925. cited by applicant .
Shashua et al., "Relative Affine Structure: Canonical Model for 3D from 2D Geometry and Applications," IEEE, Transactions on Pattern Analysis and Machine Intelligence (P AMI) vol. 18(9), pp. 873-883, Jun. 1994. cited by applicant .
Shashua et al., "Relative Affine Structure: Theory and Application for 3D Reconstruction From Perspective Views," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, Washington, pp. 483-489, Jun. 1994. cited by applicant .
Shashua et al., "Revisiting Single-view Shape Tensors: Theory and Applications," EP Conference on Computer Vision (ECCV), Copenhagen, DK, pp. 256-270, May 2002. cited by applicant .
Shashua et al., "Robust Recovery of Camera Rotation from Three Frames," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), San Francisco, CA, pp. 796-802, Jun. 1996. cited by applicant .
Shashua et al., "Shape Tensors for Efficient and Learnable Indexing", Proceedings of the workshop on Scene Representations, Jun. 1995, Cambridge, MA, pp. 58-65. cited by applicant .
Shashua et al., "ShareBoost: Efficient Multiclass Learning with Feature Sharing, Neural Information and Processing Systems (NIPS)", Dec. 2011. cited by applicant .
Shashua et al., "Sparse Image Coding using a 3D Non-negative Tensor Factorization", International Conference on Computer Vision (ICCV), Beijing, China, Oct. 2005. cited by applicant .
Shashua et al., "Taxonomy of Large Margin Principle Algorithms for Ordinal Regression Problems", Advances in Neural Information Processing Systems (NIPS) Vancouver, Canada, Dec. 2002. cited by applicant .
Shashua et al., "Tensor Embedding of the Fundamental Matrix", Kluwer Academic Publishers, Boston, MA, 1998. cited by applicant .
Shashua et al., "The Quadric Reference Surface: Applications in Registering Views of Complex 3D Objects", European Conference on Computer Vision (ECCV), May 1994, Stockholm, Sweden, pp. 407-416. cited by applicant .
Shashua et al., "The Quadric Reference Surface: Theory and Applications", 1994. cited by applicant .
Shashua et al., "The Rank 4 Constraint in Multiple (.gtoreq.3) View Geometry", European Conference on Computer Vision (ECCV), Apr. 1996, Cambridge, United Kingdom, pp. 196-206. cited by applicant .
Shashua et al., "The Semi-Explicit Shape Model for Multi-object Detection and Classification", Proc. of the European Conference on Computer Vision (ECCV), Crete, Greece, pp. 336-349, Sep. 2010. cited by applicant .
Shashua et al., "Threading Fundamental Matrices", IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI), vol. 23(1), Jan. 2001, pp. 73-77. cited by applicant .
Shashua et al., "Threading Kernel functions: on Bridging the Gap between Representations and Algorithms", Advances in Neural Information Processing Systems (NIPS), Vancouver, Canada, Dec. 2004. cited by applicant .
Shashua et al., "Time-varying Shape Tensors for Scenes with Multiply Moving Points", IEEE Conference on Computer Vision and Pattern, pp. 623-630, Dec. 2001, Hawaii. cited by applicant .
Shashua et al., "Trajectory Triangulation over Conic Sections", International Conference on Computer Vision (ICCV), Greece, 1999, pp. 330-337. cited by applicant .
Shashua et al., "Trajectory Triangulation: 3D Reconstruction of Moving Points from a Monocular Image Sequence", IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI), vol. 22(4), 2000, pp. 348-357. cited by applicant .
Shashua et al., "Trilinear Tensor: The Fundamental Construct of Multiple-view Geometry and its Applications", International Workshop on Algebraic Frames for the Perception Action Cycle (AFPAC97), Kiel Germany, Sep. 8-9, 1997. Proceedings appeared in Springer-Verlag, LNCS series, 1997, 190-206. cited by applicant .
Shashua et al., "Trilinearity in Visual Recognition by Alignment", European Conference on Computer Vision (ECCV), May 1994, Stockholm, Sweden, pp. 479-484. cited by applicant .
Shashua et al., "Projective Depth: A Geometric Invariant for 3D Reconstruction From Two Perspective/Orthographic Views and for Visual Recognition," International Conference on Computer Vision (ICCV), May 1993, Berlin, Germany, pp. 583-590. cited by applicant .
Shashua et al., "The Quotient Image: Class Based Recognition and Synthesis Under Varying Illumination Conditions", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 1999, pp. 566-573. cited by applicant .
Shashua et al., "The Quotient Image: Class Based Re-rendering and Recognition With Varying Illuminations", IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI), vol. 23(2), 2001, pp. 129-139. cited by applicant .
Shashua et al., "Pedestrian Detection for Driving Assistance, Systems: Single-Frame Classification and System Level, Performance", IEEE Intelligent Vehicles Symposium, Jan. 1, 2004. cited by applicant .
Shashua, "On the Relationship Between the Support Vector Machine for classification and Sparsified Fisher's Linear Discriminant," Neural Processing Letters, 1999, 9(2): 129-139. cited by applicant .
Shimizu et al., "A moving image processing system for personal vehicle system", Nov. 9, 1992, Abstract. cited by applicant .
Shirai, "Robot Vision", Future Generation Computer Systems, 1985. cited by applicant .
Shladover et al., "Automatic Vehicle Control Developments in the PATH Program," IEEE Transaction on Vehicular Technology, vol. 40, No. 1, Feb. 1991, pp. 114-130. cited by applicant .
Shladover, "Research and Development Needs for Advanced Vehicle Control Systems," Micro, IEEE, vol. 13, No. 1, Feb. 1993, pp. 11-19. cited by applicant .
Shladover, "Highway Electrification and Automation," California Partners for Advanced Transit and Highways (PATH), Jan. 1, 1992. cited by applicant .
Siala et al., "Moving shadow detection with support vector domain description in the color ratios space", Proceeding of IEEE International Conference on Pattern Recognition. vol. 4, 2004. cited by applicant .
Siegle, "Autonomous Driving on a Road Network," Proceedings of the Intelligent Vehicles '92 Symposium Detroit, Michigan, ISBN 0-7803-0747-X; Jun. 29-Jul 1, 1992. cited by applicant .
Smith et al., "An Automotive Instrument Panel Employing Liquid Crystal Displays", SAE Paper No. 770274, published Feb. 1, 1977. cited by applicant .
Smith et al., "Optical sensors for automotive applications", May 11, 1992. cited by applicant .
Smith et al., "Vision sensing for intelligent vehicle and highway systems", Proceedings of the 1994 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, Las Vegas, NV, Oct. 5, 1994. cited by applicant .
Soatto et al., "The Golem Group/University of California at Los Angeles Autonomous Ground Vehicle in the DARPA Grand Challenge", Journal of Field Robotics 23(8), 2006, pp. 527-553. cited by applicant .
Solder et al., "Visual Detection of Distant Objects", Intelligent Robots and Systems' 93, IROS'93. Proceedings of the 1993 IEEE/RSJ International Conference on. vol. 2. IEEE, 1993, Abstract. cited by applicant .
Sole et al., "Solid or not solid: vision for radar target validation", IEEE Intelligent Vehicles Symposium, 2004. cited by applicant .
Sony Operating Manual CCD Color Video Camera Model: DXC-151A, 1993. cited by applicant .
Sparks et al., "Multi-Sensor Modules with Data Bus Communication Capability" SAE Technical Paper 1999-01-1277, Mar. 1, 1999, doi: 10.4271/1999-01-1277, http://papers.sae.org/1999-01-1277/, Abstract. cited by applicant .
Sridhar, "Multirate and event-driven Kalman filters for helicopter flight", IEEE Control Systems, Aug. 15, 1993. cited by applicant .
Standard J2284/3, "High-Speed CAN (HSC) for Vehicle Applications at 500 Kbps," issued May 30, 2001. cited by applicant .
Stauder et al., "Detection of moving cast shadows for object segmentation", IEEE Transactions on Multimedia, vol. 1, No. 1, Mar. 1999. cited by applicant .
Stein et al., "A Computer Vision System on a Chip: a case study from the automotive domain", IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2005. cited by applicant .
Stein et al., "Challenges and solutions for Bundling Multiple DAS Applications on a Single Hardware Platform", Procs. Vision 2008. cited by applicant .
Stein et al., "Direct Methods for Estimation of Structure and Motion from three views", A.I. Memo No. 1594, Ma Inst. of Tech., Nov. 1996. cited by applicant .
Stein et al., "Internal Camera Calibration using Rotation and Geometric Shapes", Submitted to the Dept. of Electrical Engineering and Computer Science at MA Inst. of Tech., Masters Thesis, M.I.T., Feb. 1993. cited by applicant .
Stein et al., "Model-based brightness constraints: on direct estimation of structure and motion," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, Issue 9, Sep. 2000. cited by applicant .
Stein et al., "Stereo-assist: Top-down stereo for driver assistance systems", IEEE Intelligent Vehicles Symposium, 2010. cited by applicant .
Stein et al., "Vision-based ACC with a single camera: bounds on range and range rate accuracy", IEEE Intelligent Vehicles Symposium, 2003. cited by applicant .
Stein et al., "A robust method for computing vehicle ego-motion", Proceedings of the IEEE Intelligent Vehicles Symposium, 2000. cited by applicant .
Stein, "Accurate Internal Camera Calibration using Rotation, with Analysis of Sources of Error", Computer Vision, Proceedings Fifth International Conference on. IEEE, 1995. cited by applicant .
Stein, "Geometric and photometric constraints: motion and structure from three views", Mass. Inst. of Tech., Doctoral Dissertation, 1998. cited by applicant .
Stein, "Lens Distortion Calibration Using Point Correspondences", A.I. Memo No. 1595, M.I.T. Artificial Intelligence Laboratory, Nov. 1996. cited by applicant .
Stein, "Tracking from multiple view points: Self-calibration of space and time", IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Jun. 1999. cited by applicant .
Stein et al., "Monitoring Activities from Multiple Video Streams: Establishing a Common Coordinate Frame," A.I. Memo No. 1655, M.I.T. Artificial Intelligence Laboratory, Apr. 1999. cited by applicant .
Steiner et al., "Future applications or microsystem technologies in automotive safety systems" Advanced Microsystems for Automotive Applications '98, 1998, pp. 21-42. cited by applicant .
Stengel et al., "Intelligent Guidance for Headway and Lane Control", Princeton University, Department of Mechanical and Aerospace Engineering, New Jersey, 1989. cited by applicant .
Stickford, "Candid cameras come to Park", Grosse Pointe News, Mar. 7, 1996. cited by applicant .
Stiller et al., "Multisensor obstacle detection and tracking", Image and Vision Computing 18, Elsevier, 2000, pp. 389-396. cited by applicant .
Sukthankar, "RACCOON: A Real-time Autonomous Car Chaser Operating Optimally at Night", Oct. 1992. cited by applicant .
Sun et al., "On-road vehicle detection using optical sensors: a review", 2004. cited by applicant .
Sun et al., "A Real-time Precrash Vehicle Detection System", 2002. cited by applicant .
Szeliski, "Image Mosaicing for Tele-Reality Applications", DEC Cambridge Research Laboratory, CRL 94/2, May 1994. cited by applicant .
Taktak et al., "Vehicle detection at night using image processing and pattern recognition", Centre de Recherche en Automatique de Nancy, 1994. cited by applicant .
Taylor, "CCD and CMOS Imaging Array Technologies: Technology Review," Xerox Research Centre Europe, Technical Report EPC-1998-106, 1998. cited by applicant .
Thomanek et al., "Multiple object recognition and scene interpretation for autonomous road vehicle guidance", Oct. 1994. cited by applicant .
Thomas, "Real-time vision guided navigation", Engineering Applications of Artificial Intelligence, Jan. 31, 1991, Abstract. cited by applicant .
Thongkamwitoon et al., "An adaptive real-time background subtraction and moving shadows detection", Proceeding of IEEE International Conference on Multimedia and Expo. vol. 2, 2004. cited by applicant .
Thorpe et al., "Perception for Outdoor Navigation First Year Report", Defense Advanced Research Projects Agency, Carnegie Mellong University, Dec. 31, 1990. cited by applicant .
Thorpe, "Vision and Navigation for the Carnegie-Mellon Navlab", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 10, No. 3, May 1998. cited by applicant .
Thorpe, "1988 Year End Report for Road Following at Carnegie Mellon", Carnegie Mellon University, May 31, 1989. cited by applicant .
Thorpe et al., "Toward autonomous driving: the CMU Navlab. I. Perception", IEEE Paper, Aug. 1991. cited by applicant .
Thorpe et al., "The 1997 Automated Highway Free Agent Demonstration", 1997 pp. 496-501, 1997. cited by applicant .
Tokimaru et al., "CMOS Rear-View TV System with CCD Camera", National Technical Report vol. 34, No. 3, pp. 329-336, Jun. 1988, Japan. cited by applicant .
Toth et al., "Detection of moving shadows using mean shift clustering and a significance test", Proceeding of IEEE International Conference on Pattern Recognition, vol. 4, 2004. cited by applicant .
Toyota Motor Corporation, "Present and future of safety technology development at Toyota." 2004. cited by applicant .
Trainor et al., "Architectural Synthesis of Digital Signal Processing Algorithms Using `IRIS`", Journal of VLSI Signal Processing Systems for Signal, Image and Video Technology, vol. 16, No. 1, 1997. cited by applicant .
Tremblay et al., "High resolution smart image sensor with integrated parallel analog processing for multiresolution edge extraction", Robotics and Autonomous Systems 11, pp. 231-242, with abstract, 1993. cited by applicant .
Tribe et al., "Collision Avoidance," Advances, Issue No. 4, May 1990. cited by applicant .
Tribe et al., "Collision Avoidance," Lucas International Symposium, Paris, France, 1989. cited by applicant .
Tribe et al., "Collision Warning," Autotech '93, Seminar 9, NEC Birmingham, UK, Nov. 1993. cited by applicant .
Tribe, "Intelligent Autonomous Systems for Cars, Advanced Robotics and Intelligent Machines," Peter Peregrinus, Nov. 1994. cited by applicant .
Trivedi et al., "Distributed Video Networks for Incident Detection and Management", Computer Vision and Robotics Research Laboratory, 2000. cited by applicant .
Tsugawa et al., "An automobile with artificial intelligence," in Proc. Sixth IJCAI, 1979. cited by applicant .
Tsugawa et al., "Vision-based vehicles in japan; machine vision systems and driving control systems", IEEE Transactions on Industrial Electronics, vol. 41, No. 4, Aug. 1994. cited by applicant .
Tsutsumi et al., "Vehicle Distance Interval Control Technology" Mitsubishi Electric Advance, Technical Reports, vol. 78, pp. 10-12, Mar. 1997. cited by applicant .
Turk et al., "VITS-A Vision System for Autonomous Land Vehicle Navigation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 10, No. 3, May 3, 1988. cited by applicant .
Ulmer, "VITA II--active collision avoidance in real traffic" Proceedings of the Intelligent Vehicles '94 Symposium, Oct. 24-26, 1994, Abstract. cited by applicant .
Valeo Infos News, "Valeo's revolutionary Lane Departure Warning System makes debut on Nissan Infiniti vehicles", 04.08 found at http://www.valeo.com/cwscontent/www.valeo.com/medias/fichiers/journalistes/en/CP/Idws_uk.pdf, Mar. 31, 2004. cited by applicant .
Van Leeuwen et al., "Motion Estimation with a Mobile Camera for Traffic Applications", IEEE, US, vol. 1, pp. 58-63, Oct. 3, 2000. cited by applicant .
Van Leeuwen et al., "Motion Interpretation for In-Car Vision Systems", IEEE, US, vol. 1,, p. 135-140, Sep. 30, 2002. cited by applicant .
Van Leeuwen et al., "Real-Time Vehicle Tracking in Image Sequences", IEEE, US, vol. 3, pp. 2049-2054, XP010547308, May 21, 2001. cited by applicant .
Van Leeuwen et al., "Requirements for Motion Estimation in Image Sequences for Traffic Applications", IEEE, pp. 354-359, XP002529773, 2000. cited by applicant .
Van Leeuwen et al., "Requirements for Motion Estimation in Image Sequences for Traffic Applications", IEEE, US, vol. 1, pp. 145-150, XP010340272, May 24, 1999. cited by applicant .
Vellacott, "CMOS in Camera," IEE Review, pp. 111-114, May 1994. cited by applicant .
Vlacic et al., "Intelligent Vehicle Technologies, Theory and Applications", Society of Automotive Engineers Inc., edited by SAE International, 2001. cited by applicant .
Vosselman et al., "Road tracing by profile matching and Kalman filtering", Faculty of Geodetic Engineering, 1995. cited by applicant .
Wallace et al., "Progress in Robot Road-Following," Proceedings of the 1986 IEEE International Conference on Robotics and Automation, vol. 3, pp. 1615-1621, 1986. cited by applicant .
Wan et al., "A New Edge Detector for Obstacle Detection with a Linear Stereo Vision System", Proceedings of the Intelligent Vehicles '95 Symposium, Abstract. cited by applicant .
Wang et al., "CMOS Video Cameras", article, 4 pages, University of Edinburgh, UK, 1991. cited by applicant .
Wang et al., "A probabilistic method for foreground and shadow segmentation", Proceeding of IEEE International Conference on Image Processing, Pattern Recognition, vol. 3, Oct. 2, 2003. cited by applicant .
Wang, "Camera Calibration by Vanishing Lines for 3-D Computer Vision", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 13, No. 4, Apr. 15, 1991. cited by applicant .
Webpage: http://parts.royaloakschevy.com/showAssembly.aspx?makeName=pontiac&modelYear=1990&modelName=trans-sport&ukey_assembly=5888560&ukey_category=53643&assembly=921201mu10-009mu10-009. cited by applicant .
Weisser et al., "Autonomous driving on vehicle test tracks: Overview, implementation and vehicle diagnosis" Intelligent Transportation Systems, pp. 62-67, Oct. 5-8, 1999, Abstract. cited by applicant .
Wierwille et al., "Research on Vehicle-Based Driver Status/Performance Monitoring, Part III" Final Report, Sep. 1996. cited by applicant .
Wilson, "Technology: A little camera with big ideas--the latest smart vision system," Financial Times, Jun. 17, 1993. cited by applicant .
Wolberg, "Digital Image Warping", IEEE Computer Society Press, 1990. cited by applicant .
Wolberg, "A Two-Pass Mesh Warping Implementation of Morphing," Dr. Dobb's Journal, No. 202, Jul. 1993. cited by applicant .
Wordenweber, "Driver assistance through lighting." ESV: 17th International Technical Conference on the Enhanced Safety of Vehicles. Report. No. 476. 2001. cited by applicant .
Wright, "Take your hands off that car!", Edn. vol. 42, No. 26, Dec. 18, 1997, Abstract. cited by applicant .
Wuller et al., "The usage of digital cameras as luminance meters", Proc. SPIE 6502, Digital Photography III, 65020U, Feb. 20, 2007; doi:10.1117/12.703205. cited by applicant .
Wyatt et al., "Analog VLSI systems for Image Acquisition and Fast Early Vision Processing", International Journal of Computer Vision, 8:3, pp. 217-223, 1992. cited by applicant .
Xie et al., "Active and Intelligent Sensing of Road Obstacles: Application to the European Eureka-PROMETHEUS Project", Fourth International Conference on Computer Vision, IEEE, 1993, Abstract. cited by applicant .
Xu et al., "3 DOF modular eye for smart car" School of Mechanical & Production Engineering Nanyang Technologies University, Intelligent Transportation Systems, 1999. Proc., Oct. 5-8, 1999, pp. 501-505. cited by applicant .
Xu et al., "Cast shadow detection in video segmentation", Pattern Recognition Letters, vol. 26, Nov. 4, 2003. cited by applicant .
Yadid-Pecht et al., "Wide Intrascene Dynamic Range CMOS APS Using Dual Sampling," IEEE Transactions on Electron Devices, vol. 44, No. 10, Oct. 1997. cited by applicant .
Yamada et al., "Wide Dynamic Range Vision Sensor for Vehicles," 1994 Vehicle Navigation & Information Systems Conference Proceedings, pp. 405-408, 1994. cited by applicant .
Yazigi, "Technology: Promethean Plans for Next Generation of Cars", The New York Times, Sep. 13, 1992. cited by applicant .
Yee, "Portable Camera Mount", Feb. 1986, Abstract. cited by applicant .
Yeh et al., "Image-Based Dynamic Measurement for Vehicle Steering Control", Proceedings of the Intelligent Vehicles '94 Symposium, 1994, Abstract. cited by applicant .
Yerazunis et al. "An inexpensive, all solid-state video and data recorder for accident reconstruction" Mitsubishi Technical Report TR-99-29 (Presented at the 1999 SAE International Congress and Exposition, Detroit, MI, Mar. 3, 1999.), Apr. 24, 1999. cited by applicant .
Yoneyama et al., "Moving cast shadow elimination for robust vehicle extraction based on 2D joint vehicle/shadow models", Proceeding of IEEE International Conference on Advanced Video and Signal Based Surveillance, 2003. cited by applicant .
Yoneyama et al., "Robust vehicle and traffic information extraction for highway surveillance", EURASIP Journal on Applied Signal Processing, pp. 2305-2321, 2005. cited by applicant .
Young et al., "Cantata: Visual Programming Environment for the Khoros System, ACM SIGGRAPH Computer Graphics-Special focus: modular visualization environments (MVEs)", vol. 29, issue 2, Mar. 16, 1995. cited by applicant .
Young et al., "Improved Obstacle Detection by Sensor Fusion", IEEE Colloquium on "Prometheus and DRIVE", Oct. 15, 1992, Abstract. cited by applicant .
Yu et al., "Vehicles Recognition by Video Camera" 1995. cited by applicant .
Yu, "Road tracking, lane segmentation and obstacle recognition by mathematical morphology," Intelligent Vehicles '92 Symposium, Proceedings of the IEEE 1992 Conference, p. 166-172. cited by applicant .
Yuji et al., "Accidents and Near-Misses Analysis by Using Video Drive- Recorders in a Fleet Test", Proceedings of the 17th International Technical Conference on the Enhanced Safety of Vehicles (ESV) Conference, Jun. 4-7, 2001 Amsterdam, The Netherlands, National Highway Traffic Safety Administration, Washington, DC. HS 809 20, Jun. 2001. cited by applicant .
Zheng et al., "An Adaptive System for Traffic Sign Recognition," IEEE Proceedings of the Intelligent Vehicles '94 Symposium, pp. 165-170, Oct. 1994. cited by applicant .
Zidek, "Lane Position Tracking", Aerospace and Electronics Conference, National Proceedings of the IEEE 1994, Abstract. cited by applicant .
Zigman, "Light Filters to Improve Vision", Optometry and Vision Science, vol. 69, No. 4, pp. 325-328, Apr. 15, 1992. cited by applicant.

Primary Examiner: Pham; Toan N
Attorney, Agent or Firm: Honigman Miller Schwartz and Cohn, LLP

Parent Case Text





CROSS-REFERENCE TO RELATED APPLICATIONS



The present application is a continuation of U.S. patent application Ser. No. 15/413,462, filed Jan. 24, 2017, now U.S. Pat. No. 9,834,216, which is a continuation of U.S. patent application Ser. No. 15/155,350, filed May 16, 2016, now U.S. Pat. No. 9,555,803, which is a continuation of U.S. patent application Ser. No. 14/922,640, filed Oct. 26, 2015, which is a continuation of U.S. patent application Ser. No. 14/195,137, filed Mar. 3, 2014, now U.S. Pat. No. 9,171,217, which is a continuation of U.S. patent application Ser. No. 13/651,659, filed Oct. 15, 2012, now U.S. Pat. No. 8,665,079, which is a continuation of U.S. patent application Ser. No. 12/559,856, filed Sep. 15, 2009, now U.S. Pat. No. 8,289,142, which is a divisional application of U.S. patent application Ser. No. 12/329,029, filed Dec. 5, 2008, now U.S. Pat. No. 7,679,498, which is a divisional application of U.S. patent application Ser. No. 11/408,776, filed Apr. 21, 2006, now U.S. Pat. No. 7,463,138, which is a divisional application of U.S. patent application Ser. No. 10/427,051, filed Apr. 30, 2003, now U.S. Pat. No. 7,038,577, which claims priority of U.S. provisional applications, Ser. No. 60/433,700, filed Dec. 16, 2002; and Ser. No. 60/377,524, filed May 3, 2002, which are all hereby incorporated herein by reference in their entireties.

CLAIMS



The embodiments of the invention in which an exclusive property or privilege is claimed are defined as follows:

1. A vehicular control system, said vehicular control system comprising: a plurality of cameras disposed at a vehicle equipped with said vehicular control system; said plurality of cameras at least comprising (i) a forward-viewing camera mounted at a front portion of the equipped vehicle and having a field of view at least forward of the equipped vehicle, said forward-viewing camera operable to capture image data, (ii) a driver side-viewing camera mounted at a driver-side portion of the equipped vehicle and having a field of view at least sideward and rearward of the equipped vehicle at the driver side of the equipped vehicle, said driver side-viewing camera operable to capture image data and (iii) a passenger side-viewing camera mounted at a passenger-side portion of the equipped vehicle and having a field of view at least sideward and rearward of the equipped vehicle at the passenger side of the equipped vehicle, said passenger side-viewing camera operable to capture image data; at least one radar sensor having a field of sensing exterior of the equipped vehicle and operable to sense radar data; a control comprising at least one processor; wherein image data captured by at least said forward-viewing camera, said driver side-viewing camera and said passenger side-viewing camera is provided to said control; wherein radar data sensed by said at least one radar sensor is provided to said control; wherein said control, based at least in part on processing of captured image data, determines curvature of a road being traveled by the equipped vehicle; wherein said control processes captured image data provided thereto via an edge detection algorithm to detect objects present exteriorly of the equipped vehicle and within the exterior field of view of at least one of said forward-viewing camera, said driver side-viewing camera and said passenger side-viewing camera; wherein said control is operable to determine whether a detected edge constitutes a portion of a vehicle; wherein said control processes sensed radar data provided thereto to detect objects present exteriorly of the equipped vehicle and within the exterior field of sensing of said at least one radar sensor; wherein said control receives data relevant to a geographic location of the equipped vehicle, and wherein the geographic location of the equipped vehicle is determined, at least in part, by a global positioning system (GPS); wherein said control, based at least in part on processing of at least one of (i) captured image data and (ii) sensed radar data, detects another vehicle that is present exterior of the equipped vehicle and determines distance from the equipped vehicle to the detected other vehicle that is present exterior of the equipped vehicle; and wherein said control, based at least in part on determination of distance from the equipped vehicle to the detected other vehicle, controls a steering system operable to adjust a steering direction of the equipped vehicle.

2. The vehicular control system of claim 1, wherein said vehicular control system is operable to wirelessly communicate information to a remote receiving device that is remote from the equipped vehicle, and wherein information wirelessly communicated to said remote receiving device relates to data received at said control relevant to the geographic location of the equipped vehicle.

3. The vehicular control system of claim 1, wherein said vehicular control system is operable to wirelessly communicate information to a remote receiving device that is remote from the equipped vehicle, and wherein information wirelessly communicated to said remote receiving device relates to data received at said control relevant to a condition of the equipped vehicle.

4. The vehicular control system of claim 1, wherein said vehicular control system is operable to wirelessly communicate information to a remote receiving device that is remote from the equipped vehicle, and wherein information is wirelessly communicated to said remote receiving device in response to an input from a transmitter associated with said remote receiving device.

5. The vehicular control system of claim 4, wherein information is wirelessly communicated to said remote receiving device via a limited-range wireless communication.

6. The vehicular control system of claim 5, wherein said limited-range wireless communication comprises a Bluetooth limited-range wireless communication.

7. The vehicular control system of claim 1, wherein said vehicular control system is operable to wirelessly communicate information to a remote receiving device that is remote from the equipped vehicle, and wherein information wirelessly communicated to said remote receiving device comprises information derived, at least in part, from image data captured by at least one of said forward-viewing camera, said driver side-viewing camera and said passenger side-viewing camera.

8. The vehicular control system of claim 7, wherein information wirelessly communicated to said remote receiving device comprises data relevant to the geographic location of the equipped vehicle.

9. The vehicular control system of claim 1, wherein said control, responsive to determination, based at least in part on processing of captured image data at said control, that the equipped vehicle is unintentionally drifting out of a traffic lane that the equipped vehicle is currently travelling in, controls the steering system of the equipped vehicle to adjust steering of the equipped vehicle to mitigate such drift out of the traffic lane the equipped vehicle is travelling in.

10. The vehicular control system of claim 9, wherein said control processes image data captured by at least said forward-viewing camera to estimate distance from the equipped vehicle to a detected leading vehicle that is present exteriorly of the equipped vehicle and within the exterior field of view of said forward-viewing camera, and wherein, when the detected leading vehicle is determined by said control to be within a threshold distance from the equipped vehicle, speed of the equipped vehicle is reduced.

11. The vehicular control system of claim 10, wherein said forward-viewing camera is primary to said at least one radar sensor.

12. The vehicular control system of claim 10, wherein the steering system is manually controllable irrespective of said control.

13. The vehicular control system of claim 1, wherein, responsive at least in part to processing of captured image data at said control detecting a road condition on the road being traveled by the equipped vehicle, speed of the equipped vehicle is adjusted in accordance with the detected road condition.

14. The vehicular control system of claim 1, wherein, responsive at least in part to processing of captured image data at said control, speed of the equipped vehicle is adjusted in accordance with a traffic condition detected by said vehicular control system.

15. The vehicular control system of claim 1, wherein said forward-viewing camera is attached at a windshield of the equipped vehicle, and wherein said forward-viewing camera, when attached at the windshield, views through the windshield exteriorly of the equipped vehicle.

16. The vehicular control system of claim 15, wherein image data captured by at least said forward-viewing camera is processed at said control one way for a headlamp control system of the equipped vehicle and is processed at said control another way for a lane keeping system of the equipped vehicle.

17. The vehicular control system of claim 15, wherein said control, responsive to processing at said control of image data captured by at least said forward-viewing camera, detects headlights of oncoming vehicles within the exterior field of view of said forward-viewing camera when the equipped vehicle is operated under nighttime conditions.

18. The vehicular control system of claim 15, wherein said control, responsive to processing at said control of image data captured by at least said forward-viewing camera, detects taillights of leading vehicles within the exterior field of view of said forward-viewing camera when the equipped vehicle is operated under nighttime conditions.

19. The vehicular control system of claim 18, wherein said control, responsive to determination, based at least in part on processing of captured image data at said control, that the equipped vehicle is unintentionally drifting out of a traffic lane that the equipped vehicle is currently travelling in, controls the steering system of the equipped vehicle to adjust steering of the equipped vehicle to mitigate such drift out of the traffic lane the equipped vehicle is travelling in.

20. The vehicular control system of claim 19, wherein the steering system is manually controllable by a driver of the equipped vehicle irrespective of control by said control.

21. The vehicular control system of claim 1, wherein said driver side-viewing camera is mounted at a driver-side exterior mirror assembly of the equipped vehicle, and wherein said passenger side-viewing camera is mounted at a passenger-side exterior mirror assembly of the equipped vehicle.

22. The vehicular control system of claim 21, wherein said control receives image data captured by a rear-viewing camera mounted at a rear portion of the equipped vehicle and having a field of view at least rearward of the equipped vehicle.

23. The vehicular control system of claim 22, wherein image data captured by said rear-viewing camera at the rear portion of the equipped vehicle and by one or more of said driver side-viewing camera at the driver-side exterior mirror assembly of the equipped vehicle and said passenger side-viewing camera at the passenger-side exterior mirror assembly of the equipped vehicle is provided to a panoramic vision system of the equipped vehicle.

24. The vehicular control system of claim 1, wherein said control, responsive at least in part to processing of captured image data at said control, controls an adaptive cruise control system of the equipped vehicle.

25. The vehicular control system of claim 1, wherein said control processes image data captured by at least said forward-viewing camera for stop light recognition.

26. The vehicular control system of claim 1, wherein said control processes image data captured by at least said forward-viewing camera for traffic sign recognition.

27. The vehicular control system of claim 1, wherein, responsive at least in part to processing at said control of captured image data detecting a curve in the road ahead of the equipped vehicle, speed of the equipped vehicle is reduced to an appropriate speed for traveling around the detected curve, and wherein speed of the equipped vehicle increases after travelling around the detected curve to a speed appropriate for travelling along a generally straight section of road that comes after the curve.

28. The vehicular control system of claim 1, wherein, when the detected other vehicle that is present exterior of the equipped vehicle is determined by said control to be within a threshold distance from the equipped vehicle, speed of the equipped vehicle is reduced.

29. The vehicular control system of claim 1, wherein said control processes captured image data and sensed radar data for an adaptive cruise control system of the equipped vehicle, and wherein, during operation of said adaptive cruise control system, captured image data is primary to sensed radar data.

30. The vehicular control system of claim 1, wherein said control processes captured image data and sensed radar data for an adaptive cruise control system of the equipped vehicle, and wherein, during operation of said adaptive cruise control system, captured image data is secondary to sensed radar data.

31. The vehicular control system of claim 1, wherein said at least one radar sensor having a field of sensing exterior of the equipped vehicle comprises a driver side-sensing radar sensor mounted at a driver-side portion of the equipped vehicle, said driver side-sensing radar sensor having a field of sensing at least sideward and rearward of the equipped vehicle, and wherein said at least one radar sensor having a field of sensing exterior of the equipped vehicle comprises a passenger side-sensing radar sensor mounted at a passenger-side portion of the equipped vehicle, said passenger side-sensing radar sensor having a field of sensing at least sideward and rearward of the equipped vehicle.

32. The vehicular control system of claim 31, wherein said at least one radar sensor having a field of sensing exterior of the equipped vehicle comprises a forward-sensing radar sensor mounted at a front portion of the equipped vehicle, said forward-sensing radar sensor having a field of sensing at least forward of the equipped vehicle.

33. The vehicular control system of claim 32, wherein said control, responsive at least in part to processing at said control of radar sensor data sensed by at least said forward-sensing radar sensor, controls an adaptive cruise control system of the equipped vehicle, and wherein said control, based at least in part on processing at said control of radar sensor data sensed by at least said forward-sensing radar sensor, detects the other vehicle that is present exterior of and ahead of the equipped vehicle and determines distance from the equipped vehicle to the detected other vehicle that is present exterior of and ahead of the equipped vehicle.

34. The vehicular control system of claim 33, wherein said control, based at least in part on processing of captured image data, detects the other vehicle that is present exterior of and ahead of the equipped vehicle.

35. The vehicular control system of claim 1, wherein said at least one radar sensor comprises a Doppler radar sensor.

36. A vehicular control system, said vehicular control system comprising: a plurality of cameras disposed at a vehicle equipped with said vehicular control system; said plurality of cameras at least comprising (i) a forward-viewing camera mounted at a front portion of the equipped vehicle and having a field of view at least forward of the equipped vehicle, said forward-viewing camera operable to capture image data, (ii) a driver side-viewing camera mounted at a driver-side portion of the equipped vehicle and having a field of view at least sideward and rearward of the equipped vehicle at the driver side of the equipped vehicle, said driver side-viewing camera operable to capture image data and (iii) a passenger side-viewing camera mounted at a passenger-side portion of the equipped vehicle and having a field of view at least sideward and rearward of the equipped vehicle at the passenger side of the equipped vehicle, said passenger side-viewing camera operable to capture image data; wherein said forward-viewing camera is attached at a windshield of the equipped vehicle, and wherein said forward-viewing camera, when attached at the windshield, views through the windshield exteriorly of the equipped vehicle; at least one radar sensor having a field of sensing exterior of the equipped vehicle and operable to sense radar data; wherein said at least one radar sensor having a field of sensing exterior of the equipped vehicle comprises a driver side-sensing radar sensor mounted at a driver-side portion of the equipped vehicle, said driver side-sensing radar sensor having a field of sensing at least sideward and rearward of the equipped vehicle; wherein said at least one radar sensor having a field of sensing exterior of the equipped vehicle comprises a passenger side-sensing radar sensor mounted at a passenger-side portion of the equipped vehicle, said passenger side-sensing radar sensor having a field of sensing at least sideward and rearward of the equipped vehicle; wherein said at least one radar sensor having a field of sensing exterior of the equipped vehicle comprises a forward-sensing radar sensor mounted at a front portion of the equipped vehicle, said forward-sensing radar sensor having a field of sensing at least forward of the equipped vehicle; a control comprising at least one processor; wherein image data captured by at least said forward-viewing camera, said driver side-viewing camera and said passenger side-viewing camera is provided to said control; wherein radar data sensed by at least said driver side-sensing radar sensor, said passenger side-sensing radar sensor and said forward-sensing radar sensor is provided to said control; wherein said control, responsive to processing at said control of image data captured at least by said forward-viewing camera, detects lane markers within the exterior field of view of at least said forward-viewing camera on a road being traveled by the equipped vehicle; wherein said control processes captured image data provided thereto via an edge detection algorithm to detect objects present exteriorly of the equipped vehicle and within the exterior field of view of at least one of said forward-viewing camera, said driver side-viewing camera and said passenger side-viewing camera; wherein said control processes sensed radar data provided thereto to detect objects present exteriorly of the equipped vehicle and within the exterior field of sensing of said at least one radar sensor; wherein said control receives data relevant to a geographic location of 
the equipped vehicle, and wherein the geographic location of the equipped vehicle is determined, at least in part, by a global positioning system (GPS); wherein said control, based at least in part on processing of at least one of (i) captured image data and (ii) sensed radar data, detects another vehicle that is present exterior of the equipped vehicle and determines distance from the equipped vehicle to the detected other vehicle that is present exterior of the equipped vehicle; wherein said control, based at least in part on processing of at least one of (i) captured image data and (ii) sensed radar data, controls a steering system of the equipped vehicle; and wherein the steering system is manually controllable irrespective of said control.

37. The vehicular control system of claim 36, wherein said control, responsive to determination, based at least in part on processing of captured image data at said control, that the equipped vehicle is unintentionally drifting out of a traffic lane that the equipped vehicle is currently travelling in, controls the steering system of the equipped vehicle to adjust steering of the equipped vehicle to mitigate such drift out of the traffic lane the equipped vehicle is travelling in.

38. The vehicular control system of claim 37, wherein said control, based at least in part on said detection of the other vehicle present exterior of the equipped vehicle and said determination of distance from the equipped vehicle to the detected other vehicle, determines whether it is safe for the equipped vehicle to execute a lane change maneuver.

39. The vehicular control system of claim 38, wherein said control, responsive to processing at said control of image data captured by at least said forward-viewing camera, detects taillights of leading vehicles within the exterior field of view of said forward-viewing camera when the equipped vehicle is operated under nighttime conditions.

40. The vehicular control system of claim 39, wherein said control processes captured image data and sensed radar data for an adaptive cruise control system of the equipped vehicle.

41. The vehicular control system of claim 40, wherein, responsive at least in part to processing at said control of captured image data detecting a curve in the road ahead of the equipped vehicle, speed of the equipped vehicle is reduced to an appropriate speed for traveling around the detected curve, and wherein speed of the equipped vehicle increases after travelling around the detected curve to a speed appropriate for travelling along a generally straight section of road that comes after the curve.

42. The vehicular control system of claim 36, wherein said driver side-viewing camera is mounted at a driver-side exterior mirror assembly of the equipped vehicle, and wherein said passenger side-viewing camera is mounted at a passenger-side exterior mirror assembly of the equipped vehicle, and wherein said control receives image data captured by a rear-viewing camera mounted at a rear portion of the equipped vehicle and having a field of view at least rearward of the equipped vehicle, and wherein image data captured by said rear-viewing camera at the rear portion of the equipped vehicle and by one or more of said driver side-viewing camera at the driver-side exterior mirror assembly of the equipped vehicle and said passenger side-viewing camera at the passenger-side exterior mirror assembly of the equipped vehicle is provided to a panoramic vision system of the equipped vehicle.

43. The vehicular control system of claim 36, wherein said vehicular control system is operable to wirelessly communicate information to a remote receiving device that is remote from the equipped vehicle, and wherein information wirelessly communicated to said remote receiving device comprises information derived, at least in part, from image data captured by at least one of said forward-viewing camera, said driver side-viewing camera and said passenger side-viewing camera.

44. The vehicular control system of claim 43, wherein information wirelessly communicated to said remote receiving device comprises data relevant to the geographic location of the equipped vehicle.

45. The vehicular control system of claim 36, wherein objects present exteriorly of the equipped vehicle comprise a vehicle, and wherein said control determines that an edge detected via the edge detection algorithm constitutes a portion of the vehicle present exteriorly of the equipped vehicle and tracks the detected edge over multiple frames of captured image data.

46. A vehicular control system, said vehicular control system comprising: a plurality of cameras disposed at a vehicle equipped with said vehicular control system; said plurality of cameras at least comprising (i) a forward-viewing camera mounted at a front portion of the equipped vehicle and having a field of view at least forward of the equipped vehicle, said forward-viewing camera operable to capture image data, (ii) a driver side-viewing camera mounted at a driver-side portion of the equipped vehicle and having a field of view at least sideward and rearward of the equipped vehicle at the driver side of the equipped vehicle, said driver side-viewing camera operable to capture image data and (iii) a passenger side-viewing camera mounted at a passenger-side portion of the equipped vehicle and having a field of view at least sideward and rearward of the equipped vehicle at the passenger side of the equipped vehicle, said passenger side-viewing camera operable to capture image data; wherein said forward-viewing camera is attached at a windshield of the equipped vehicle, and wherein said forward-viewing camera, when attached at the windshield, views through the windshield exteriorly of the equipped vehicle; at least one radar sensor having a field of sensing exterior of the equipped vehicle and operable to sense radar data; a control comprising at least one processor; wherein image data captured by at least said forward-viewing camera, said driver side-viewing camera and said passenger side-viewing camera is provided to said control; wherein radar data sensed by said at least one radar sensor is provided to said control; wherein said control processes captured image data provided thereto via an edge detection algorithm to detect objects present exteriorly of the equipped vehicle and within the exterior field of view of at least one of said forward-viewing camera, said driver side-viewing camera and said passenger side-viewing camera; wherein said control is operable to determine whether a detected edge constitutes a portion of a vehicle; wherein said control processes sensed radar data provided thereto to detect objects present exteriorly of the equipped vehicle and within the exterior field of sensing of said at least one radar sensor; wherein said control receives data relevant to a geographic location of the equipped vehicle, and wherein the geographic location of the equipped vehicle is determined, at least in part, by a global positioning system (GPS); wherein said control, based at least in part on processing of at least one of (i) captured image data and (ii) sensed radar data, detects another vehicle that is present exterior of the equipped vehicle; wherein said vehicular control system is operable to wirelessly communicate information to a remote receiving device that is remote from the equipped vehicle; and wherein information wirelessly communicated to said remote receiving device comprises information derived, at least in part, from image data captured by at least said forward-viewing camera.

47. The vehicular control system of claim 46, wherein information wirelessly communicated to said remote receiving device comprises data relevant to the geographic location of the equipped vehicle.

48. The vehicular control system of claim 47, wherein said control tracks edges detected via said edge detection algorithm over multiple frames of captured image data.

49. The vehicular control system of claim 47, wherein said driver side-viewing camera is mounted at a driver-side exterior mirror assembly of the equipped vehicle, and wherein said passenger side-viewing camera is mounted at a passenger-side exterior mirror assembly of the equipped vehicle, and wherein said control receives image data captured by a rear-viewing camera mounted at a rear portion of the equipped vehicle and having a field of view at least rearward of the equipped vehicle, and wherein image data captured by said rear-viewing camera at the rear portion of the equipped vehicle and by one or more of said driver side-viewing camera at the driver-side exterior mirror assembly of the equipped vehicle and said passenger side-viewing camera at the passenger-side exterior mirror assembly of the equipped vehicle is provided to a panoramic vision system of the equipped vehicle.

50. The vehicular control system of claim 49, wherein said control processes captured image data and sensed radar data for an adaptive cruise control system of the equipped vehicle.

51. The vehicular control system of claim 50, wherein, responsive at least in part to processing at said control of captured image data detecting a curve in a road ahead of the equipped vehicle, speed of the equipped vehicle is reduced to an appropriate speed for traveling around the detected curve, and wherein speed of the equipped vehicle increases after travelling around the detected curve to a speed appropriate for travelling along a generally straight section of road that comes after the curve.

52. The vehicular control system of claim 50, wherein said control, responsive to determination, based at least in part on processing of captured image data at said control, that the equipped vehicle is unintentionally drifting out of a traffic lane that the equipped vehicle is currently travelling in, controls a steering system of the equipped vehicle to adjust steering of the equipped vehicle to mitigate such drift out of the traffic lane the equipped vehicle is travelling in.

53. The vehicular control system of claim 50, wherein said control, based at least in part on said detection of the other vehicle present exterior of the equipped vehicle and said determination of distance from the equipped vehicle to the detected other vehicle, determines whether it is safe for the equipped vehicle to execute a lane change maneuver.

54. The vehicular control system of claim 53, wherein the steering system is manually controllable irrespective of said control.

55. The vehicular control system of claim 47, wherein said at least one radar sensor having a field of sensing exterior of the equipped vehicle comprises a passenger side-sensing radar sensor mounted at a passenger-side portion of the equipped vehicle, said passenger side-sensing radar sensor having a field of sensing at least sideward and rearward of the equipped vehicle, and wherein said at least one radar sensor having a field of sensing exterior of the equipped vehicle comprises a forward-sensing radar sensor mounted at a front portion of the equipped vehicle, said forward-sensing radar sensor having a field of sensing at least forward of the equipped vehicle.

56. The vehicular control system of claim 46, wherein, responsive at least in part to processing of captured image data at said control detecting a road condition on a road being traveled by the equipped vehicle, speed of the equipped vehicle is adjusted in accordance with the detected road condition.

57. The vehicular control system of claim 46, wherein, responsive at least in part to processing of captured image data at said control, speed of the equipped vehicle is adjusted in accordance with a traffic condition detected by said vehicular control system.

58. The vehicular control system of claim 46, wherein said control, responsive to processing at said control of image data captured at least by said forward-viewing camera, detects lane markers within the exterior field of view of at least said forward-viewing camera on a road being traveled by the equipped vehicle.

59. The vehicular control system of claim 58, wherein said control, responsive to processing at said control of image data captured at least by said forward-viewing camera, detects road edges within the exterior field of view of at least said forward-viewing camera on the road being traveled by the equipped vehicle.

60. The vehicular control system of claim 58, wherein said control, based at least in part on processing of captured image data and sensed radar data, detects the other vehicle that is present exterior of the equipped vehicle.


DESCRIPTION




FIELD OF THE INVENTION



The present invention relates generally to vision or imaging systems for vehicles and is related to object detection systems and, more particularly, to imaging systems which are operable to determine if a vehicle or object of interest is adjacent to, forward of or rearward of the subject vehicle to assist the driver in changing lanes or parking the vehicle. The present invention also relates generally to a lane departure warning system for a vehicle.


BACKGROUND OF THE INVENTION



Many lane change aid/side object detection/lane departure warning devices or systems and the like have been proposed which are operable to detect a vehicle or other object that is present next to, ahead of or rearward of the equipped vehicle or in an adjacent lane with respect to the equipped vehicle. Such systems typically utilize statistical methodologies to statistically analyze the images captured by a camera or sensor at the vehicle to estimate whether a vehicle or other object is adjacent to the equipped vehicle. Because such systems typically use statistical methodologies to determine a likelihood or probability that a detected object is a vehicle, and for other reasons, the systems may generate false positive detections, where the system indicates that a vehicle is adjacent to, forward of or rearward of the subject vehicle when there is no vehicle adjacent to, forward of or rearward of the subject vehicle, or false negative detections, where the system, for example, indicates that there is no vehicle adjacent to the subject vehicle when there actually is a vehicle in the adjacent lane.

Such known and proposed systems are operable to statistically analyze substantially all of the pixels in a pixelated image as captured by a pixelated image capture device or camera. Also, such systems may utilize algorithmic means, such as flow algorithms or the like, to track substantially each pixel or most portions of the image to determine how substantially each pixel or most portions of the image has changed from one frame to the next. Such frame by frame flow algorithms and systems may not be able to track a vehicle which is moving at generally the same speed as the equipped vehicle, because there may be little or no relative movement between the vehicles and, consequently, little or no change from one frame to the next. Because the systems may thus substantially continuously analyze substantially every pixel for substantially every frame captured and track such pixels and frames from one frame to the next, such systems may require expensive processing controls and computationally expensive software to continuously handle and process substantially all of the data from substantially all of the pixels in substantially each captured image or frame.
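For illustration only, the arithmetic below (a minimal Python sketch; the focal length, vehicle width, distance, frame rate and closing speed are all assumed values, not taken from this disclosure) works through why such frame-by-frame flow tracking sees little or no change when the other vehicle travels at generally the same speed as the equipped vehicle:

    def image_width_px(focal_px, vehicle_width_m, distance_m):
        # Pinhole-model image width, in pixels, of a vehicle of the given
        # physical width at the given longitudinal distance.
        return focal_px * vehicle_width_m / distance_m

    # Assumed numbers: a 1.8 m wide vehicle at 20 m, 400-pixel focal length.
    w_now = image_width_px(400, 1.8, 20.0)                  # about 36.0 px

    # Zero relative speed: the distance is unchanged on the next frame, so the
    # imaged vehicle is essentially identical and the flow field is near zero.
    w_next_same_speed = image_width_px(400, 1.8, 20.0)      # still 36.0 px

    # Even a 1 m/s closing speed at 30 frames per second changes the distance
    # by only about 0.033 m per frame, growing the image width by roughly
    # 0.06 pixel, which a per-pixel flow tracker can barely register.
    w_next_closing = image_width_px(400, 1.8, 20.0 - 1.0 / 30.0)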

Many automotive lane departure warning (LDW) systems (also known as run off road warning systems) are being developed and implemented on vehicles today. These systems warn a driver of a vehicle when their vehicle crosses the road's lane markings or when there is a clear trajectory indicating they will imminently do so. The warnings are typically not activated if the corresponding turn signal is on, as this implies the driver intends to make a lane change maneuver. Additionally, the warning systems may be deactivated below a certain vehicle speed. The driver interface for these systems may be in the form of a visual warning (such as an indicator light) and/or an audible warning (typically a rumble strip sound). One application warns a driver with an indicator light if the vehicle tire is crossing the lane marker and no other vehicle is detected in the driver's corresponding blind spot; and/or further warns the driver with an audible warning if the vehicle is crossing into the adjacent lane and there is a vehicle detected in the driver's blind spot.
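A minimal sketch of this warning logic follows (hypothetical function and parameter names; the 60 km/h activation speed is an assumption used only for illustration):

    def ldw_warning(on_lane_marker, crossing_into_adjacent_lane, turn_signal_on,
                    speed_kph, blind_spot_occupied, min_speed_kph=60):
        # Suppress warnings for signaled (intentional) maneuvers and at low speed.
        if turn_signal_on or speed_kph < min_speed_kph:
            return None
        # Indicator light: tire on the lane marker and adjacent lane clear.
        if on_lane_marker and not blind_spot_occupied:
            return "visual"
        # Audible warning: crossing into an adjacent lane that is occupied.
        if crossing_into_adjacent_lane and blind_spot_occupied:
            return "audible"
        return None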

There is concern that the current systems will be more of a driver annoyance or distraction than will be acceptable to the consumer market. Using the turn signal as the principal means of establishing to the warning system that the maneuver is intentional does not reflect typical driving patterns and, thus, many intended maneuvers will cause a warning. As a driver gets annoyed by warnings during intended maneuvers, the driver will likely begin to ignore the warnings, which may result in an accident when the warning is appropriate.

Therefore, there is a need in the art for an object detection system, such as a blind spot detection system or lane change assist system or lane departure warning system or the like, which overcomes the shortcomings of the prior art.

SUMMARY OF THE PRESENT INVENTION

The present invention is intended to provide an object detection system, such as a blind spot detection system, a lane change assist or aid system or device, a lane departure warning system, a side object detection system, a reverse park aid system, a forward park aid system, a forward, sideward or rearward collision avoidance system, an adaptive cruise control system, a passive steering system or the like, which is operable to detect and/or identify a vehicle or other object of interest at the side, front or rear of the vehicle equipped with the object detection system. The object detection system of the present invention, such as a lane change assist system, utilizes an edge detection algorithm to detect edges of objects in the captured images and determines if a vehicle is present in a lane adjacent to the equipped or subject vehicle in response to various characteristics of the detected edges, such as the size, location, distance, intensity, relative speed and/or the like. The system processes a subset of the image data captured which is representative of a target zone or area of interest of the scene within the field of view of the imaging system where a vehicle or object of interest is likely to be present. The system processes the detected edges within the image data subset to determine if they correspond with physical characteristics of vehicles and other objects to determine whether the detected edge or edges is/are part of a vehicle or a significant edge or object at or toward the subject vehicle. The system utilizes various filtering mechanisms, such as algorithms executed in software by a system microprocessor, to substantially eliminate or substantially ignore edges or pixels that are not or cannot be indicative of a vehicle or significant object to reduce the processing requirements and to reduce the possibility of false positive signals.
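As a rough illustration of such an edge-based approach, the sketch below (assumed zone bounds and gradient threshold; not the patented implementation) restricts edge detection to a target zone and responds primarily to horizontal edges:

    import numpy as np

    def candidate_edges(gray_frame, target_rows, target_cols, grad_thresh=40):
        # Restrict processing to the target zone where a vehicle of interest
        # is realistically expected to appear.
        r0, r1 = target_rows
        c0, c1 = target_cols
        roi = gray_frame[r0:r1, c0:c1].astype(float)
        # The vertical intensity gradient responds to horizontal edges
        # (bumpers, shadows under vehicles), which are treated here as more
        # indicative of a vehicle than vertical edges.
        grad_y = np.abs(np.diff(roi, axis=0))
        rows, cols = np.nonzero(grad_y > grad_thresh)
        # Return edge coordinates mapped back to full-image coordinates.
        return rows + r0, cols + c0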

In accordance with the present invention, portions or subsets of the image data of the captured image which are representative of areas of interest of the exterior scene where a vehicle or object of interest is likely to be present are weighted and utilized more than other portions or other subsets of the image data of the captured image representative of other areas of the exterior scene where such a vehicle or object of interest is unlikely to be present. Thus, in accordance with the present invention, a reduced set or subset of captured image data is processed based on where geographically vehicles of interest are realistically expected to be in the field of view of the image capture device. More specifically, for example, the control may process and weight the portion of the captured image data set that is associated with a lower portion of the image capture device field of view that is typically directed generally toward the road surface. Preferably, less than approximately 75% of the image data captured by the multi-pixel camera arrangement is utilized for object detection, more preferably, less than approximately 66% of the image data captured by the multi-pixel camera arrangement is utilized for object detection, and most preferably, less than approximately 50% of the image data captured by the multi-pixel camera arrangement is utilized for object detection.
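To make the data-reduction figures concrete, the following sketch (the image size and horizon row are assumed calibration values) selects only the lower, road-directed portion of a captured frame and reports the fraction of pixels actually processed:

    import numpy as np

    def select_target_subset(frame, horizon_row):
        # Keep only the lower portion of the image, where vehicles of
        # interest are realistically expected, and report the fraction of
        # captured pixels that will actually be processed.
        subset = frame[horizon_row:, :]
        used_fraction = subset.size / frame.size
        return subset, used_fraction

    # Example: a 480-row image with the horizon near row 250 processes
    # (480 - 250) / 480, or roughly 48%, of the captured pixels, i.e. under
    # the "less than approximately 50%" figure noted above.
    frame = np.zeros((480, 640), dtype=np.uint8)
    subset, frac = select_target_subset(frame, horizon_row=250)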

It is further envisioned that the control may process or weight image data within the reduced set or subset which is indicative of physical characteristics of a vehicle or object of interest more than other image data within the reduced set which is not likely indicative of or cannot be indicative of such a vehicle or object of interest. The control thus may further reduce the processing requirements within the reduced set or sets of image data of the captured image.

Preferably, a multi-pixel array is utilized, such as a CMOS sensor or a CCD sensor or the like, such as disclosed in commonly assigned U.S. Pat. Nos. 5,550,677; 5,670,935; 5,796,094 and 6,097,023, and U.S. patent application Ser. No. 09/441,341, filed Nov. 16, 1999, now U.S. Pat. No. 7,339,149, which are hereby incorporated herein by reference, or such as an extended dynamic range camera, such as the types disclosed in U.S. provisional application Ser. No. 60/426,239, filed Nov. 14, 2002, which is hereby incorporated herein by reference. Because a multi-pixel array is utilized, the image or portion of the image captured by a particular pixel or set of pixels may be associated with a particular area of the exterior scene and the image data captured by the particular pixel or set of pixels may be processed accordingly.

According to an aspect of the present invention, an object detection system for a vehicle comprises a pixelated imaging array sensor and a control. The imaging array sensor is directed generally exteriorly from the vehicle to capture an image of a scene occurring exteriorly, such as toward the side, front or rear, of the vehicle. The control comprises an edge detection algorithm and is responsive to an output of the imaging array sensor in order to detect edges of objects present exteriorly of the vehicle. The control is operable to process and weight and utilize a reduced image data set or subset representative of a target area of the exterior scene more than other image data representative of other areas of the exterior scene. The target area or zone comprises a subset or portion of the image captured by the imaging array sensor and is representative of a subset or portion of the exterior scene within the field of view of the imaging array sensor. The control thus processes a reduced amount of image data and reduces processing of image data that is unlikely to indicate a vehicle or other object of interest. The imaging array sensor may be directed partially downwardly such that an upper portion of the captured image is generally at or along the horizon.

The control may be operable to process portions of the captured image representative of a target area of the scene and may reduce processing or reduce utilization of other portions of the captured image representative of areas outside of the target area and, thus, reduce the processing of edges or pixels which detect areas where detected edges are likely indicative of insignificant objects or which are not or cannot be indicative of a vehicle or significant object. The control is thus operable to process and weight and utilize image data from certain targeted portions of the captured image more than image data from other portions which are outside of the targeted portions or the target zone or area of interest.

The control may determine whether the detected edges within the target area are part of a vehicle in an adjacent lane in response to various characteristics of the detected edges which may be indicative of a vehicle or a significant object. For example, the control may be operable to process certain areas or portions or subsets of the captured image data or may be operable to process horizontal detected edges and filter out or substantially ignore vertical detected edges. The control may also or otherwise be operable to process detected edges which have a concentration of the edge or edges in a particular area or zone within the captured image. The control thus may determine that one or more detected edges are part of a vehicle in the adjacent lane in response to the edges meeting one or more threshold levels. Also, the control may adjust the minimum or maximum threshold levels in response to various characteristics or driving conditions or road conditions. For example, the control may be operable to process or substantially ignore detected edges in response to at least one of a size, location, intensity, distance, and/or speed of the detected edges relative to the vehicle, and may adjust the minimum or maximum threshold levels or criteria in response to a distance between the detected edges and the subject vehicle, a road curvature, lighting conditions and/or the like.

According to another aspect of the present invention, an imaging system for a vehicle comprises an imaging array sensor having a plurality of photo-sensing or accumulating or light sensing pixels, and a control responsive to the imaging array sensor. The imaging array sensor is positioned at the vehicle and operable to capture an image of a scene occurring exteriorly of the vehicle. The control is operable to process the captured image, which comprises an image data set representative of the exterior scene. The control is operable to apply an edge detection algorithm to the image captured by the imaging array sensor to detect edges or objects present exteriorly of the vehicle. The control may be operable to determine whether the detected edges or objects are indicative of a significant object or object of interest. The control is operable to process a reduced data set or subset of the image data set, which is representative of a target zone or area of the exterior scene, more than other image data representative of areas of the exterior scene which are outside of the target zone. The control thus may process image data of the reduced data set or subset, such as by applying an edge detection algorithm to the reduced data set, and substantially discount or limit processing of the other image data which is outside of the reduced data set or subset of the image or of the target zone of the exterior scene.

The control may be operable to adjust the reduced data set or subset and the corresponding target zone in response to various threshold criterion. The control may be operable to adjust the reduced data set or target zone in response to a distance to a detected edge or object. The control may approximate a distance to a portion of a detected edge or object in response to a location of the pixel or pixels capturing the portion in the captured image. The pixel location may be determined relative to a target pixel which may be directed generally at the horizon and along the direction of travel of the vehicle. For example, the control may be operable to approximate the distance using spherical trigonometry in response to a pixel size, pixel resolution and field of view of the imaging array sensor. The control may access an information array which provides a calculated distance for each pixel within the reduced data set or target zone to approximate the distance to the portion of the detected edge or object.

In order to determine if a detected edge or detected edges is/are part of or indicative of a vehicle, the control may be operable to determine if the detected edge or edges is/are associated with an ellipse or partial ellipse, since the ellipse or partial ellipse may be indicative of a tire of a vehicle near the equipped vehicle, such as a vehicle in a lane adjacent to the equipped vehicle. The control may also be operable to track one or more of the detected edges between subsequent frames captured by the imaging array sensor to classify and/or identify the object or objects associated with the detected edge or edges.

The object detection system or imaging system may comprise a lane change assist system operable to detect vehicles or objects of interest sidewardly of the vehicle. Optionally, the control may be in communication with a forward facing imaging system. The forward facing imaging system may communicate at least one of oncoming traffic information, leading traffic information and lane marking information to the control of the lane change assist system to assist the lane change assist system in readily identifying vehicles at the side of the subject vehicle or adjusting a reduced data set or an area or zone of interest within the captured image. The control may be operable to adjust the reduced data set or target zone in response to the forward facing imaging system.

Optionally, the object detection system or imaging system may comprise a forward facing imaging system, such as a lane departure warning system. The lane departure warning system may provide a warning or alert signal to the driver of the vehicle in response to a detection of the vehicle drifting or leaving its occupied lane.

Optionally, the forward facing imaging system may include or may be in communication with a passive steering system which is operable to adjust a steering direction of the vehicle in response to a detection by the imaging system of the vehicle drifting or leaving its occupied lane. Optionally, the forward facing imaging system may include or may be in communication with an adaptive speed control which is operable to adjust a cruise control or speed setting of the vehicle in response to road conditions or traffic conditions detected by the imaging system. Optionally, the imaging system may be in communication with a remote receiving device to provide image data to a display system remote from the vehicle such that a person remote from the vehicle may receive and view the image data with the remote receiving device to determine the location and/or condition of the vehicle or its occupants.

According to another aspect of the present invention, a lane change assist system for a vehicle comprises an imaging sensor and a control. The imaging sensor is positioned at the vehicle and directed generally sidewardly from the vehicle to capture an image of a scene occurring toward the side of the vehicle. The control is operable to process the image captured by the imaging array sensor to detect objects sidewardly of the vehicle. The captured image comprises an image data set representative of the exterior scene. The control is operable to process a reduced image data set of the image data set more than other image data of the image data set. The reduced image data set is representative of a target zone of the captured image.

The control may be operable to adjust the reduced data set or subset or target zone in response to an adjustment input. In one form, the adjustment input comprises an output from an ambient light sensor, a headlamp control and/or a manual control. The control may be operable to adjust the reduced data set or subset or target zone between a daytime zone and a nighttime zone in response to the output. The control may be operable to adjust a height input for the imaging array sensor such that the daytime zone is generally along the road surface and the nighttime zone is generally at a height of headlamps of vehicles.

In another form, the control may be operable to adjust the reduced data set or subset or target zone in response to a detection of the vehicle traveling through or along a curved section of road. The adjustment input may comprise an output from a forward facing imaging system or a detection by the imaging sensor and control that the vehicle is traveling through a curved section of road, such as by the imaging sensor and control detecting and identifying curved lane markers or the like along the road surface.

It is further envisioned that many aspects of the present invention are suitable for use in other vehicle vision or imaging systems, such as other side object detection systems, forward facing vision systems, such as lane departure warning systems, forward park aid systems or the like, rearward facing vision systems, such as back up aid systems or rearward park aid systems or the like, or panoramic vision systems and/or the like.

The present invention may also or otherwise provide a lane departure warning system that reduces and may substantially eliminate the provision of unwarranted and/or unwanted visual or audible warning signals to a driver of a vehicle when the driver intends to perform the driving maneuver.

According to another aspect of the present invention, a lane departure warning system includes an imaging sensor mounted at a forward portion of a vehicle and operable to capture an image of a scene generally forwardly of the vehicle, and a control for providing a warning signal to a driver of the vehicle in response to an image captured by the imaging sensor. The control is operable to process the image captured to detect at least one of a lane marking, a road edge, a shoulder edge and another vehicle or object. The lane departure warning system provides the warning signal in response to a detected object or marking and further in response to the vehicle speed or other parameters which provide additional information concerning the likelihood that a warning signal is necessary.

Therefore, the present invention provides an object detection system or imaging system, such as a lane change assist system or other type of object detection or imaging system, which is operable to detect and identify vehicles or other objects of interest exteriorly, such as sidewardly, rearwardly, and/or forwardly of the subject vehicle. The imaging system may primarily process image data within a reduced data set or subset of the captured image data, where the reduced data set is representative of a target zone or area of interest within the field of view of the imaging system, and may adjust the reduced data set or zone or area in response to various inputs or characteristics, such as road conditions, lighting or driving conditions and/or characteristics of the detected edges or objects. The imaging system of the present invention is operable to detect edges of objects, and particularly horizontal edges of objects, to provide improved recognition or identification of the detected objects. The imaging system of the present invention may be operable to limit processing of or to filter or substantially eliminate or reduce the effect of edges or characteristics which are indicative of insignificant objects, thereby reducing the level of processing required on the captured images.

The edge detection process or algorithm of the lane change assist system of the present invention thus may provide for a low cost processing system or algorithm, which does not require the statistical methodologies and computationally expensive flow algorithms of the prior art systems. Also, the edge detection process may detect edges and objects even when there is little or no relative movement between the subject vehicle and the detected edge or object. The present invention thus may provide a faster processing of the captured images, which may be performed by a processor having lower processing capabilities than the processors required for the prior art systems. The lane change assist system may also provide a low cost and fast approximation of a longitudinal and/or lateral and/or total distance between the subject vehicle and a detected edge or object exteriorly of the vehicle and may adjust a threshold detection level in response to the approximated distance.

These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.


BRIEF DESCRIPTION OF THE DRAWINGS



FIG. 1 is a top plan view of a vehicle equipped with a lane change assist system in accordance with the present invention, as the vehicle travels along a section of road;

FIG. 2 is a side elevation of the vehicle of FIG. 1;

FIG. 3 is a schematic of a vehicle equipped with the lane change assist system of the present invention as the vehicle travels along a section of road;

FIG. 4 is another schematic of the vehicle and depicts how the lane change assist system of the present invention adapts between daytime and nighttime driving conditions;

FIG. 5 is a perspective view of a vehicle as it may be viewed by a camera or image sensor of the lane change assist system of the present invention;

FIG. 6 is a top plan view of a vehicle equipped with the lane change assist system of the present invention, as the vehicle travels around a sharp curve in a section of road;

FIG. 7 is a schematic of a pixelated image as may be captured by a camera or image sensor of a lane change assist system in accordance with the present invention;

FIG. 8 is a schematic of another pixelated image similar to FIG. 7;

FIG. 9 is a block diagram of a top plan view of a vehicle equipped with the lane change assist system of the present invention and another vehicle as they travel along a section of road in adjacent lanes to one another;

FIG. 10 is a block diagram of a lane change assist system in accordance with the present invention;

FIGS. 11A-F are diagrams of a virtual camera and characteristics thereof useful in calculating a distance from the camera of the lane change assist system to an object detected in the field of view of the camera;

FIG. 12 is a top plan view of a vehicle driving along a road and incorporating a lane departure warning system of the present invention; and

FIG. 13 is another top plan view of the vehicle driving along a road, with another vehicle in an adjacent lane.


DESCRIPTION OF THE PREFERRED EMBODIMENTS



Referring now to the drawings and the illustrative embodiments depicted therein, an object detection system or imaging system, such as a lane change assist or aid system 10, is positioned at a vehicle 12 and is operable to capture an image of a scene occurring sidewardly and rearwardly at or along one or both sides of vehicle 12 (FIGS. 1-4 and 6). Lane change assist system 10 comprises an image capture device or sensor or camera 14 and a control 16 (FIGS. 3, 9 and 10). Camera 14 captures an image of the scene occurring toward a respective side of the vehicle 12, and control 16 processes the captured image to determine whether another vehicle 18 is present at the side of vehicle 12, as discussed below. Control 16 may be further operable to activate a warning indicator or display or signal device 17 (FIG. 10) to alert the driver of vehicle 12 that another vehicle is present at the side of vehicle 12. The warning or alert signal may be provided to the driver of vehicle 12 in response to another vehicle being detected at the blind spot area (as shown in FIG. 1) and may only be provided when the driver of vehicle 12 actuates a turn signal toward that side or begins turning the subject vehicle 12 toward that side to change lanes into the lane occupied by the other detected vehicle 18.

Camera or imaging sensor 14 of object detection system or lane change assist system 10 is operable to capture an image of the exterior scene within the field of view of the camera. The captured image comprises an image data set, which is representative of the exterior scene, and which is received by control 16. Control 16 is operable to process image data within a reduced data set or subset of the image data set more than other image data of the image data set to reduce the processing requirements of the control. The reduced data set or subset or subsets is/are representative of a target zone or area or areas in the exterior scene where a vehicle or other object of interest may realistically be expected to be present within the exterior scene. The control is thus operable to primarily process the significant or relevant area or areas of the scene more than less relevant areas, and may limit or reduce processing of or substantially ignore the image data representative of some areas of the exterior scene where it is not likely that a vehicle or other object of interest would be present or where a vehicle cannot be present.

Camera or imaging sensor 14 may comprise an imaging array sensor, such as a CMOS sensor or a CCD sensor or the like, such as disclosed in commonly assigned U.S. Pat. Nos. 5,550,677; 5,670,935; 5,796,094 and 6,097,023, and U.S. patent application Ser. No. 09/441,341, filed Nov. 16, 1999, now U.S. Pat. No. 7,339,149, which are hereby incorporated herein by reference, or an extended dynamic range camera, such as the types disclosed in U.S. provisional application Ser. No. 60/426,239, filed Nov. 14, 2002, which is hereby incorporated herein by reference. The imaging sensor 14 may be implemented and operated in connection with other vehicular systems as well, or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. No. 5,796,094, which is hereby incorporated herein by reference, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454 and/or 6,320,176, which are hereby incorporated herein by reference, a vehicle vision system, such as a forwardly or sidewardly or rearwardly directed vehicle vision system utilizing the principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935 and 6,201,642, and/or in U.S. patent application Ser. No. 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610, which are hereby incorporated herein by reference, a traffic sign recognition system, a system for determining a distance to a leading vehicle or object, such as utilizing the principles disclosed in U.S. Pat. No. 6,396,397, which is hereby incorporated herein by reference, and/or the like.

Camera 14 preferably comprises a pixelated imaging array sensor which has a plurality of photon accumulating light sensors or pixels 14a. The camera includes circuitry which is operable to individually access each photosensor pixel or element of the array of photosensor pixels and to provide an output or image data set associated with the individual signals to the control 16, such as via an analog to digital converter (not shown). As camera 14 receives light from objects and/or light sources in the target scene, the control 16 may then be operable to process the signal from at least some of the pixels to analyze the image data of the captured image, as discussed below.

Camera 14 may be positioned along one or both sides of vehicle 12, such as at or within the exterior rearview mirror 12a at either or both sides of vehicle 12. However, camera 14 may be positioned elsewhere along either or both sides and/or at the rear of the vehicle and directed sidewardly and rearwardly from the vehicle to capture an image at either side of the vehicle, without affecting the scope of the present invention. Camera 14 may be positioned at vehicle 12 and oriented or angled downwardly so as to capture an image which has an upper edge or region generally at the horizon 15, as can be seen with reference to FIGS. 2, 3 and 11C. Positioning or orienting the camera 14 in such a manner provides for an increased horizontal pixel count across the captured image at the important areas along the side of vehicle 12, since any vehicle or significant object positioned at or along a side of the subject vehicle will be substantially below the horizon and thus substantially within the captured image. The lane change assist system of the present invention thus may provide an increased portion of the captured image or increased pixel count at important or significant or relevant areas of the exterior scene, since the area well above the road or horizon is not as significant to the detection of a vehicle at or along a side of the subject vehicle. Additionally, positioning the camera to be angled generally downwardly also reduces the adverse effects that the sun and/or headlamps of other vehicles may have on the captured images. Camera 14 thus may be operable to capture substantially an entire image of the sideward scene below the horizon.

Control 16 is responsive to camera 14 and processes the signals received from at least some of the pixels of camera 14 to determine what is in the captured image. The present invention utilizes physical characteristics of vehicles and roads to reduce or filter out or substantially eliminate the signals from some of the pixels and to reduce or eliminate signals or detected images indicative of certain insignificant or unimportant objects detected within the captured image, as discussed below. For example, control 16 may primarily process the image data from pixels of camera 14 that are within a reduced data set or subset of the image data of the captured image. The reduced data set of the captured image may be representative of a targeted area or zone of interest of the exterior scene being captured by the camera. The targeted zone may be selected because it encompasses a geographic area of the exterior scene where a vehicle or other object of interest is likely to be present, while the other image data or areas or portions of the captured image may be representative of areas in the exterior scene where a vehicle or other object of interest is unlikely to be present or cannot be present, as discussed below. The present invention thus may provide for a quicker response time by control 16, since the control 16 does not continually process the signals from substantially all of the pixels 14a of camera 14. Preferably, less than approximately 75% of the image data captured by the camera is utilized for object detection, more preferably, less than approximately 66% of the captured image data is utilized for object detection, and most preferably, less than approximately 50% of the captured image data is utilized for object detection.
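By way of a non-limiting illustration (the frame size, zone bounds and function names below are hypothetical and are not taken from the disclosure), restricting processing to such a reduced data set may be sketched as simply slicing the pixel array to the target zone before any further processing; in this sketch the lower band of the image accounts for roughly a quarter of the captured pixels, consistent with utilizing well under half of the captured image data:

```python
import numpy as np

def reduced_data_set(frame: np.ndarray, zone_rows: slice, zone_cols: slice) -> np.ndarray:
    """Return only the portion of the captured frame covering the target zone.

    frame      -- 2-D grayscale pixel array from the imaging sensor
    zone_rows  -- rows of the frame that look at the adjacent-lane road space
    zone_cols  -- columns of the frame that look at the adjacent-lane road space
    """
    return frame[zone_rows, zone_cols]

# Hypothetical 480x640 frame; process only a lower band of the image,
# which keeps the fraction of pixels used for detection well under 50%.
frame = np.zeros((480, 640), dtype=np.uint8)
subset = reduced_data_set(frame, slice(300, 480), slice(100, 540))
fraction_used = subset.size / frame.size
print(f"fraction of captured pixels processed: {fraction_used:.2f}")  # ~0.26
```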

Control 16 may include a microprocessor having an edge detection algorithm or function 16a (FIG. 10) which is operable to process or is applied to the image data received from the individual pixels to determine whether the image captured by the pixels defines an edge or edges of a significant object, such as an edge or edges associated with or indicative of a bumper 18a of a vehicle 18 or the like. The edge detection function or algorithm 16a of control 16 allows lane change assist system 10 to interrogate complex patterns in the captured image and separate out particular patterns or edges which may be indicative of a vehicle in the adjacent lane, and to substantially ignore or limit processing of other edges or patterns which are not or cannot be indicative of a vehicle and thus are insignificant to lane change assist system 10. Other information or image data in the captured image or frame which is not associated with edges or which is not associated with significant edges (e.g. edges indicative of a portion of a vehicle), may then be substantially ignored or filtered out by control 16 via various filtering processes or mechanisms discussed below to reduce the information or image data being processed by control 16 and to reduce the possibility of a false positive detection by control 16. The edge detection function or algorithm 16a may comprise a Sobel gradient edge detection algorithm or other edge detection algorithms commercially available, and such as disclosed in U.S. Pat. Nos. 6,353,392 and 6,313,454, which are hereby incorporated herein by reference.

Control 16 may be operable to determine which edges detected are horizontal or generally horizontal edges and to limit processing of or to partially filter out or substantially ignore vertical edges. This may be preferred, since many edges in a vehicle in an adjacent lane will be horizontal or parallel to the road surface, such as edges associated with bumper lines, grills, fenders, and/or the like. Control 16 may thus reject or substantially ignore edges which are non-horizontal, thereby reducing the data to be processed. The edge detection algorithm 16a may also provide digital polarization of the captured images to determine horizontal gradients and to substantially ignore the effects of vertical gradients of the detected edges. For example, the edge detection algorithm may use a convolution matrix (such as a one by three matrix or other small matrix or array) which may be processed or applied to the image data in a single pass across the data received from the pixels 14a of the camera 14 to provide horizontally polarized edge detection through the captured image or a portion thereof. Such horizontal polarization greatly reduces the possibility that road signs and/or guardrails and/or the like will be processed and analyzed by the control of the lane change assist system of the present invention, thereby reducing the processing requirements and reducing the possibility of a false positive signal by the control.
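As a rough, hedged sketch of such single-pass, horizontally polarized edge detection (the kernel orientation, values and threshold are illustrative assumptions, not values disclosed herein), a small row-difference kernel applied down each pixel column responds strongly to horizontal edges while contributing little for vertical structure:

```python
import numpy as np

def horizontal_edge_map(frame: np.ndarray, threshold: float = 40.0) -> np.ndarray:
    """Boolean map of horizontally oriented edges.

    A small row-difference kernel is applied down each pixel column in a
    single pass: intensity changes between rows (bumper lines, shadow edges,
    grill lines) give a strong response, while predominantly vertical
    structure (poles, guardrail uprights) contributes little and is in
    effect filtered out.
    """
    img = frame.astype(float)
    grad = np.zeros_like(img)
    grad[1:-1, :] = img[2:, :] - img[:-2, :]   # row-wise central difference
    return np.abs(grad) >= threshold
```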

Additionally, the edge detection algorithm 16a of control 16 may function to detect and determine if there is more than one vehicle present at the side of the subject vehicle 12. For example, control 16 may distinguish between edges constituting the fronts of different vehicles and edges constituting the front and side of the same vehicle, since the vehicle fronts typically will have more horizontal edges than the vehicle sides.

In order to further reduce the processing requirements and the possibility of a false positive indication, and thus enhance the response time and system performance, control 16 may process signals or image data from pixels that are oriented or targeted or arranged or selected to capture images of objects or items that are at least partially positioned within a predetermined or targeted area or zone of interest. The zone of interest may be defined by an area or region at the side of the subject vehicle where another vehicle or significant object may be positioned, such as in the blind spot region of that side of the vehicle, which would be significant or important to lane change assist system 10. For example, the zone of interest or "polygon of interest" may be directed rearward from the camera and toward or around the center of the adjacent lane. By substantially isolating the zone of interest, or substantially filtering out or substantially ignoring or reducing utilization of edges or signals or image data of the captured image which are representative of areas outside of the zone or area of interest, the system of the present invention may reduce the image data or information to be processed by control 16 and may substantially reduce the possibility that a false positive signal will occur. For example, if an object is detected substantially to one side or the other or substantially at the bottom of the captured image, such an object is not likely to be a vehicle positioned within the blind spot area of the subject vehicle 12, whereby control 16 may reduce processing of or may not process image data from the pixels capturing that area of the scene or may substantially ignore such a detected edge or object in subsequent processing of the image data captured by the pixels 14a of camera 14.

It is further envisioned that control 16 may process the image data of pixels capturing images representative of an area within the zone of interest and may not indicate a positive signal of a vehicle or other significant object in the adjacent lane unless a detected edge within the reduced image data set or subset or zone of interest is greater than a minimum size threshold, or spans a threshold number of pixels. Optionally, control 16 may require that a detected edge span or include a threshold number of pixels that are within a predetermined "hot zone" or specific targeted area within the zone of interest before the edge will be considered significant for further processing. The targeted zone or hot zone may be defined as a reduced zone or area near the center of the zone of interest or targeted road space or adjacent lane. The control 16 thus may require a substantial portion of the detected edge or edges to be within the smaller hot zone before the control may consider the edges to constitute a portion of a vehicle in the adjacent lane or other significant object. This also may substantially reduce the processing requirements and may substantially reduce the possibility of a false positive signal being generated by control 16.
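One possible, purely illustrative formulation of such a pixel-count and hot-zone criterion (the threshold values and parameter names are assumptions) is:

```python
import numpy as np

def edge_is_significant(edge_rows, edge_cols, hot_zone,
                        min_edge_pixels: int = 8,
                        min_hot_zone_pixels: int = 5) -> bool:
    """Decide whether a detected edge merits further processing.

    The edge must cover at least min_edge_pixels overall, and at least
    min_hot_zone_pixels of it must fall inside the smaller hot zone near the
    centre of the zone of interest.  hot_zone is (row_min, row_max, col_min,
    col_max) in pixel coordinates; all thresholds here are illustrative.
    """
    rows = np.asarray(edge_rows)
    cols = np.asarray(edge_cols)
    if rows.size < min_edge_pixels:
        return False
    r0, r1, c0, c1 = hot_zone
    in_hot = (rows >= r0) & (rows <= r1) & (cols >= c0) & (cols <= c1)
    return int(in_hot.sum()) >= min_hot_zone_pixels
```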

The reduced image data set of the captured image which is representative of the zone of interest of the exterior scene may be adjusted by control 16 in response to various road or driving conditions, lighting conditions, and/or characteristics of the detected edges or objects. The reduced data set or zone of interest thus may be adaptable to various conditions encountered by the vehicle, such that the control may further reduce the processing requirements and enhance the efficiency of the system by primarily processing image data from some pixels and ignoring image data from other pixels depending on the present conditions surrounding the vehicle.

For example, as shown in FIG. 4, control 16 may be operable to adjust or adapt the image data subset or zone or area of interest between daytime and nighttime driving conditions. During daytime driving conditions, detecting the edge of the front horizontal shadow 18b (FIG. 4) of a vehicle 18 or the bumper 18a of a vehicle 18 may be the method for significant object or vehicle detection by the lane change assist system of the present invention. However, during nighttime driving conditions, where such vehicle characteristics may not be visible to the camera 14, the primary detection characteristic may be the headlights 18c of a vehicle 18 approaching from the rear of the subject vehicle. Control 16 may thus adjust or adapt the reduced data set or target zone in response to an output or signal from an ambient light sensor (which detects the ambient light intensity present at or around the subject vehicle), a headlamp control, a headlamp switch, a manual control or input and/or the like (shown generally at 20 in FIG. 10). More particularly, the reduced data set or zone of interest may be raised to correspond to the typical height or range of height of a headlight of a typical vehicle, so that control 16 may primarily process image data from pixels which receive light from headlamps of vehicles in the adjacent lane.

As shown in FIG. 4, the reduced data set or zone may be adjusted mathematically by changing the height (γ, γ') of the camera as input to the control (such as between a daytime camera height shown generally at γ and a nighttime camera height shown generally at γ'), such that all of the geometry of the zone of interest is adjusted upward. Because headlights of vehicles are generally within a certain distance or range above the road surface, the control may be operable to adjust the reduced data set or zone of interest to adapt to this geometric change in the detection characteristic. A daytime perspective triangle associated with the camera is shown generally at D in FIG. 4, while a nighttime perspective triangle associated with the camera is shown generally at N in FIG. 4.
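A minimal sketch of such an adjustment input, assuming a hypothetical ambient-light threshold and headlamp-state input (neither of which is a disclosed value), might simply select between the two height inputs:

```python
def effective_camera_height(gamma_day_m: float, gamma_night_m: float,
                            ambient_lux: float, headlamps_on: bool,
                            night_lux_threshold: float = 50.0) -> float:
    """Select the camera-height value fed to the zone-of-interest geometry.

    During daytime the zone of interest is kept along the road surface
    (gamma_day_m); at night a different height input (gamma_night_m) is used
    so that the whole perspective geometry, and thus the zone of interest,
    shifts toward typical headlamp height.  The lux threshold is illustrative.
    """
    if headlamps_on or ambient_lux < night_lux_threshold:
        return gamma_night_m
    return gamma_day_m
```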

It is further envisioned that the reduced data set or area or zone of interest may be changed or adapted to accommodate sharp curves in the road that the subject vehicle 12 is traveling through or has traveled through. In situations where a vehicle travels along a sharp curve in the road, a lane change assist system may consider a guardrail or vehicle 18' in another lane to be a vehicle or object of interest in the adjacent lane, since the other vehicle or object may be positioned generally at or near the zone of interest of the lane change assist system, as can be seen in FIG. 6. Control 16 may be operable to process the image data or signals from the pixels 14a of camera 14 to determine lane markers along the road, or a shoulder of the road or the like, in order to determine the road curvature as the vehicle 12 travels along the section of road. In situations where a sharp curve in the road is detected, control 16 may be operable to alter or reshape the reduced data set or area or zone of interest and/or to adjust the distance thresholds (discussed below) or to adjust other filtering characteristics or criteria or thresholds to accommodate such a curve in the road. The lane change assist system of the present invention thus may use road curvature information to adjust how far back and/or where the camera and/or control may look for significant objects or vehicles. The lane change assist system thus substantially avoids providing a false positive signal upon detection of another vehicle 18' or guardrail or the like which is not in the adjacent lane, since such a vehicle or object may not be within the adjusted zone of interest of the lane change assist system.

Optionally, control 16 may be further operable to substantially eliminate or substantially ignore image data representative of objects or edges which are too large or too small to be considered part of a vehicle in the adjacent lane. If a detected edge is too small, such as if the horizontal pixel span or vertical pixel span is very small, the control may reduce processing of the edge or the edge may be removed from further processing, since it does not represent a significant edge to the lane change assist system 10. Likewise, if an edge is too large, the control may reduce processing of the edge or it may also be removed from further processing since it does not represent a vehicle in the adjacent lane. The threshold size of the detected edge or object may also vary in response to the distance to the edge or object, as discussed below.
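A hedged sketch of such a size filter (the fixed bounds shown are illustrative; as noted, the limits would typically vary with the approximated distance to the edge) might be:

```python
def passes_size_filter(h_span_px: int, v_span_px: int,
                       min_span_px: int = 4, max_span_px: int = 200) -> bool:
    """Discard detected edges whose pixel extent is too small or too large to
    be part of a vehicle in the adjacent lane.  The fixed bounds used here
    are illustrative; as noted in the text, the size thresholds would
    normally vary with the approximated distance to the edge.
    """
    largest_span = max(h_span_px, v_span_px)
    return min_span_px <= largest_span <= max_span_px
```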

Additionally, lane change assist system 10 may be operable to determine whether a detected edge or object is a vehicle in an adjacent lane in response to one or more other detection thresholds or criteria. Further, control 16 may be operable to vary one or more detection thresholds or criteria at which a detected edge or object is considered a vehicle or significant object. The threshold values may thus be variable and may be adjusted in response to driving conditions, road curvature, location of the detected edges and/or the distance between the camera and the detected object and/or the like. For example, the threshold value or values may be adjusted in response to the distance so that control 16 more readily accepts and processes detected edges as the object they are representative of gets closer to or approaches the subject vehicle.

For example, control 16 may have a minimum gradient threshold at which control 16 determines whether or not a detected edge is to be included in further processing of the captured image. Control 16 thus may be operable to determine the vertical and/or horizontal gradient of the detected edges and may substantially eliminate or filter out edges with a gradient below a threshold gradient level, since such edges cannot be representative of a vehicle or object which is significant to the lane change assist system. The control thus may further substantially preclude false positive signals and reduce further processing of the pixel signals.

However, as an object or other vehicle approaches the subject vehicle 12, the detected edge or edges representative of the object tend to resolve, which reduces and spreads the gradient across multiple pixels, thereby reducing the gradient at any particular pixel. Control 16 thus may be further operable to adjust the minimum gradient threshold in response to the distance to the detected object. By using a calculated or estimated or approximated distance to the detected object or a table of perspective calculations or distance approximations, discussed below, the minimum gradient threshold may be reduced proportionally in response to the estimated or tabulated distance data to provide enhanced edge detection at closer ranges.
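As a non-limiting sketch, assuming a simple proportional rule and illustrative constants (not disclosed values), the minimum gradient threshold might be scaled with the approximated distance as follows:

```python
def minimum_gradient_threshold(distance_m: float,
                               base_threshold: float = 60.0,
                               reference_distance_m: float = 20.0,
                               floor: float = 10.0) -> float:
    """Scale the minimum-gradient criterion down as the detected object gets
    closer, since a near object spreads its edge over more pixels and the
    per-pixel gradient drops.  A simple proportional rule with illustrative
    constants is assumed; the distance could come from the tabulated
    per-row distance data described below.
    """
    scaled = base_threshold * (distance_m / reference_distance_m)
    return max(floor, min(base_threshold, scaled))
```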

By detecting edges of objects within the reduced data set or zone or area of interest (and adjusting the zone of interest for particular driving conditions or situations), and by focusing on or concentrating on or primarily processing the horizontal edges detected or other edges which may be indicative of a vehicle or significant object, while substantially filtering out or substantially ignoring other image data or edges or information, the present invention substantially reduces the possibility of false positive signals. In order to further reduce the possibility of such false positive signals, control 16 may be operable to determine a distance between a detected object and the subject vehicle to further filter out or substantially eliminate objects that are not within a predetermined range or threshold distance from the subject vehicle and which, thus, may be insignificant to the lane change assist system of the present invention.

In a preferred embodiment, camera 14 and control 16 may be operable to approximate the distance to an object or vehicle in response to a pixel count of the number of pixels between the pixels capturing the object (or an edge of the object) and the pixels along an edge of the camera or directed toward and along the horizon of the captured image. More particularly, with the camera 14 oriented with the video frame horizontal scan lines or pixels being generally parallel to the horizon, perspective calculations may be made to provide a table of entries of particular distances which correspond to particular horizontal lines or pixels in the video frame which may detect or sense a forward edge of an adjacent vehicle at the ground level, such as an edge corresponding to a shadow of the front of the vehicle 18 or an edge corresponding to the intersection of the tire 18d of the vehicle 18 on the road surface or the like. The distance to an object captured by an edge detected in the captured image may then be approximated by determining a vertical pixel count and retrieving the appropriate distance entry corresponding to that particular horizontal line or pixel count or position. The present invention thus provides for a quick and inexpensive means for determining or estimating or approximating the distance between the subject vehicle and an object or edge detected in the area or zone of interest by determining a horizontal line count from the horizon down to the pixels capturing the detected edge.
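A minimal sketch of such a perspective table, assuming illustrative camera parameters and that detected edges lie on the road surface, might precompute one distance entry per horizontal line below the horizon:

```python
import math

def build_row_distance_table(camera_height_m: float,
                             vertical_fov_deg: float,
                             vertical_resolution: int) -> list:
    """Precompute the ground distance for each pixel row below the horizon so
    that range estimation at run time is a single table lookup on the
    vertical pixel count.  Assumes the detected edge (front shadow or tire
    contact) lies on the road surface.
    """
    deg_per_pixel = vertical_fov_deg / vertical_resolution
    table = []
    for rows_below_horizon in range(1, vertical_resolution + 1):
        beta = math.radians(rows_below_horizon * deg_per_pixel)
        table.append(camera_height_m * math.tan(math.pi / 2.0 - beta))
    return table

# Usage: distance to an edge detected 37 rows below the horizon.
table = build_row_distance_table(camera_height_m=1.0,
                                 vertical_fov_deg=30.0,
                                 vertical_resolution=240)
print(f"approximate longitudinal distance: {table[37 - 1]:.1f} m")
```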

As can be seen with reference to FIGS. 3 and 7-9 and as discussed below, the location and distance of a closest point φ on a detected edge or object relative to camera 14 or subject vehicle 12 may be calculated based on known or assigned parameters of the location of camera 14 and a horizontal and vertical pixel count between a target or alignment point 14b (FIGS. 7 and 8) and the closest point φ. This may be accomplished because the lowest detected edge of a vehicle may be considered to be indicative of a forward shadow of the front bumper of the vehicle on the road surface or may be indicative of the intersection of the tire and the road surface. Because such edges are generally at or near the road surface, the distance to the detected object may be calculated using known geometrical equations given the height of the camera on the subject vehicle (as input to the control). Control 16 thus may quickly determine the distance to a detected object and may be easily calibrated for different applications of the lane change assist system. The calculated distances corresponding to at least some of the pixels 14a of camera 14 may be entered into a table or database, such that control 16 may be operable to quickly obtain an estimated distance between the camera and the closest point of a detected edge or object once at least the vertical pixel count between the closest point φ and the horizon or target or alignment point 14b is determined.

More particularly, in order to determine the total distance between camera 14 and the closest point of a detected edge or object, the lateral distance ψ and longitudinal distance δ may be calculated and used to obtain the total distance τ. Because the lateral distance ψ should be approximately constant for an edge or vehicle detected in the zone or area corresponding to the adjacent lane, the lane change assist system 10 may only calculate or tabulate and access the longitudinal distance δ for the detected edges, whereby the distances may be calculated and tabulated for each horizontal line count down from the horizon or target point. More particularly, the longitudinal distance δ may be calculated or approximated by determining a pixel count (Pixels_β) downward from the horizon 15 to the detected edge or point φ. The pixel count may be used to obtain a value for the downward angle β (FIG. 3) between camera 14 and the detected object, which is derived from the following equation (1): β = Pixels_β*v; (1) where v is the vertical view angle per pixel of the camera and is obtained via the following equation (2): v = (Optical Field Height Degrees)/(Vertical Pixel Resolution); (2) where the Optical Field Height Degrees is the vertical angle of view of the camera and the Vertical Pixel Resolution is the number of horizontal rows of pixels of the camera. The downward angle β is then calculated to determine the angle between the horizon and the forward edge of the detected object at the ground. The longitudinal distance δ between the vehicles may then be determined or approximated by the following equation (3): δ = γ*tan(90° − β); (3) where γ is the height of the camera 14 above the ground as input to the control 16, and as best shown with reference to FIG. 3. As discussed above, the height input to control 16 may be adjusted between γ and γ' (FIG. 4) to adjust the zone of interest for daytime versus nighttime driving conditions. Such an adjustment also adjusts the distance calculations to determine the distance to the detected headlamps, which are above the ground or road surface.

Likewise, if desired, the lateral or sideward location or distance ψ to the closest point φ on the detected edge or object may be calculated by obtaining a horizontal pixel count Pixel_βmin, such as by counting or determining the pixels or pixel columns from the alignment point 14b horizontally across the captured image to the pixel column corresponding to the closest point φ. This pixel count value may be used to calculate the lateral distance to the detected edge or object, which may in turn be used to calculate or estimate the total distance to the detected object. More particularly, the lateral angle ω (FIG. 9) between camera 14 at the side of vehicle 12 and the detected object may be determined by the following equation (4): ω = Pixel_βmin*λ; (4) where λ is the horizontal view angle per pixel of the camera and is obtained via the following equation (5): λ = (Optical Field Width Degrees)/(Horizontal Pixel Resolution); (5) where the Optical Field Width Degrees of camera 14 is the angle of view of the camera and the Horizontal Pixel Resolution is the number of columns of pixels of camera 14.
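The following sketch is a direct, hedged transcription of equations (1) through (5) (the parameter names are mine, and degree/radian handling is an implementation assumption):

```python
import math

def longitudinal_distance_m(pixels_below_horizon: int,
                            optical_field_height_deg: float,
                            vertical_pixel_resolution: int,
                            camera_height_m: float) -> float:
    """Equations (1)-(3): per-pixel vertical view angle, downward angle to the
    detected edge, and longitudinal distance along the road."""
    v = optical_field_height_deg / vertical_pixel_resolution       # eq. (2)
    beta = math.radians(pixels_below_horizon * v)                  # eq. (1)
    return camera_height_m * math.tan(math.radians(90.0) - beta)   # eq. (3)

def lateral_angle_deg(pixels_from_alignment_point: int,
                      optical_field_width_deg: float,
                      horizontal_pixel_resolution: int) -> float:
    """Equations (4)-(5): per-pixel horizontal view angle times the horizontal
    pixel count from the alignment point gives the lateral angle."""
    lam = optical_field_width_deg / horizontal_pixel_resolution    # eq. (5)
    return pixels_from_alignment_point * lam                       # eq. (4)
```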

Optionally, the lateral angle ω (FIG. 9) between camera 14 at the side of vehicle 12 and the detected object may be determined using spherical trigonometry, which may provide a more accurate lateral angle ω determination than equations 4 and 5 above. Using spherical trigonometry, discussed below, or the equations set forth above, a table (image space) of horizontal angles may be produced at initialization or startup of the lane change assist system 10 to determine the horizontal angle for each pixel position on the captured image. Because the horizontal angle is not independent of the vertical angle, an image space may be created and the horizontal view angle of every pixel may be stored therein. An example of such an image space or array is depicted in FIG. 11F.

In determining the perspective geometry, the parameters of a virtual camera 14' are determined or assigned and implemented (see FIGS. 11A-11E). The virtual camera 14' does not actually exist, but calculations may be made to determine an effective focal length (in pixels) of the virtual camera. To work with the perspective geometry, spherical trigonometry may be employed to determine where each pixel on the camera is directed toward. In spherical trigonometry, lateral angles may be calculated based on both horizontal and vertical pixel positions of the detected edge grouping or point. The relationship between horizontal angles and vertical angles may be used to calculate or generate a table of horizontal angles and/or distances to an edge or object detected by each pixel.

The virtual camera geometry may be calculated and used to determine the relationship between each pixel of the captured image and the location on the road surface that the pixel corresponds to. These calculations may be based on an assumption that lines perpendicular to the direction of travel of the subject vehicle may be on a plane which is generally parallel to the horizon and, thus, parallel to the image or pixel lines or rows, since the camera is positioned or oriented such that the horizontal rows of pixels are generally parallel to the horizon. This allows the control to determine the distance along the vehicle forward direction in response to the row of pixels on which the object has been detected, assuming that the camera is detecting an edge of the detected object or other vehicle (such as the front shadow edges, tires or the like) along the pavement or road surface.

An array of pixels 14a' and a focal length (in pixels) vfl of the virtual camera 14' is shown in FIGS. 11A, 11B and 11E. The virtual focal length vfl at the frame center of the virtual camera 14' may be determined by the following equation (6): vfl=(Pixel Resolution/2)/(tan(Frame Angular Size/2)); (6) where the Frame Angular Size is the angular field of view of the camera 14. This equation may be used to calculate the virtual focal length of an imaginary pinhole camera with an infinitely small pinhole lens in vertical pixels vvfl and the virtual focal length in horizontal pixels hvfl using the pixel resolutions and frame angular sizes in the vertical and horizontal directions, respectively. The virtual focal length is calculated in both vertical pixel units and horizontal pixel units because the vertical and horizontal sizes of the pixels may be different and the camera may have a different pixel resolution and frame angular size or field of view between the vertical and horizontal directions.

The vertical or downward view angle β to the object may be determined by the following equation (7): β = arctan(Vertical Pixels/vvfl); (7) where Vertical Pixels is the number of pixels or rows of pixels down from the target row or horizon. The view angle thus may be calculated for any line of pixels according to equation (7). An array for each of the view angle values may be calculated and stored for rapid distance calculations. The downward angle β may then be used to calculate the longitudinal distance δ in a similar manner as discussed above. As discussed above, the longitudinal distance calculations assume that for a detected edge or object along a row of pixels, the longitudinal distance to the edge or object is the same for any pixel along the row, since the camera is oriented with the rows of pixels being generally parallel to the horizon and generally perpendicular to the direction of travel of the vehicle.

In order to determine the location and angle and distance to a detected object or edge (which may be represented by a point along an object, such as at coordinate x, y of the pixel array (FIG. 11A)), the effective focal length of the virtual camera for the point on the detected object may be calculated. As shown in FIG. 11E, the effective focal length in vertical pixels (vefl) may be calculated by the following equation (8): vefl = (vvfl² + (y − height/2)²)^(1/2); (8) where height/2 is one-half of the vertical image height (in pixels) of the camera. The effective focal length in horizontal pixels (hefl) may then be calculated by converting the effective focal length in vertical pixel units to horizontal pixel units via the following equation (9): hefl = hvfl*vefl/vvfl. (9)

The horizontal angle ω to the detected point in the image may be calculated via the following equation (10): ω = arctan(Horizontal Pixels/hefl); (10) where Horizontal Pixels is the number of columns of pixels (or horizontal distance in pixels) that the point x, y is from the target or alignment or aft point or pixel. The Horizontal Pixels value may be counted or calculated by the control. The calculations for the Horizontal Pixels value may be different for the opposite sides of the vehicle in applications where the zero coordinate of the pixel array may be on the vehicle side of the array for a camera on one side of the vehicle, such as on the passenger side of the vehicle, and may be on the outside of the array for a camera on the other side of the vehicle, such as on the driver side of the vehicle. In the illustrated embodiment of FIG. 11A, the Horizontal Pixels may be calculated by subtracting the x-coordinate for the aft pixel or alignment point 14b' from the x-coordinate of the detected point x, y.

Such calculations may provide a more precise and true value for the lateral angle ω between the camera 14 and the detected object. The lateral distance ψ to the detected object may thus be calculated by the following equation (11): ψ = δ*tan(ω). (11)

Accordingly, the actual distance τ between camera 14 and the closest point on the detected object may be obtained by the following equation (12): τ = (δ² + ψ²)^(1/2). (12)
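A hedged, self-contained sketch combining equations (6) through (12) (parameter names are assumptions; the point is assumed to lie below the horizon row so that the downward angle is positive) might read:

```python
import math

def total_distance_m(x_px: int, y_px: int,
                     aft_x_px: int, horizon_y_px: int,
                     h_res: int, v_res: int,
                     h_fov_deg: float, v_fov_deg: float,
                     camera_height_m: float) -> float:
    """Estimate the total distance from the camera to the image point
    (x_px, y_px) of a detected edge, following equations (6)-(12)."""
    # eq. (6): virtual focal lengths in vertical and horizontal pixel units
    vvfl = (v_res / 2.0) / math.tan(math.radians(v_fov_deg) / 2.0)
    hvfl = (h_res / 2.0) / math.tan(math.radians(h_fov_deg) / 2.0)

    # eq. (7): downward view angle from the horizon row to the detected row
    beta = math.atan((y_px - horizon_y_px) / vvfl)

    # eq. (3): longitudinal distance along the road surface
    delta = camera_height_m * math.tan(math.pi / 2.0 - beta)

    # eqs. (8)-(9): effective focal lengths at the detected point
    vefl = math.sqrt(vvfl ** 2 + (y_px - v_res / 2.0) ** 2)
    hefl = hvfl * vefl / vvfl

    # eq. (10): lateral angle to the detected point
    omega = math.atan((x_px - aft_x_px) / hefl)

    # eqs. (11)-(12): lateral distance and total distance
    psi = delta * math.tan(omega)
    return math.sqrt(delta ** 2 + psi ** 2)
```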

Because the lateral, longitudinal and total distances are calculated using certain known or obtainable characteristics and geometrical relationships, such as the input height of camera 14 above the ground, the pixel resolution of camera 14, the field of view of the camera, and a pixel count in the horizontal and vertical direction with respect to a target point or alignment target and/or the horizon, the calculated distance and/or angle values for each pixel count or location may be entered into a table to provide a rapid response time for determining the distance to the detected edge or object once the pixel count or location of the detected edge is known.

As discussed above, the lane change assist system may only be concerned with the longitudinal distance δ to the detected edge. Control 16 may thus determine a vertical pixel count and approximate the longitudinal distance to the detected object or edge via equations (1), (2) and (3) or via the data table, thereby significantly reducing the processing requirements of control 16 to estimate or calculate the distance to the detected edges or objects.

Additionally, control 16 may be operable to substantially eliminate or substantially ignore other forms or types of detected edges which are not likely or cannot be part of a vehicle in the adjacent lane. For example, as can be seen in FIG. 5, the tires and wheels 18d of an adjacent or approaching vehicle 18 are viewed as ellipses from a forward and sideward angle with respect to the adjacent vehicle. Because all vehicles on the road have tires, control 16 of lane change assist system 10 may be operable to process the signals from the pixels (such as the pixels directed toward the zone of interest) to detect the presence of one or more ellipses at or near the detected edges. If an ellipse or wheel is not detected, then the detected edges and associated object may be eliminated from processing by control 16, since it cannot be a vehicle in the adjacent lane. Detecting the presence of ellipses and wheels or portions thereof can thus assist in providing information regarding the existence of a vehicle and may assist in determining the position and/or velocity or relative position and/or relative velocity of the detected vehicle with respect to vehicle 12.

In order to further reduce the possibility of control 16 generating a false positive signal, control 16 of lane change assist system 10 may be operable to determine an intensity or brightness level associated with the detected edges and to substantially eliminate edges which do not significantly change in brightness level or intensity level from one side of the detected edge to the other. This is preferred, since lines in the road, thin branches on the road and/or many other small articles or objects may not resolve, and thus may result in single edges that do not significantly change in brightness or intensity (or color if a color system is used) across their detected edges. However, a significant change in brightness or intensity would be expected along a detected edge of an automotive body panel or bumper or other component or structure of a vehicle or the like. Accordingly, control 16 may substantially eliminate or substantially ignore edges or objects which do not have a significant brightness or intensity change thereacross, since an edge with an insignificant change in brightness or color signifies an insignificant edge which can be substantially eliminated. By substantially eliminating such insignificant edges, control 16 may further significantly reduce the computational requirements or processing requirements, while also significantly reducing the possibility of a false positive indication.
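One illustrative way to express such an intensity-change test (the offset and minimum step values are assumptions, not disclosed values) is:

```python
import numpy as np

def brightness_changes_across_edge(frame: np.ndarray, edge_rows, edge_cols,
                                   offset_px: int = 3,
                                   min_step: float = 25.0) -> bool:
    """Keep an edge only if the mean intensity a few pixels above it differs
    significantly from the mean intensity a few pixels below it.  Thin road
    lines, small branches and similar artifacts show little change across
    their edge and are discarded.  offset_px and min_step are illustrative.
    """
    img = frame.astype(float)
    rows = np.asarray(edge_rows)
    cols = np.asarray(edge_cols)
    if rows.size == 0:
        return False
    above = np.clip(rows - offset_px, 0, img.shape[0] - 1)
    below = np.clip(rows + offset_px, 0, img.shape[0] - 1)
    step = abs(img[above, cols].mean() - img[below, cols].mean())
    return step >= min_step
```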

Control 16 may also be operable to compare image data from consecutive frames or images captured by camera 14 to confirm that a detected edge or object is representative of a vehicle in an adjacent lane and/or to determine the relative speed between the detected object or vehicle and the equipped or subject vehicle 12. By extracting collections of edges or points of interest, such as ellipses, bend maximums in edges and/or the like, from consecutive frames, and correlating such points of interest from one frame to the next, the lane change assist system of the present invention can more effectively verify the pairing of such characteristics or objects. The control may track or correlate the points of interest based on the placement or location of the edges within the captured images, the general direction of travel of the detected edges or groups of edges between consecutive frames, the dimensions, size and/or aspect ratio of the detected edges or objects and/or the like. Confirming such characteristics of edges and groups of edges and objects allows the lane change assist system to track the objects from one captured frame or image to the next. If the relative speed or movement of the detected edge or object is not indicative of the relative speed or movement of a vehicle in the adjacent lane, control 16 may filter out or substantially ignore such detected edges in further processing so as to reduce subsequent processing requirements and to avoid generation of a false positive signal. Lane change assist system 10 may also be operable to connect collections of such objects or edges based on relative motion between the subject vehicle and the detected object or edges. Such connected collections may provide information about the size and shape of the detected object for object classification and identification by control 16.
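A minimal sketch of such frame-to-frame correlation, using a greedy nearest-neighbour pairing gated by displacement and size (the data structure, limits and function name are illustrative assumptions, not the patent's method), could look like this:

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class PointOfInterest:
    x: float       # image column of the feature (e.g., an ellipse centre)
    y: float       # image row
    width: float
    height: float


def match_points(prev: List[PointOfInterest],
                 curr: List[PointOfInterest],
                 max_shift: float = 30.0,
                 max_size_ratio: float = 1.5) -> List[Tuple[PointOfInterest, PointOfInterest]]:
    """Greedy nearest-neighbour pairing of features between consecutive frames.

    A previous feature is paired with the closest current feature, but only
    when the displacement and the change in size stay within loose limits;
    features that cannot be paired are dropped from further consideration.
    """
    pairs = []
    used = set()
    for p in prev:
        best, best_d = None, max_shift
        for i, c in enumerate(curr):
            if i in used:
                continue
            d = ((p.x - c.x) ** 2 + (p.y - c.y) ** 2) ** 0.5
            size_ratio = (c.width * c.height) / max(p.width * p.height, 1e-6)
            if d < best_d and (1 / max_size_ratio) < size_ratio < max_size_ratio:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            pairs.append((p, curr[best]))
    return pairs
```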

It is further envisioned that lane change assist system 10 may be operable in conjunction with a lane departure warning system or other forward facing imaging system 22 of vehicle 12, such as a lane departure warning system of the type discussed below or as disclosed in U.S. provisional application Ser. No. 60/377,524, filed May 3, 2002, which is hereby incorporated herein by reference, or any other lane departure warning system or the like, or a headlamp control system, such as disclosed in U.S. Pat. No. 5,796,094, which is hereby incorporated herein by reference, or any forwardly directed vehicle vision system, such as a vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 6,201,642 and 6,396,397, and/or in U.S. patent application Ser. No. 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610, which are hereby incorporated herein by reference. The forward facing imaging system may provide an input to lane change assist system 10 to further reduce any likelihood of a false positive signal from the lane change assist system.

For example, the forward facing imaging system may detect lane markers at the road surface to detect a curvature in the road that the subject vehicle is approaching and/or traveling along. Such information may be communicated to lane change assist system 10, so that control 16 may adapt or shape the reduced image data set or zone of interest as the subject vehicle 12 enters and proceeds along the detected road curvature, as discussed above. Also, the forward facing imaging system may detect headlamps of oncoming or approaching vehicles. If the lane forward and to the left of vehicle 12 has oncoming traffic, control 16 may substantially ignore the left side of the vehicle, since the lane change assist system would not be concerned with a lane change into oncoming traffic. Also, the forward facing imaging system may detect the tail lights or rear portion of a leading vehicle in another lane, and may track the leading vehicle relative to the subject vehicle. As the subject vehicle overtakes the leading vehicle, the lane change assist system may then be alerted as to the presence of the overtaken vehicle, such that edges detected in that lane a short period of time after the overtaken vehicle leaves the range of the forward facing imaging system (the period of time may be calculated based on the relative velocity between the subject vehicle and the overtaken vehicle) may be readily identified as the now overtaken and passed vehicle. By utilizing the vehicle information of a vehicle detected by a forward facing imaging system, the lane change assist system of the present invention (or other side object detection systems or the like) may reduce the amount of processing of the captured images or detected edges, since such a vehicle may be readily identified as the vehicle that was previously detected by the forward facing imaging system. This avoids a duplication of efforts by the forward facing imaging system and lane change assist system of the vehicle.
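For the hand-off of an overtaken vehicle from the forward facing imaging system to the lane change assist system, the period of time mentioned above can be expressed as a simple ratio of distance to relative speed. The following sketch is purely illustrative; the gap definition, margin and function name are assumptions:

```python
def handoff_window_s(longitudinal_gap_m: float, relative_speed_mps: float,
                     margin: float = 1.5) -> float:
    """Time after the overtaken vehicle leaves the forward camera's range
    during which side-zone edges may be attributed to that known vehicle.

    The base window is the longitudinal gap between the forward camera's
    rear range limit and the side camera's zone of interest, divided by the
    relative (overtaking) speed, padded by a small margin for speed changes.
    """
    if relative_speed_mps <= 0:
        return 0.0
    return margin * (longitudinal_gap_m / relative_speed_mps)
```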

By primarily processing image data and detected edges in certain areas and/or processing image data and detected edges that meet certain thresholds or criteria, and substantially rejecting or substantially ignoring other information or image data or edges, such as detected edges that are substantially non-horizontal, or other edges that cannot be part of a vehicle, or image data that are not representative of a zone of interest of the exterior scene, the lane change assist system of the present invention substantially reduces the image data to be processed by control 16. It is envisioned that such a reduction in the amount of image data to be processed may allow the lane change assist system to have a control which comprises a micro-processor positioned at the camera. Accordingly, the lane change assist system may be provided as a module which may be positioned at either or both sides of the vehicle, and which may be connected to an appropriate power source or control or accessory of the vehicle.

Therefore, the present invention provides a lane change assist system which is operable to detect and identify vehicles or other objects of interest sidewardly and/or rearwardly of the subject vehicle. The lane change assist system of the present invention is operable to detect edges of objects, and particularly horizontal edges of objects, to provide improved recognition or identification of the detected objects and reduced false positive signals from the lane change assist system. The lane change assist system may primarily process information or image data from a reduced set or subset of image data which is representative of a target zone or area of interest within the exterior scene and may adjust the reduced data set or target zone in response to driving or road conditions or the like. The edge detection process or algorithm of the lane change assist system of the present invention provides for a low cost processing system or algorithm, which does not require the statistical methodologies and computationally expensive flow algorithms of the prior art systems. Also, the edge detection process may detect edges and objects even when there is little or no relative movement between the subject vehicle and the detected edge or object.

The lane change assist system of the present invention thus may be operable to substantially ignore or substantially eliminate or reduce the effect of edges or characteristics which are indicative of insignificant objects, thereby reducing the level of processing required on the captured images and reducing the possibility of false positive detections. The lane change assist system may also provide a low cost and fast approximation of a longitudinal and/or lateral and/or total distance between the subject vehicle and a detected edge or object at a side of the vehicle and may adjust a threshold detection level in response to the approximated distance. The lane change assist system of the present invention may be operable to substantially ignore certain detected edges or provide a positive identification signal depending on the characteristics of the detected object or edge or edges, the driving or road conditions, and/or the distance from the subject vehicle. The present invention thus may provide a faster processing of the captured images, which may be performed by a processor having lower processing capabilities than the processors required for the prior art systems.

Although the present invention is described above as a lane change assist or aid system or side object detection system, it is envisioned that many aspects of the imaging system of the present invention are suitable for use in other vehicle vision or imaging systems, such as other side object detection systems, forward facing vision systems, such as lane departure warning systems, forward park aids, passive steering systems, adaptive cruise control systems or the like, rearward facing vision systems, such as back up aids or park aids or the like, panoramic vision systems and/or the like.

For example, an object detection system or imaging system of the present invention may comprise a forward facing lane departure warning system 110 (FIG. 1), which may include an image sensor or camera 114 and a control 116 positioned on or at vehicle 12. Lane departure warning system 110 is generally shown at the front of the vehicle 12 with camera 114 positioned and oriented to capture an image of the region generally forwardly of the vehicle. However, the camera may optionally be positioned elsewhere at the vehicle, such as within the vehicle cabin, such as at an interior rearview mirror assembly of the vehicle or at an accessory module or the like, and directed forwardly through the windshield of the vehicle, without affecting the scope of the present invention. Camera 114 is operable to capture an image of a scene occurring forwardly of the vehicle and control 116 is operable to process image data of the captured images or frames to detect and monitor or track lane markers or road edges or the like or oncoming or approaching vehicles or objects, and to provide a warning or alert signal to a driver of the vehicle in response to the detected images, such as in the manner disclosed in U.S. provisional application Ser. No. 60/377,524, filed May 3, 2002, which is hereby incorporated herein by reference.

Similar to camera 14 of lane change assist system 10, discussed above, camera 114 may be positioned at vehicle 12 and oriented generally downwardly toward the ground to increase the horizontal pixel count across the captured image at the important areas in front of vehicle 12, since any significant lane marking or road edge or the like, or other vehicle approaching or being approached by the subject vehicle, positioned in front of or toward a side of the subject vehicle will be substantially below the horizon and thus substantially within the captured image. The lane departure warning system of the present invention thus may provide an increased portion of the captured image or increased pixel count at important areas of the exterior scene, since the area well above the road or horizon is not as significant to the detection of lane markers and the like and/or other vehicles. Additionally, positioning the camera to be angled generally downwardly also reduces the adverse effects that the sun and/or headlamps of other vehicles may have on the captured images.

Control 116 of lane departure warning system 110 may include an edge detection algorithm or function, such as described above, which is operable to process or may be applied to the individual pixels to determine whether the image captured by the pixels defines an edge or edges of a lane marker or the like. The edge detection function or algorithm of control 116 allows lane departure warning system 110 to interrogate complex patterns in the captured image and separate out particular patterns or edges which may be indicative of a lane marker or the like, and to substantially ignore other edges or patterns which are not or cannot be indicative of a lane marker or the like and thus are insignificant to lane departure warning system 110. Other information in the captured image or frame, which is not associated with significant edges, may then be substantially ignored or filtered out by control 116 via various filtering mechanisms or processing limitations to reduce the information being processed by control 116.

Control 116 may be operable to determine which detected edges are angled or diagonal across and along the captured image and to partially filter out or substantially ignore or limit processing of vertical and/or horizontal edges. This may be preferred, since edges indicative of a lane marker may be angled within the captured image, as can be seen with reference to FIGS. 7 and 8. The control may thus process edges which are angled and which move diagonally through the scene from one frame to the next. Control 116 may be operable to skew or bias the rows of pixels in the pixelated array to simulate horizontal edges with the angled edges, such that control 116 may detect and track such edges while substantially ignoring other edges. Control 116 may thus reject or substantially ignore edges which are not indicative of lane markers or the like (and which are not indicative of another vehicle forward of and approaching the subject vehicle), thereby reducing the data to be processed.
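One way to picture the row skewing described above is as a horizontal shear of the image so that an edge with the expected lane-marker slope becomes approximately horizontal. The following sketch is an illustrative assumption of how such a shear could be applied to an image array; the wrap-around behaviour at the borders is ignored for simplicity:

```python
import numpy as np


def skew_rows(image: np.ndarray, shift_per_row: float) -> np.ndarray:
    """Shift each pixel row horizontally by an amount proportional to its row
    index, so that an edge with the expected lane-marker slope becomes
    (approximately) horizontal and can be handled by the same horizontal edge
    logic used elsewhere.  Pixels shifted past the border simply wrap around
    in this sketch.
    """
    skewed = np.zeros_like(image)
    for r in range(image.shape[0]):
        shift = int(round(r * shift_per_row))
        skewed[r] = np.roll(image[r], -shift)
    return skewed
```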

In order to further reduce the processing requirements and the possibility of a false detection or indication of a lane marker, and to enhance the response time and system performance, control 116 may primarily process signals or image data from pixels that are oriented or targeted or arranged or selected to capture images of objects or markers that are at least partially positioned within a predetermined or targeted area or zone of interest of the exterior scene. The zone of interest may be defined by an area or region forwardly and toward one or both sides of the subject vehicle where a lane marker or road side or edge may be positioned, which would be significant or important to lane departure warning system 110. By substantially isolating the reduced data set representative of the zone of interest, or substantially filtering out or substantially ignoring edges or signals or image data which are representative of areas outside of the zone or area of interest, the present invention may reduce the image data or information to be processed by control 116 and may substantially reduce the possibility that a false detection of a lane marker or the like will occur. Lane departure warning system 110 may also process edges or image data within a further reduced image data set representative of a targeted portion or hot zone of the zone of interest to further identify and confirm that the detected edge or edges are indicative of a lane marker or the like or a vehicle or object that is significant to the lane departure warning system, such as discussed above with respect to lane change assist system 10.

By detecting edges of objects (such as lane markers, road edges, vehicles and the like) within the zone or area of interest (and optionally adjusting the zone of interest for particular driving conditions or situations), and by focusing on or concentrating on or primarily processing the detected edges or image data which may be indicative of a lane marker or vehicle or significant object, while substantially filtering out or substantially ignoring other edges or information or image data, the present invention substantially reduces the possibility of falsely detecting lane markers or other significant vehicles or objects. Control 116 may be further operable to determine a distance between a detected object and the subject vehicle to further filter out or substantially eliminate objects that are not within a predetermined range or threshold distance from the subject vehicle and which, thus, may be insignificant to the lane departure warning system of the present invention, such as described above with respect to lane change assist system 10.

Control 116 may also be operable to determine or estimate the distance to the detected edge or object in response to the location of the pixel or pixels on the pixelated array which capture the detected edge or object, such as in the manner also discussed above. The distance may thus be determined by determining the pixel location and accessing a table or data list or array to determine the distance associated with the particular pixel.

Control 116 of lane departure warning system 110 may also be operable to determine an intensity or brightness level associated with the detected edges and to substantially eliminate edges which do not significantly change in brightness level or intensity level from one side of the detected edge to the other. This is preferred, since thin branches on the road and/or many other small articles or objects may not resolve, and thus may result in single edges that do not significantly change in brightness or intensity (or color if a color system is used) across their detected edges. However, a sharp or significant change in brightness or intensity would be expected at a detected edge of a lane marker (since a lane marker is typically a white or yellow line segment along a dark or black or gray road surface) or an automotive body panel or bumper or other component or structure of a vehicle or the like. Accordingly, control 116 may substantially eliminate or substantially ignore edges or objects which do not have a significant brightness or intensity change thereacross. By substantially eliminating such insignificant edges, control 116 may further significantly reduce the computational requirements or processing requirements, while also significantly reducing the possibility of a false detection of a lane marker or vehicle. It is further envisioned that lane departure warning system 110 may be capable of detecting lane markings and road edges and other vehicles and modifying the alert signal or process in response to the type of marking, surrounding vehicles or the like and/or the vehicle movement, such as disclosed in U.S. provisional application Ser. No. 60/377,524, filed May 3, 2002, which is hereby incorporated herein by reference.

With reference to FIGS. 12 and 13, lane departure warning system 110 may provide a warning signal to a driver of vehicle 12 when the vehicle is about to depart from its lane or road 113. The lane departure warning system 110 is operable in response to imaging sensor or camera 114 positioned at a forward portion of the vehicle 12 (and may be positioned at a vehicle bumper area or at a windshield area, such as at an interior rearview mirror or attachment thereto, without affecting the scope of the present invention) and having a field of view directed generally forwardly with respect to the direction of travel of vehicle 12. The imaging sensor 114 is operable to capture an image of a scene generally forwardly (and preferably at least partially sidewardly) of the vehicle. The lane departure warning system includes image processing controls or devices which may process the images captured to detect and identify various objects within the image.

The imaging sensor useful with the present invention is preferably an imaging array sensor, such as a CMOS sensor or a CCD sensor or the like, such as disclosed in commonly assigned U.S. Pat. Nos. 5,550,677; 5,670,935; 5,796,094 and 6,097,023, and U.S. patent application Ser. No. 09/441,341, filed Nov. 16, 1999, now U.S. Pat. No. 7,339,149, which are hereby incorporated herein by reference. The imaging sensor may be implemented and operated in connection with other vehicular systems as well, or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. No. 5,796,094, which is hereby incorporated herein by reference, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454 and/or 6,320,176, which are hereby incorporated herein by reference, a vehicle vision system, such as a forwardly directed vehicle vision system utilizing the principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935 and 6,201,642, and/or in U.S. patent application Ser. No. 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610, which are hereby incorporated herein by reference, a traffic sign recognition system, a system for determining a distance to a leading vehicle or object, such as using the principles disclosed in U.S. patent application Ser. No. 09/372,915, filed Aug. 12, 1999, now U.S. Pat. No. 6,396,397, which is hereby incorporated herein by reference, and/or the like.

The lane departure warning system of the present invention is operable to provide a warning signal to a driver of the vehicle under at least one of at least the following three conditions:

1) the vehicle is moving toward the edge of the road at a rapid speed indicating that the vehicle will actually depart from the pavement or shoulder;

2) the vehicle is moving into a lane with oncoming traffic present in that lane; and/or

3) the vehicle is moving into a lane with traffic flowing in the same direction and there is an adjacent vehicle in that lane (regardless of turn signal use).

The lane departure warning system may be operable in response to one or more of the detected conditions and may be further operable in response to various vehicle characteristics or parameters, such as vehicle speed, a distance to the lane marker, shoulder, other vehicle, or any other relevant distance, road conditions, driving conditions, and/or the like.

With respect to the first condition (shown in FIG. 12), the lane departure warning system may be operable in response to a single forward facing imaging sensor or camera, to establish and track the road edge as defined by two thresholds:

1) threshold 1: the edge 113a of the road or pavement 113 (the intersection of the pavement 113 and the shoulder 113b); and/or

2) threshold 2: the edge 113c of the shoulder 113b (the intersection of the shoulder 113b and the grass 113d).

The lane departure warning system of the present invention may then be operable to provide an audible warning, such as a rumble strip sound, when the vehicle is approaching threshold 1 and the vehicle is moving above an established speed. The lane departure warning system may then be operable to provide a more urgent audible warning, such as an alarm, when the vehicle is approaching threshold 2 and is moving above the established speed. If the road does not have a shoulder, such as on some rural roads, there is only one threshold and this may correspond to a threshold 2 warning. The lane departure warning system may be operable to provide the warning signal or signals in response to the vehicle being a particular distance from the detected lane or road or shoulder. The distances to the threshold markings at which the lane departure warning system initiates the warning signal or signals may vary depending on the speed of the vehicle, or other conditions surrounding the vehicle, such as road conditions, driving conditions, or the like.
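As one illustrative reading of this two-threshold escalation for the first condition, a minimal sketch of the decision logic (fixed warning distances and the minimum speed are placeholder values; a fuller implementation would scale them with vehicle speed and conditions, as the text describes) could look like this:

```python
from typing import Optional


def road_edge_warning(dist_to_pavement_edge_m: float,
                      dist_to_shoulder_edge_m: Optional[float],
                      speed_mps: float,
                      min_speed_mps: float = 15.0) -> Optional[str]:
    """Select a warning level for condition 1 (drifting toward the road edge).

    Threshold 1 is the pavement/shoulder boundary and triggers a rumble-strip
    style sound; threshold 2 is the shoulder/grass boundary and triggers an
    alarm.  When the road has no shoulder, the single threshold is treated as
    a threshold 2 warning.  No warning is issued below the established speed.
    """
    if speed_mps < min_speed_mps:
        return None
    if dist_to_shoulder_edge_m is not None and dist_to_shoulder_edge_m < 0.3:
        return "alarm"            # threshold 2: about to leave the shoulder
    if dist_to_shoulder_edge_m is None and dist_to_pavement_edge_m < 0.3:
        return "alarm"            # no shoulder: pavement edge acts as threshold 2
    if dist_to_pavement_edge_m < 0.5:
        return "rumble_strip"     # threshold 1: approaching the pavement edge
    return None
```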

With respect to the second condition, the lane departure warning system may be operable in response to a single forward facing camera to monitor the lane markings 113e along the road surface and monitor the potential presence of oncoming traffic in an adjacent lane or lanes. Once the presence of oncoming traffic has been established, the lane departure warning system may issue an urgent audible warning if the vehicle begins to cross the lane marking 113e. Furthermore, if the vehicle has already begun to cross into the oncoming traffic lane before oncoming traffic is detected, the lane departure warning system may issue the urgent warning when oncoming traffic is detected.

Similar to the first condition, the lane departure warning system may be operable in response to the second condition to initiate the warning signal in response to different distances between the subject vehicle and the approaching vehicle, depending on the speed of one or both vehicles, the driving conditions, the road conditions and/or the like.

With respect to the third condition (shown in FIG. 13), the lane departure warning system of the present invention may be operable in response to a single forward facing camera and at least one, and optionally two, rearward and/or sideward facing cameras, to monitor the lane markings and the potential presence of adjacent traffic or vehicle or vehicles 112 in an adjacent lane 113f, which may be traveling in the same direction as the subject vehicle 12. Once the presence of adjacent traffic has been established, the lane departure warning system may issue an urgent audible warning to the driver of the vehicle if the subject vehicle begins to cross the lane marking 113e. Furthermore, if the subject vehicle has already begun to cross into the adjacent lane and then subsequently an adjacent vehicle is detected, the lane departure warning system may issue the urgent warning signal to the driver of the vehicle.

Again, the lane departure warning system may be operable to initiate the warning signal or signals in response to varying threshold parameters, which may vary depending on the speed of the subject vehicle, the speed of the other detected vehicle, the relative speed of the vehicles, the driving conditions, the road conditions and/or the like. The lane departure warning system of the present invention may be operable to differentiate between the different types of lane markings along roads, such as between solid and dashed lines and double lines.

Optionally, the lane departure warning system may be further operable to detect and recognize stop lights and/or stop signs and/or other road or street signs or markings, and to provide a warning signal to the driver of the vehicle in response to such detection. It is further envisioned that the lane departure warning system of the present invention may be operable to provide an alarm or broadcast an alarm or warning signal on a safety warning band when the forward facing camera detects a stop light or stop sign and the system determines that the vehicle is not going to stop based on the vehicle's current speed and deceleration. This provides a signal or alarm to crossing drivers to warn them of an unsafe condition.

Optionally, the lane departure warning system of the present invention may be operable to determine the road conditions of the road on which the vehicle is traveling and/or the driving conditions surrounding the vehicle. The system may then provide the warning signal or signals in response to variable threshold values, such as different vehicle speeds or distances or the like. For example, wet or snowy roads would change the distance and/or speed thresholds at which the lane departure warning system would provide the warning signal or signals. Also, because dark or rainy conditions may affect visibility of lane markers, road edges and other vehicles, the lane departure warning system of the present invention may be operable to provide a warning signal sooner or at a greater distance from the marker, edge or vehicle in such low visibility conditions. This provides the driver of the subject vehicle with a greater amount of time to respond to the warning in such conditions.
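A minimal sketch of such condition-dependent threshold scaling is shown below; the scale factors and inputs are illustrative assumptions only:

```python
def warning_distance_m(base_distance_m: float,
                       road_wet_or_snowy: bool,
                       low_visibility: bool) -> float:
    """Scale the distance at which the warning is issued for adverse
    conditions, so the driver gets more reaction time.
    """
    distance = base_distance_m
    if road_wet_or_snowy:
        distance *= 1.5   # slippery surface: warn earlier
    if low_visibility:
        distance *= 1.3   # darkness or rain: warn earlier still
    return distance
```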

The lane departure warning system of the present invention may be integrated with a side object detection system (SOD). For example, the vehicle may be equipped with a camera or image-based side object detection system or a Doppler radar-based side object detection system or other such systems (such as mounted on the side rearview mirrors or at the side of the vehicle) for detecting objects and/or vehicles at one or both sides of the subject vehicle. The lane departure warning threshold level or sensitivity at which the lane departure warning system generates a warning signal may then be adjustable in response to detection of a vehicle or object at a side of the subject vehicle and determination of the location and speed of the detected vehicle. Optionally, the signal generated may increase or decrease in intensity or volume in response to the position or speed of an object or vehicle detected by the side object detection system. For example, the threshold level may take into account the approach speed of the other vehicle to the subject vehicle, and may provide a louder or brighter warning to the driver of the subject vehicle if the approach speed is above a particular threshold level or threshold levels.
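As one illustrative mapping of side-object-detection output to warning intensity (the threshold value, labels and function name are assumptions), the adjustment could be sketched as:

```python
def warning_intensity(side_object_detected: bool,
                      approach_speed_mps: float,
                      fast_approach_mps: float = 5.0) -> str:
    """Adjust the lane departure warning in response to side object detection.

    No detected side object leaves the warning at its normal level; a
    detected object raises it; an object closing faster than a threshold
    approach speed raises it again (e.g., a louder or brighter warning).
    """
    if not side_object_detected:
        return "normal"
    if approach_speed_mps > fast_approach_mps:
        return "urgent"
    return "elevated"
```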

The lane departure warning system may be provided with a multi-feature or multi-function forward facing imaging system. The imaging system may combine two or more functions, such as an intelligent headlamp controller (such as the type disclosed in U.S. Pat. Nos. 5,796,094 and 6,097,023, and U.S. patent application Ser. No. 09/441,341, filed Nov. 16, 1999, now U.S. Pat. No. 7,339,149, which are hereby incorporated herein by reference), an image-based smart wiper controller, a rain sensor (such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454 and/or 6,320,176, which are hereby incorporated herein by reference), an image-based climate control blower controller, an image-based or image-derived or partially derived adaptive cruise-control system (where the imaging may be primary or secondary to a forward facing Doppler radar), and/or other vision systems (such as a forwardly directed vehicle vision system utilizing the principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935 and 6,201,642, and/or in U.S. patent application Ser. No. 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610, which are all hereby incorporated herein by reference), a traffic sign recognition system, a system for determining a distance to a leading vehicle or object (such as using the principles disclosed in U.S. patent application Ser. No. 09/372,915, filed Aug. 12, 1999, now U.S. Pat. No. 6,396,397, which is hereby incorporated herein by reference), and/or the like.

For example, an embodiment of the lane departure warning system of the present invention may be incorporated with or integrated with an intelligent headlamp control system (such as described in U.S. Pat. Nos. 5,796,094 and 6,097,023, and U.S. patent application Ser. No. 09/441,341, filed Nov. 16, 1999, now U.S. Pat. No. 7,339,149, which are hereby incorporated herein by reference) having an imaging array sensor feeding a signal or image to a microcontroller (which may comprise a microprocessor or microcomputer), which is operable to adjust a state of the headlamps in response to a captured image of the scene forwardly of the vehicle. The image captured by the imaging sensor may be analyzed for light sources of interest for the headlamp control, and also for lane markings, road edges, and other objects of interest (such as road signs, stop signs, stop lights and/or the like) for the lane departure warning system. Optionally, the lane departure warning system may be integrated with or tied to an existing headlamp control of the vehicle.

The lane departure warning system of the present invention thus may be implemented as part of one or more other imaging-based systems, and thus may share components, hardware and/or software with the other systems to reduce the incremental costs associated with the lane departure warning system and with the other systems as well. Accordingly, multiple systems may be provided by an automotive supplier as part of a common platform or module for each vehicle of a particular vehicle line or model. The vehicle manufacturer may then choose to activate or enable one or more of the systems of the module, depending on which options are selected on a particular vehicle. Therefore, the addition or selection of the lane departure warning system, or of one or more other imaging-based systems, is associated with an incremental addition of hardware and/or software, and thus of associated costs, in order to install and enable the system on a particular vehicle. The imaging array sensor or sensors of the module may then be interrogated by an appropriate processor or software to extract the light sources or objects of interest or pixels of interest for each respective system of the common or unitary module. For example, an image captured by the imaging array sensor or camera may be processed or analyzed one way for a headlamp control system, and then processed or analyzed another way for the lane departure warning system or for any other enabled functions or systems of the common module. The software may further include common blocks or functions or macros to further enhance the sharing of software between the systems.
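A hedged sketch of such a common module, in which one captured frame is handed to every enabled function and each function extracts its own objects or pixels of interest, might look like the following; the class and method names are illustrative, not part of the disclosure:

```python
from typing import Callable, Dict, List


class CommonImagingModule:
    """One imaging array shared by several selectable functions.

    Each function registers a frame-processing callback; every captured frame
    is handed to all enabled functions, so the same image data is analysed
    one way for, e.g., headlamp control and another way for lane departure
    warning.
    """

    def __init__(self) -> None:
        self._functions: Dict[str, Callable] = {}
        self._enabled: List[str] = []

    def register(self, name: str, process_frame: Callable) -> None:
        self._functions[name] = process_frame

    def enable(self, name: str) -> None:
        # Only functions selected for this vehicle are activated.
        if name in self._functions and name not in self._enabled:
            self._enabled.append(name)

    def on_frame(self, frame) -> dict:
        # Same captured image, analysed once per enabled system.
        return {name: self._functions[name](frame) for name in self._enabled}
```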

Accordingly, a unitary module may be provided to a vehicle assembly plant and may include multiple features, systems or functions, such that the desired features, systems or functions may be enabled for a particular vehicle, with minimal additional software or components or hardware being associated with the features, systems or functions that are enabled. The anchor system of the common or unitary module or platform may be an intelligent headlamp controller, with the additional systems, such as the lane departure warning system of the present invention, being added to or integrated with the anchor system.

The lane departure warning system and any other associated imaging-based systems may be included as part of an interior rearview mirror assembly or as part of an electronic windshield module and/or accessory module assembly, such as disclosed in commonly assigned U.S. Pat. Nos. 6,243,003; 6,278,377 and 6,433,676; U.S. application Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381; and Ser. No. 09/792,002, filed Feb. 26, 2001, now U.S. Pat. No. 6,690,268; Ser. No. 09/585,379, filed Jun. 1, 2000; Ser. No. 09/466,010, filed Dec. 17, 1999, now U.S. Pat. No. 6,420,975; and Ser. No. 10/355,454, filed Jan. 31, 2003, now U.S. Pat. No. 6,824,281, which are all hereby incorporated herein by reference.

Therefore, the lane departure warning system of the present invention provides a warning signal or signals to a driver of a vehicle based on the detection of various objects, vehicles and conditions surrounding the vehicle. The lane departure warning system of the present invention is thus less likely to provide a warning signal to a driver of the vehicle when the driver intends to maneuver the vehicle in that manner, and thus where such a warning signal is not needed or wanted. The lane departure warning system of the present invention thus avoids annoying, unnecessary warnings, and thus provides improved responses by the driver of the vehicle, since the driver is less likely to ignore the signal provided by the lane departure warning system. The lane departure warning system of the present invention may be implemented with or integrated with one or more other imaging-based systems to reduce the incremental components, hardware, software and costs associated with the implementation of the lane departure warning system.

Optionally, the object detection system or imaging system of the present invention may be operable in conjunction with a passive steering system 210 (FIG. 1), which is operable to adjust or bias the steering direction of the vehicle toward a center region of a lane in response to detection of the lane markers or road edges or the like by the imaging system. Passive steering system 210 may be in communication with or connected to a steering system of the vehicle and may adjust or bias the steering direction of the vehicle slightly if the lane departure warning system detects a slow drifting of the vehicle out of its lane and may be further operable in response to a detected road curvature ahead of the vehicle. The passive steering system 210 may steer the vehicle back into its lane or keep the vehicle in its lane when such a drifting condition is detected. The passive steering system may function to bias the steering of the vehicle toward the center of the occupied lane, but may be easily overcome by manual steering of the vehicle by the driver, such that the driver at all times maintains ultimate control over the steering of the vehicle. The passive steering system thus may function as a lane detent which maintains the vehicle in its lane, but may be easily overcome or disabled if the steering wheel is manually turned or if a turn signal is activated or the like.
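One way to read the lane-detent behaviour described above is as a small, capped corrective torque that is dropped as soon as the driver applies meaningful steering input. The following sketch is illustrative only; the gains, caps and override threshold are assumptions:

```python
def steering_bias_torque(lateral_offset_m: float,
                         driver_torque_nm: float,
                         gain_nm_per_m: float = 0.8,
                         max_bias_nm: float = 1.0,
                         override_torque_nm: float = 2.0) -> float:
    """Small corrective torque biasing the vehicle toward the lane centre.

    The bias is proportional to the drift from the lane centre and capped at
    a low value; it is dropped entirely as soon as the driver applies more
    than a modest torque (or, in a fuller system, activates a turn signal),
    so the driver always retains ultimate control of the steering.
    """
    if abs(driver_torque_nm) > override_torque_nm:
        return 0.0
    bias = -gain_nm_per_m * lateral_offset_m
    return max(-max_bias_nm, min(max_bias_nm, bias))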

The passive steering assist system of the present invention thus may reduce driver fatigue from driving a vehicle under conditions which require constant driver steering input or adjustment, such as in windy conditions and the like. The passive steering assist system thus may reduce lane drift from side to side within a lane. Also, overall safety may be improved by the reduction in undesired lane maneuvers. Although described as being responsive to the imaging system of the present invention, the passive steering system of the present invention may be responsive to other types of lane departure warning systems or other types of vision or imaging systems, without affecting the scope of the present invention.

Optionally, the object detection system or imaging system of the present invention may be operable in connection with an adaptive speed control system 310 (FIG. 1), which may adjust the cruise control setting or speed of the vehicle in response to road or traffic conditions detected by the imaging system. For example, adaptive speed control system 310 may reduce the set speed of the vehicle in response to the imaging system (or other forward facing vision system) detecting a curve in the road ahead of the vehicle. The vehicle speed may be reduced to an appropriate speed for traveling around the curve without the driver having to manually deactivate the cruise control. The adaptive speed control may then resume the initial speed setting after the vehicle is through the turn or curve and is again traveling along a generally straight section of road. Adaptive speed control 310 may also reduce the speed of the vehicle or even deactivate the cruise control setting in response to a detection by the lane departure warning system or other vision system of taillights or headlamps of another vehicle detected in front of the subject vehicle and within a threshold distance of the subject vehicle or approaching the subject vehicle at a speed greater than a threshold approach speed, or in response to detection of other objects or conditions which may indicate that the speed of the vehicle should be reduced.
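A common way to pick an appropriate curve speed is from a lateral acceleration limit, v = sqrt(a·R) for a curve of radius R; the sketch below uses that standard relationship as an illustrative assumption, since the patent does not specify how the reduced speed is chosen:

```python
import math


def curve_set_speed_mps(set_speed_mps: float,
                        curve_radius_m: float,
                        max_lateral_accel: float = 2.0) -> float:
    """Reduce the cruise set speed for an upcoming curve.

    The comfortable speed for a curve of radius R with a lateral acceleration
    limit a is v = sqrt(a * R); the cruise control holds the lower of that
    speed and the driver's set speed, and can resume the set speed once the
    road straightens out again.
    """
    comfortable = math.sqrt(max_lateral_accel * curve_radius_m)
    return min(set_speed_mps, comfortable)
```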

Additionally, because the imaging system, such as a forward facing lane departure warning system, may track the lane curvature, the system may also be able to determine if a vehicle which appears in front of the subject vehicle is actually in the same lane as the subject vehicle or if it is in an adjacent lane which is curving with the section of road. The imaging system and adaptive speed control system may then establish if the vehicle speed should be reduced in response to the road curvature and the presence of another vehicle at the curve. Although described as being responsive to the imaging system or lane departure warning system of the present invention, the adaptive speed control system of the present invention may be responsive to other types of lane departure warning systems or other types of vision or imaging systems, particularly other types of forward facing imaging systems, without affecting the scope of the present invention.

It is further envisioned that the imaging system, which may comprise an object detection system, a lane change assist system, a side object detection system, a lane departure warning system or other forward facing vision system, a rear vision system or park aid or panoramic view system, a passive steering system, an adaptive cruise control system or the like, may be in communication with a security monitoring system. The vision or image data from the imaging system may be transmitted to a remote device, such as the vehicle owner's computer monitor or other personal display system remote from the vehicle, so that the owner of the vehicle or other person may view the status of the area surrounding the vehicle when the owner or other person is not in the vehicle. Also, the vision or image data may be provided to or made available to the local police authorities or the like in the event of a theft of the vehicle or of an accident involving the vehicle or of the vehicle being otherwise inoperable (such as when the motorist is stranded). The police or emergency services or the like may use the vision or image data to determine the vehicle location and possibly the condition of the vehicle and/or the driver and/or the passengers. It is further envisioned that the vision or image data may be used in conjunction with the global positioning system (GPS) of the vehicle to precisely locate or pinpoint the vehicle location. The vision or image data may be transmitted to the remote device or to the emergency services or the like via various transmission devices or systems, such as utilizing Bluetooth technology or the like, without affecting the scope of the present invention.

Therefore, the present invention provides a vision or imaging system or object detection system which is operable to detect and process edges within a captured image or images to determine if the edges are indicative of a significant vehicle or object or the like at or near or approaching the subject vehicle. The imaging system may primarily process a reduced image data set representative of a zone of interest of the exterior scene and may process edges that are detected which are representative of an object within the zone of interest of the captured image. The imaging system may adjust the reduced data set or zone of interest in response to various conditions or characteristics or criteria. The imaging system may comprise an object detection system or a lane change assist system operable to detect objects or other vehicles at one or both sides of the subject vehicle. The object detection system may determine the distance to the detected object or edge and may adjust threshold criterion in response to the determined or estimated or calculated distance.

Optionally, the imaging system may comprise a forward facing lane departure warning system which may be operable to detect lane markers or the like and/or vehicles in front of the subject vehicle and to provide an alert signal to the driver of the vehicle that the vehicle is leaving its lane. The lane departure warning system may primarily process edges detected within a zone of interest within the captured image. The lane departure warning system may determine a distance to the detected edge or object and may vary or adjust threshold criterion in response to the determined or estimated or calculated distance.

The forward facing imaging system may be in communication with the lane change assist system of the vehicle and/or may be in communication with other systems of the vehicle, such as a side object detection system, a passive steering system or an adaptive speed control system or the like. The imaging system may communicate to the lane change assist system that the vehicle is approaching a curve in the road or that another vehicle is being approached and passed by the subject vehicle to assist in processing the image data captured by the sensor or camera of the lane change assist system. Optionally, a passive steering system may adjust a steering direction of the vehicle in response to the imaging system, or an adaptive speed control system may adjust a cruise control setting of the vehicle in response to the imaging system. Optionally, an output of the imaging system may be provided to or communicated to a remote receiving and display system to provide image data for viewing at a location remote from the subject vehicle.

Changes and modifications in the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law.

* * * * *

