United States Patent No.: 10,107,905

Inventor(s): Lynam

Date of Patent: October 23, 2018


Forward facing sensing system for vehicle



ABSTRACT

A forward facing sensing system for a vehicle includes a radar sensor and an image sensor that sense and view forward of the vehicle. The radar sensor and the image sensor are housed in a self-contained unit disposed behind and attached at the vehicle windshield. A control includes an image processor operable to analyze image data captured by the image sensor in order to, at least in part, detect an object present exterior of the vehicle. The control, responsive at least in part to processing of captured image data and to sensing by the radar sensor, determines that a potentially hazardous condition may exist in the path of forward travel of the vehicle. The radar sensor and the image sensor collaborate in a way that enhances sensing capability of the sensing system for the object in the path of forward travel of the vehicle.


Inventors: Niall R. Lynam (Holland, MI)

Assignee: MAGNA ELECTRONICS INC. (Auburn Hills, MI, US)

Applicant: MAGNA ELECTRONICS INC. (Auburn Hills, MI)

Family ID: 39864604

Appl. No.: 15/361,746

Filed: November 28, 2016

Prior Publication Data

Document Identifier    Publication Date
US 20170074981 A1      Mar 16, 2017

Related U.S. Patent Documents


Application Number    Filing Date     Patent Number    Issue Date
15/149,338            May 9, 2016     9,507,021
15/005,092            Jan 25, 2016    9,335,411        May 10, 2016
14/859,683            Sep 21, 2015    9,244,165        Jan 26, 2016
14/107,624            Dec 16, 2013    9,140,789        Sep 22, 2015
13/656,975            Oct 22, 2012    8,614,640        Dec 24, 2013
13/540,856            Jul 3, 2012     8,294,608        Oct 23, 2012
13/192,525            Jul 28, 2011    8,217,830        Jul 10, 2012
12/524,446            Jul 24, 2009    8,013,780        Sep 6, 2011
PCT/US2008/051833     Jan 24, 2008
60/886,568            Jan 25, 2007

U.S. Class: 1/1

CPC Class:

B60T 7/22 (20130101); G01S 13/931 (20130101); G01S 13/867 (20130101); G01S 17/023 (20130101); G01S 7/03 (20130101); G01S 7/032 (20130101); G01S 13/86 (20130101); G01S 13/93 (20130101); G01S 2013/9392 (20130101); G01S 2013/9357 (20130101); G01S 2013/0245 (20130101); G01S 2013/9321 (20130101)

International Class (IPC):

G01S 13/93 (20060101); G01S 7/03 (20060101); G01S 17/02 (20060101); B60T 7/22 (20060101); G01S 13/86 (20060101); G01S 13/02 (20060101)

References Cited


U.S. Patent Documents

4571082February 1986Downs
4572619February 1986Reininger
4580875April 1986Bechtel
4600913July 1986Caine
4603946August 1986Kato
4614415September 1986Hyatt
4620141October 1986McCumber et al.
4623222November 1986Itoh
4626850December 1986Chey
4629941December 1986Ellis
4630109December 1986Barton
4632509December 1986Ohmi
4638287January 1987Umebayashi et al.
4647161March 1987Muller
4669825June 1987Itoh
4669826June 1987Itoh
4671615June 1987Fukada
4672457June 1987Hyatt
4676601June 1987Itoh
4690508September 1987Jacob
4692798September 1987Seko et al.
4697883October 1987Suzuki
4701022October 1987Jacob
4713685December 1987Nishimura et al.
4717830January 1988Botts
4727290February 1988Smith
4731669March 1988Hayashi et al.
4741603May 1988Miyagi
4768135August 1988Kretschmer et al.
4789904December 1988Peterson
4793690December 1988Gahan
4817948April 1989Simonelli
4820933April 1989Hong
4825232April 1989Howdle
4838650June 1989Stewart
4847772July 1989Michalopoulos et al.
4862037August 1989Farber et al.
4867561September 1989Fujii et al.
4871917October 1989O'Farrell et al.
4872051October 1989Dye
4881019November 1989Shiraishi et al.
4882565November 1989Gallmeyer
4886960December 1989Molyneux
4891559January 1990Matsumoto et al.
4892345January 1990Rachael, III
4895790January 1990Swanson et al.
4896030January 1990Miyaji
4907870March 1990Brucker
4910591March 1990Petrossian et al.
4916374April 1990Schierbeek
4917477April 1990Bechtel et al.
4937796June 1990Tendler
4943796July 1990Lee
4953305September 1990Van Lente et al.
4956591September 1990Schierbeek
4961625October 1990Wood et al.
4967319October 1990Seko
4970653November 1990Kenue
4971430November 1990Lynas
4974078November 1990Tsai
4987357January 1991Masaki
4991054February 1991Walters
5001558March 1991Burley et al.
5003288March 1991Wilhelm
5012082April 1991Watanabe
5016977May 1991Baude et al.
5027001June 1991Torbert
5027200June 1991Petrossian et al.
5044706September 1991Chen
5055668October 1991French
5059877October 1991Teder
5064274November 1991Alten
5072154December 1991Chen
5086253February 1992Lawler
5096287March 1992Kakinami et al.
5097362March 1992Lynas
5121200June 1992Choi
5124549June 1992Michaels et al.
5130709July 1992Toyama et al.
5148014September 1992Lynam
5168378December 1992Black
5170374December 1992Shimohigashi et al.
5172235December 1992Wilm et al.
5177685January 1993Davis et al.
5182502January 1993Slotkowski et al.
5184956February 1993Langlais et al.
5189561February 1993Hong
5193000March 1993Lipton et al.
5193029March 1993Schofield
5204778April 1993Bechtel
5208701May 1993Maeda
5245422September 1993Borcherts et al.
5253109October 1993O'Farrell
5276389January 1994Levers
5285060February 1994Larson et al.
5289182February 1994Brillard et al.
5289321February 1994Secor
5305012April 1994Faris
5307136April 1994Saneyoshi
5309137May 1994Kajiwara
5309163May 1994Ngan et al.
5313072May 1994Vachss
5325096June 1994Pakett
5325386June 1994Jewell et al.
5329206July 1994Slotkowski et al.
5331312July 1994Kudoh
5336980August 1994Levers
5341437August 1994Nakayama
5351044September 1994Mathur et al.
5355118October 1994Fukuhara
5374852December 1994Parkes
5381155January 1995Gerber
5386285January 1995Asayama
5394333February 1995Kao
5406395April 1995Wilson et al.
5410346April 1995Saneyoshi et al.
5414257May 1995Stanton
5414461May 1995Kishi et al.
5416313May 1995Larson et al.
5416318May 1995Hegyi
5416478May 1995Morinaga
5424952June 1995Asayama
5426294June 1995Kobayashi et al.
5430431July 1995Nelson
5434407July 1995Bauer et al.
5440428August 1995Hegg et al.
5444478August 1995Lelong et al.
5451822September 1995Bechtel et al.
5457493October 1995Leddy et al.
5461357October 1995Yoshioka et al.
5461361October 1995Moore
5469298November 1995Suman et al.
5471515November 1995Fossum et al.
5475494December 1995Nishida et al.
5498866March 1996Bendicks et al.
5500766March 1996Stonecypher
5510983April 1996Iino
5515042May 1996Nelson
5515448May 1996Nishitani
5521633May 1996Nakajima et al.
5528698June 1996Kamei et al.
5529138June 1996Shaw et al.
5530240June 1996Larson et al.
5530420June 1996Tsuchiya et al.
5535314July 1996Alves et al.
5537003July 1996Bechtel et al.
5539397July 1996Asanuma et al.
5541590July 1996Nishio
5550677August 1996Schofield et al.
5568027October 1996Teder
5574443November 1996Hsieh
5581464December 1996Woll et al.
5585798December 1996Yoshioka et al.
5594222January 1997Caldwell
5614788March 1997Mullins
5619370April 1997Guinosso
5634709June 1997Iwama
5642299June 1997Hardin et al.
5648835July 1997Uzawa
5650944July 1997Kise
5657021August 1997Ehsani-Nategh et al.
5660454August 1997Mori et al.
5661303August 1997Teder
5666028September 1997Bechtel et al.
5668663September 1997Varaprasad et al.
5670935September 1997Schofield et al.
5677851October 1997Kingdon et al.
5699044December 1997Van Lente et al.
5715093February 1998Schierbeek et al.
5724187March 1998Varaprasad et al.
5724316March 1998Brunts
5737226April 1998Olson et al.
5760826June 1998Nayer
5760828June 1998Cortes
5760931June 1998Saburi et al.
5761094June 1998Olson et al.
5765116June 1998Wilson-Jones et al.
5781437July 1998Wiemer et al.
5786772July 1998Schofield et al.
5790403August 1998Nakayama
5790973August 1998Blaker et al.
5793308August 1998Rosinski et al.
5793420August 1998Schmidt
5796094August 1998Schofield et al.
5798575August 1998O'Farrell et al.
5835255November 1998Miles
5837994November 1998Stam et al.
5844505December 1998Van Ryzin
5844682December 1998Kiyomoto et al.
5845000December 1998Breed et al.
5847676December 1998Cole
5848802December 1998Breed et al.
5850176December 1998Kinoshita et al.
5850254December 1998Takano et al.
5867591February 1999Onda
5872536February 1999Lyons et al.
5877707March 1999Kowalick
5877897March 1999Schofield et al.
5878370March 1999Olson
5883739March 1999Ashihara et al.
5884212March 1999Lion
5890021March 1999Onoda
5896085April 1999Mori et al.
5899956May 1999Chan
5914815June 1999Bos
5923027July 1999Stam et al.
5929786July 1999Schofield et al.
5933109August 1999Tohya et al.
5938717August 1999Dunne et al.
5940120August 1999Frankhouse et al.
5956181September 1999Lin
5959367September 1999O'Farrell et al.
5959555September 1999Furuta
5959571September 1999Aoyagi et al.
5963247October 1999Banitt
5971552October 1999O'Farrell et al.
5986796November 1999Miles
5990469November 1999Bechtel et al.
5990649November 1999Nagao et al.
6001486December 1999Varaprasad et al.
6020704February 2000Buschur
6049171April 2000Stam et al.
6057754May 2000Kinoshita et al.
6066933May 2000Ponziana
6067110May 2000Nonaka et al.
6075492June 2000Schmidt et al.
6084519July 2000Coulling et al.
6085151July 2000Farmer et al.
6087953July 2000DeLine et al.
6097024August 2000Stam et al.
6116743September 2000Hoek
6118401September 2000Tognazzini
6118410September 2000Nagy
6124647September 2000Marcus et al.
6124886September 2000DeLine et al.
6139172October 2000Bos et al.
6144022November 2000Tenenbaum et al.
6172613January 2001DeLine et al.
6175164January 2001O'Farrell et al.
6175300January 2001Kendrick
6198409March 2001Schofield et al.
6201642March 2001Bos
6216540April 2001Nelson et al.
6222460April 2001DeLine et al.
6243003June 2001DeLine et al.
6250148June 2001Lynam
6259412July 2001Duroux
6266082July 2001Yonezawa et al.
6266442July 2001Laumeyer et al.
6278399August 2001Ashihara
6285393September 2001Shimoura et al.
6291906September 2001Marcus et al.
6294989September 2001Schofield et al.
6297781October 2001Turnbull et al.
6302545October 2001Schofield et al.
6310611October 2001Caldwell
6313454November 2001Bos et al.
6317057November 2001Lee
6320282November 2001Caldwell
6323477November 2001Blasing et al.
6326613December 2001Heslin et al.
6329925December 2001Skiver et al.
6333759December 2001Mazzilli
6341523January 2002Lynam
6353392March 2002Schofield et al.
6366213April 2002DeLine et al.
6370329April 2002Teuchert
6396397May 2002Bos et al.
6411204June 2002Bloomfield et al.
6411328June 2002Franke et al.
6420975July 2002DeLine et al.
6424273July 2002Gutta et al.
6428172August 2002Hutzel et al.
6430303August 2002Naoi et al.
6433676August 2002DeLine et al.
6442465August 2002Breed et al.
6452148September 2002Bendicks et al.
6462700October 2002Schmidt et al.
6477464November 2002McCarthy et al.
6485155November 2002Duroux et al.
6492935December 2002Higuchi
6497503December 2002Dassanayake et al.
6498620December 2002Schofield et al.
6513252February 2003Schierbeek et al.
6516664February 2003Lynam
6523964February 2003Schofield et al.
6534884March 2003Marcus et al.
6539306March 2003Turnbull
6547133April 2003DeVries, Jr. et al.
6553130April 2003Lemelson et al.
6555804April 2003Blasing
6574033June 2003Chui et al.
6580385June 2003Winner et al.
6589625July 2003Kothari et al.
6593565July 2003Heslin et al.
6594583July 2003Ogura et al.
6611202August 2003Schofield et al.
6611610August 2003Stam et al.
6627918September 2003Getz et al.
6636148October 2003Higuchi
6636258October 2003Strumolo
6648477November 2003Hutzel et al.
6650233November 2003DeLine et al.
6650455November 2003Miles
6672731January 2004Schnell et al.
6674562January 2004Miles
6678614January 2004McCarthy et al.
6680792January 2004Miles
6690268February 2004Schofield et al.
6696978February 2004Trajkovic et al.
6700605March 2004Toyoda et al.
6703925March 2004Steffel
6704621March 2004Stein et al.
6710908March 2004Miles et al.
6711474March 2004Treyz et al.
6714331March 2004Lewis et al.
6717610April 2004Bos et al.
6727807April 2004Trajkovic et al.
6735506May 2004Breed et al.
6741377May 2004Miles
6744353June 2004Sjonell
6757109June 2004Bos
6762867July 2004Lippert et al.
6771208August 2004Lutter et al.
6772057August 2004Breed et al.
6794119September 2004Miles
6795014September 2004Cheong
6795221September 2004Urey
6802617October 2004Schofield et al.
6806452October 2004Bos et al.
6812882November 2004Ono
6816084November 2004Stein
6818884November 2004Koch et al.
6822563November 2004Bos et al.
6823241November 2004Shirato et al.
6823244November 2004Breed
6824281November 2004Schofield et al.
6828903December 2004Watanabe et al.
6831591December 2004Horibe
6838980January 2005Gloger et al.
6841767January 2005Mindl et al.
6847487January 2005Burgner
6856873February 2005Breed et al.
6859705February 2005Rao et al.
6864784March 2005Loeb
6873912March 2005Shimomura
6879281April 2005Gresham et al.
6882287April 2005Schofield
6889161May 2005Winner et al.
6903677June 2005Takashima et al.
6909753June 2005Meehan et al.
6941211September 2005Kuroda et al.
6944544September 2005Prakah-Asante et al.
6946978September 2005Schofield
6947577September 2005Stam et al.
6953253October 2005Schofield et al.
6958729October 2005Metz
6968736November 2005Lynam
6975390December 2005Mindl et al.
6975775December 2005Rykowski et al.
6987419January 2006Gresham
6999024February 2006Kumon et al.
7004593February 2006Weller et al.
7004606February 2006Schofield
7005974February 2006McMahon et al.
7012560March 2006Braeuchle et al.
7038577May 2006Pawlicki et al.
7042389May 2006Shirai
7062300June 2006Kim
7065432June 2006Moisel et al.
7085637August 2006Breed et al.
7092548August 2006Laumeyer et al.
7116246October 2006Winter et al.
7123168October 2006Schofield
7126460October 2006Yamada
7126525October 2006Suzuki et al.
7149613December 2006Stam et al.
7167796January 2007Taylor et al.
7176830February 2007Horibe
7188963March 2007Schofield et al.
7196305March 2007Shaffer et al.
7199747April 2007Jenkins et al.
7202776April 2007Breed
7227459June 2007Bos et al.
7227611June 2007Hull et al.
7311406December 2007Schofield et al.
7322755January 2008Neumann et al.
7325934February 2008Schofield et al.
7325935February 2008Schofield et al.
7344261March 2008Schofield et al.
7380948June 2008Schofield et al.
7388182June 2008Schofield et al.
7400266July 2008Haug
7423248September 2008Schofield et al.
7425076September 2008Schofield et al.
7432848October 2008Munakata
7436038October 2008Engelmann et al.
7439507October 2008Deasy et al.
7453374November 2008Koike et al.
7460951December 2008Altan et al.
7480149January 2009DeWard et al.
7526103April 2009Schofield et al.
7542835June 2009Takahama et al.
7558007July 2009Katoh et al.
7570198August 2009Tokoro et al.
7587072September 2009Russo et al.
7613568November 2009Kawasaki
7619508November 2009Lynam et al.
7619562November 2009Stumbo et al.
7633383December 2009Dunsmoir et al.
7639149December 2009Katoh
7671806March 2010Voigtlaender
7706978April 2010Schiffmann et al.
7720580May 2010Higgins-Luthman
7728272June 2010Blaesing
7765065July 2010Stiller
7777669August 2010Tokoro et al.
7811011October 2010Blaesing et al.
7828478November 2010Rege et al.
7855353December 2010Blaesing et al.
7881496February 2011Camilleri et al.
7914187March 2011Higgins-Luthman et al.
7920251April 2011Chung
7978122July 2011Schmidlin
8013780September 2011Lynam
8192095June 2012Kortan et al.
8217830July 2012Lynam
8294608October 2012Lynam
8614640December 2013Lynam
9140789September 2015Lynam
9244165January 2016Lynam
9335411May 2016Lynam
9507021November 2016Lynam
2002/0015153February 2002Downs
2002/0021229February 2002Stein
2002/0044065April 2002Quist et al.
2002/0113873August 2002Williams
2002/0159270October 2002Lynam et al.
2003/0080878May 2003Kirmuss
2003/0112132June 2003Trajkovic et al.
2003/0137586July 2003Lewellen
2003/0138132July 2003Stam
2003/0201929October 2003Lutter et al.
2003/0222982December 2003Hamdan et al.
2003/0227777December 2003Schofield
2004/0012488January 2004Schofield
2004/0016870January 2004Pawlicki et al.
2004/0032321February 2004McMahon et al.
2004/0051634March 2004Schofield et al.
2004/0114381June 2004Salmeen et al.
2004/0128065July 2004Taylor et al.
2004/0200948October 2004Bos et al.
2004/0227663November 2004Suzuki
2004/0246167December 2004Kumon
2005/0046978March 2005Schofield
2005/0078389April 2005Kulas et al.
2005/0102070May 2005Takahama
2005/0104089May 2005Engelmann
2005/0134966June 2005Burgner
2005/0134983June 2005Lynam
2005/0146792July 2005Schofield et al.
2005/0169003August 2005Lindahl et al.
2005/0195488September 2005McCabe et al.
2005/0200700September 2005Schofield et al.
2005/0232469October 2005Schofield et al.
2005/0264891December 2005Uken et al.
2005/0270225December 2005Tokoro
2006/0018511January 2006Stam et al.
2006/0018512January 2006Stam et al.
2006/0028731February 2006Schofield et al.
2006/0050018March 2006Hutzel et al.
2006/0067378March 2006Rege et al.
2006/0091654May 2006De Mersseman et al.
2006/0091813May 2006Stam et al.
2006/0103727May 2006Tseng
2006/0157639July 2006Shaffer
2006/0164230July 2006DeWind et al.
2006/0250501November 2006Wildmann et al.
2007/0023613February 2007Schofield et al.
2007/0088488April 2007Reeves et al.
2007/0104476May 2007Yasutomi et al.
2007/0109406May 2007Schofield et al.
2007/0109651May 2007Schofield et al.
2007/0109652May 2007Schofield et al.
2007/0109653May 2007Schofield et al.
2007/0109654May 2007Schofield et al.
2007/0120657May 2007Schofield et al.
2007/0152152July 2007Deasy
2007/0176080August 2007Schofield et al.
2008/0117097May 2008Walter et al.
2008/0180529July 2008Taylor et al.
2009/0113509April 2009Tseng et al.
2010/0001897January 2010Lynam
2010/0045797February 2010Schofield et al.
2011/0037640February 2011Schmidlin

Foreign Patent Documents

1506893         Feb 2005    EP
H05301541       Nov 1993    JP
H08276787       Oct 1996    JP
H10147178       Jun 1998    JP
H1178737        Mar 1999    JP
2001158284      Jun 2001    JP
2001233139      Aug 2001    JP
2003044995      Feb 2003    JP
2003169233      Jun 2003    JP
2004082829      Mar 2004    JP
WO2003053743    Jul 2003    WO
WO2006035510    Apr 2006    WO

Other References


European Search Report for European Patent Application No. 08780377.1 dated Jun. 7, 2010. cited by applicant.
Bombini et al., "Radar-vision fusion for vehicle detection", Dipartimento di Ingegneria dell'Informazione, Università di Parma, Parma I-43100, Italy, Mar. 14, 2006. cited by applicant.
PCT International Search Report of PCT/US2008/051833 dated Oct. 7, 2008. cited by applicant.
European Examination Report for EP Application No. 08780377.1 dated Aug. 8, 2012. cited by applicant.

Primary Examiner: Bythrow; Peter M
Attorney, Agent or Firm: Honigman Miller Schwartz and Cohn, LLP

Parent Case Text





CROSS REFERENCE TO RELATED APPLICATIONS



This application is a continuation of U.S. patent application Ser. No. 15/149,338, filed May 9, 2016, now U.S. Pat. No. 9,507,021, which is a continuation of U.S. patent application Ser. No. 15/005,092, filed Jan. 25, 2016, now U.S. Pat. No. 9,335,411, which is a continuation of U.S. patent application Ser. No. 14/859,683, filed Sep. 21, 2015, now U.S. Pat. No. 9,244,165, which is a continuation of U.S. patent application Ser. No. 14/107,624, filed Dec. 16, 2013, now U.S. Pat. No. 9,140,789, which is a divisional of U.S. patent application Ser. No. 13/656,975, filed Oct. 22, 2012, now U.S. Pat. No. 8,614,640, which is a continuation of U.S. patent application Ser. No. 13/540,856, filed Jul. 3, 2012, now U.S. Pat. No. 8,294,608, which is a continuation of U.S. patent application Ser. No. 13/192,525, filed Jul. 28, 2011, now U.S. Pat. No. 8,217,830, which is a continuation of U.S. patent application Ser. No. 12/524,446, filed Jul. 24, 2009, now U.S. Pat. No. 8,013,780, which is a 371 application of PCT Application No. PCT/US2008/051833, filed Jan. 24, 2008, which claims the benefit of U.S. provisional application Ser. No. 60/886,568, filed Jan. 25, 2007, which are incorporated herein by reference for all purposes.

CLAIMS



The invention claimed is:

1. A forward facing sensing system for a vehicle, the vehicle having a windshield, said forward facing sensing system comprising: a radar sensor disposed within an interior cabin of a vehicle equipped with said forward facing sensing system; wherein said radar sensor has a sensing direction forward of the equipped vehicle; wherein said radar sensor transmits at a frequency of at least 60 GHz; wherein said radar sensor utilizes digital beam forming in a phased array antenna system; an image sensor disposed within the interior cabin of the equipped vehicle behind the vehicle windshield; wherein said image sensor views forward of the equipped vehicle through the vehicle windshield; wherein said image sensor comprises a pixelated imaging array sensor comprising a plurality of photo-sensing pixels; wherein said radar sensor and said image sensor are housed in a self-contained unit disposed behind and attached at the vehicle windshield of the equipped vehicle; wherein said self-contained unit mounts at the vehicle windshield of the equipped vehicle and is removable therefrom as a unit for at least one of service and replacement; a control comprising an image processor, said image processor operable to analyze image data captured by said image sensor in order to, at least in part, detect an object present exterior of the equipped vehicle in a direction of forward travel of the equipped vehicle; wherein said image processor comprises an image processing chip that processes image data captured by said image sensor utilizing object detection software; wherein said control, responsive at least in part to processing of captured image data by said image processor and to sensing by said radar sensor, determines that a potentially hazardous condition may exist in the path of forward travel of the equipped vehicle, and wherein said potentially hazardous condition comprises the detected object being in the path of forward travel of the equipped vehicle; wherein said radar sensor and said image sensor collaborate in a way that enhances sensing capability of said sensing system for the object in the path of forward travel of the equipped vehicle; wherein said radar sensor and said image sensor share, at least in part, common circuitry; wherein said control at least in part controls an adaptive cruise control system of the equipped vehicle; and wherein said image processor processes image data captured by said image sensor for an automatic headlamp control system of the equipped vehicle.

2. The sensing system of claim 1, wherein, responsive to determination by said control that a potentially hazardous condition may exist in the path of forward travel of the equipped vehicle and that said potentially hazardous condition comprises the detected object in the path of forward travel of the equipped vehicle, at least one of a visual alert and an audible alert is provided to alert a driver of the equipped vehicle to the determined potentially hazardous condition.

3. The sensing system of claim 2, wherein video images captured by said image sensor are displayed on a video display screen viewable to a driver of the equipped vehicle, and wherein, responsive to determination by said control that a potentially hazardous condition may exist in the path of forward travel of the equipped vehicle and that said potentially hazardous condition comprises a pedestrian in the path of forward travel of the equipped vehicle, a visual alert is displayed by said video display screen to alert the driver of the equipped vehicle to the determined potentially hazardous condition.

4. The sensing system of claim 3, wherein said image sensor is part of a night vision system of the equipped vehicle.

5. The sensing system of claim 2, wherein said at least one of a visual alert and an audible alert is provided episodically.

6. The sensing system of claim 1, wherein said image processor is operable to undertake enhanced processing of pixel outputs of pixels of said image sensor that are within a sub-array of said pixelated imaging array.

7. The sensing system of claim 6, wherein an object associated with said potentially hazardous condition is detected at least at said sub-array of said pixelated imaging array.

8. The sensing system of claim 7, wherein the detected object is a pedestrian.

9. The sensing system of claim 1, wherein the detected object is a pedestrian.

10. The sensing system of claim 1, wherein the detected object is a deer.

11. The sensing system of claim 1, wherein the vehicle windshield comprises an opaque layer generally where said self-contained unit is disposed behind the vehicle windshield, and wherein said opaque layer comprises a light-transmitting aperture, and wherein, with said self-contained unit attached at the vehicle windshield, said image sensor views forward of the equipped vehicle via said light-transmitting aperture.

12. The sensing system of claim 11, wherein said radar sensor transmits at a frequency of at least 77 GHz.

13. The sensing system of claim 11, wherein said radar sensor transmits at 79 GHz.

14. The sensing system of claim 11, wherein said image sensor views through the vehicle windshield at a location that is at or near where an interior rearview mirror assembly of the equipped vehicle is mounted at the vehicle windshield.

15. The sensing system of claim 14, wherein said self-contained unit mounts at the vehicle windshield of the equipped vehicle separate and spaced from where the interior rearview mirror assembly of the equipped vehicle mounts at the vehicle windshield.

16. The sensing system of claim 1, wherein said image processor and said common circuitry are housed in said self-contained unit disposed behind and attached at the vehicle windshield of the equipped vehicle.

17. The sensing system of claim 16, wherein said image processor processes image data captured by said image sensor for a lane departure warning system of the equipped vehicle.

18. The sensing system of claim 1, wherein said common circuitry includes said image processor.

19. A forward facing sensing system for a vehicle, the vehicle having a windshield, said forward facing sensing system comprising: a radar sensor disposed within an interior cabin of a vehicle equipped with said forward facing sensing system; wherein said radar sensor has a sensing direction forward of the equipped vehicle; wherein said radar sensor transmits at a frequency of at least 60 GHz; wherein said radar sensor utilizes digital beam forming in a phased array antenna system; an image sensor disposed within the interior cabin of the equipped vehicle behind the vehicle windshield; wherein said image sensor views forward of the equipped vehicle through the vehicle windshield; wherein said image sensor comprises a pixelated imaging array sensor comprising a plurality of photo-sensing pixels; wherein said radar sensor and said image sensor are housed in a self-contained unit disposed behind and attached at the vehicle windshield of the equipped vehicle; wherein said self-contained unit mounts at the vehicle windshield of the equipped vehicle and is removable therefrom as a unit for at least one of service and replacement; a control comprising an image processor, said image processor operable to analyze image data captured by said image sensor in order to, at least in part, detect an object present exterior of the equipped vehicle in a direction of forward travel of the equipped vehicle; wherein said image processor comprises an image processing chip that processes image data captured by said image sensor utilizing object detection software; wherein said control, responsive at least in part to processing of captured image data by said image processor and to sensing by said radar sensor, determines that a potentially hazardous condition may exist in the path of forward travel of the equipped vehicle, and wherein said potentially hazardous condition comprises the detected object being in the path of forward travel of the equipped vehicle; wherein said radar sensor and said image sensor collaborate in a way that enhances determination of existence of the potentially hazardous condition in the path of forward travel of the equipped vehicle; wherein said radar sensor and said image sensor share, at least in part, common circuitry; wherein said image sensor views through the vehicle windshield at a location that is at or near where an interior rearview mirror assembly of the equipped vehicle is mounted at the vehicle windshield; wherein said self-contained unit mounts at the vehicle windshield of the equipped vehicle separate and spaced from where the interior rearview mirror assembly of the equipped vehicle mounts at the vehicle windshield; wherein the vehicle windshield comprises an opaque layer generally where said self-contained unit is disposed behind the vehicle windshield, and wherein said opaque layer comprises a light-transmitting aperture; wherein, with said self-contained unit attached at the vehicle windshield, said image sensor views forward of the equipped vehicle via said light-transmitting aperture; and wherein, responsive to determination by said control that a potentially hazardous condition may exist in the path of forward travel of the equipped vehicle and that said potentially hazardous condition comprises an object in the path of forward travel of the equipped vehicle, at least one of a visual alert and an audible alert is provided to alert a driver of the equipped vehicle to the determined potentially hazardous condition.

20. The sensing system of claim 19, wherein said image processor and said common circuitry are housed in said self-contained unit disposed behind and attached at the vehicle windshield of the equipped vehicle.

21. The sensing system of claim 20, wherein said common circuitry includes said image processor.

22. The sensing system of claim 21, wherein said control at least in part controls an adaptive cruise control system of the equipped vehicle.

23. The sensing system of claim 21, wherein said image processor processes image data captured by said image sensor for an automatic headlamp control system of the equipped vehicle.

24. A forward facing sensing system for a vehicle, the vehicle having a windshield, said forward facing sensing system comprising: a radar sensor disposed within an interior cabin of a vehicle equipped with said forward facing sensing system; wherein said radar sensor has a sensing direction forward of the equipped vehicle; wherein said radar sensor transmits at a frequency of at least 60 GHz; wherein said radar sensor utilizes digital beam forming in a phased array antenna system; an image sensor disposed within the interior cabin of the equipped vehicle behind the vehicle windshield; wherein said image sensor views forward of the equipped vehicle through the vehicle windshield; wherein said image sensor comprises a pixelated imaging array sensor comprising a plurality of photo-sensing pixels; wherein said radar sensor and said image sensor are housed in a self-contained unit disposed behind and attached at the vehicle windshield of the equipped vehicle; wherein said self-contained unit mounts at the vehicle windshield of the equipped vehicle and is removable therefrom as a unit for at least one of service and replacement; a control comprising an image processor, said image processor operable to analyze image data captured by said image sensor in order to, at least in part, detect a pedestrian present exterior of the equipped vehicle in a direction of forward travel of the equipped vehicle; wherein said image processor comprises an image processing chip that processes image data captured by said image sensor utilizing object detection software; wherein said control, responsive at least in part to processing of captured image data by said image processor and to sensing by said radar sensor, determines that a potentially hazardous condition may exist in the path of forward travel of the equipped vehicle, and wherein said potentially hazardous condition comprises the detected pedestrian being in the path of forward travel of the equipped vehicle; wherein said radar sensor and said image sensor collaborate in a way that enhances determination of existence of the potentially hazardous condition in the path of forward travel of the equipped vehicle; wherein said radar sensor and said image sensor collaborate in a way that enhances sensing capability of said sensing system for the pedestrian in the path of forward travel of the equipped vehicle; wherein said radar sensor and said image sensor share, at least in part, common circuitry; wherein said image processor and said common circuitry are housed in said self-contained unit disposed behind and attached at the vehicle windshield of the equipped vehicle; wherein said common circuitry includes said image processor; wherein said image sensor views through the vehicle windshield at a location that is at or near where an interior rearview mirror assembly of the equipped vehicle is mounted at the vehicle windshield; wherein said self-contained unit mounts at the vehicle windshield of the equipped vehicle separate and spaced from where the interior rearview mirror assembly of the equipped vehicle mounts at the vehicle windshield; wherein the vehicle windshield comprises an opaque layer generally where said self-contained unit is disposed behind the vehicle windshield, and wherein said opaque layer comprises a light-transmitting aperture; and wherein, with said self-contained unit attached at the vehicle windshield, said image sensor views forward of the equipped vehicle via said light-transmitting aperture.

25. The sensing system of claim 24, wherein, responsive to determination by said control that a potentially hazardous condition may exist in the path of forward travel of the equipped vehicle and that said potentially hazardous condition comprises an object in the path of forward travel of the equipped vehicle, at least one of a visual alert and an audible alert is provided to alert a driver of the equipped vehicle to the determined potentially hazardous condition.

26. The sensing system of claim 25, wherein said control at least in part controls an adaptive cruise control system of the equipped vehicle.

27. The sensing system of claim 25, wherein said image processor processes image data captured by said image sensor for an automatic headlamp control system of the equipped vehicle.


DESCRIPTION




FIELD OF THE INVENTION



The present invention generally relates to forward facing sensing systems and, more particularly, to forward facing sensing systems utilizing a radar sensor device.


BACKGROUND OF THE INVENTION



It is known to provide a radar (radio detection and ranging) system (such as a 77 GHz radar or other suitable frequency radar) on a vehicle for sensing the area forward of a vehicle, such as for an adaptive cruise control (ACC) system or an ACC stop and go system or the like. It is also known to provide a lidar (laser imaging detection and ranging) system for sensing the area forward of a vehicle for similar applications. Typically, the radar system is preferred for such vehicle applications because it detects objects better than a lidar system in fog or other inclement weather conditions.

Such radar sensor devices are typically located at the front grille of the vehicle and thus may be intrusive to the underhood packaging and the exterior styling of the vehicle. Although it is known to provide a lidar sensing device or system at the windshield for scanning/detecting through the windshield, radar systems are typically not suitable for such applications, since they generally are not suited to viewing through glass, such as through the vehicle windshield (because the glass windshield may substantially attenuate the radar performance or the ability to detect objects forward of the vehicle). It is also known to augment such a radar or lidar system with a forward facing camera or image sensor.


SUMMARY OF THE INVENTION



The present invention provides a forward facing sensing system for detecting objects forward of the vehicle (such as for use with or in conjunction with an adaptive cruise control system or other object detection system or the like), with a radar sensor device located behind, and transmitting through, a radar transmitting portion established at the upper windshield area of the vehicle [typically transmitting at a frequency of at least about 20 GHz (such as 24 GHz), and more preferably at least about 60 GHz (such as 60 GHz, 77 GHz, 79 GHz or thereabouts)]. The radar sensor device is positioned at a recess or pocket or opening formed at and along the upper edge of the windshield so as to have a forward transmitting and receiving direction for radar electromagnetic waves that is not through the glass panels of the windshield. The vehicle or sensing system preferably includes a sealing or cover element, such as a plastic cover element at the sensing device, to seal and environmentally protect the radar sensor device within the cabin of the vehicle while allowing transmission and receipt of radar frequency electromagnetic radiation waves to and from the exterior of the vehicle.

According to an aspect of the present invention, a forward facing sensing system or radar sensing system for a vehicle includes a radar sensor device disposed at a pocket or recess or opening established at an upper edge of the vehicle windshield and having a forward transmitting and receiving direction that is not through the windshield. A cover panel is disposed at the radar sensor device and is substantially sealed at the vehicle windshield at or near the pocket at the upper edge of the vehicle windshield. The cover panel comprises a material that is substantially transmissive to radar frequency electromagnetic radiation waves. The radar sensor device transmits and receives radar frequency electromagnetic radiation waves that transmit through the cover panel. The system includes a control that is responsive to an output of the radar sensor device.

According to another aspect of the present invention, a forward facing sensing system for a vehicle includes a radar sensor device operable to detect an object ahead of the vehicle, a forward facing image sensor having a forward field of view, and a control responsive to an output of the radar sensor device and responsive to an output of the forward facing image sensor. The control is operable to control sensing by the radar sensor device and the control is operable to control a focused or enhanced interrogation of a detected object (or area at which a detected object is detected) in response to a detection of an object forward of the vehicle by the radar sensor device. The control may be operable to at least one of (a) control enhanced interrogation of a detected object by the radar sensor device in response to the forward facing image sensor detecting an object (such as by enhancing the interrogation via a beam aiming or beam selection technique, such as by digital beam forming in a phased array antenna system or such as by digital beam steering or the like), and (b) control enhanced interrogation of a detected object by the forward facing image sensor in response to the radar sensor device detecting an object (such as by enhancing the interrogation via enhanced or intensified algorithmic processing of a portion of the image plane of the image sensor that is spatially related to the location of the detected object in the forward field of view of the image sensor). The control thus may be responsive to the forward facing image sensor to guide or control the focused interrogation of the detected object by the radar sensor device, or the control may be responsive to the radar sensor device to guide or control the focused or enhanced interrogation of the detected object by the forward facing image sensor (such as via directing or controlling the image sensor and/or its field of view or zoom function or via image processing of the captured image data, such as by providing enhanced processing of the area at which the object is detected).
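As a rough illustration of the cross-cueing control described above, the following Python sketch shows one way detections from one sensor could trigger a focused interrogation by the other. It is a minimal sketch only; all class and method names (RadarDetection, CrossCueingControl, steer_beam, enhance_region) are hypothetical placeholders and are not taken from the patent or from any production automotive API.

```python
# Hypothetical sketch of the dual cueing described in the summary above: a
# radar detection cues enhanced image processing of the corresponding region,
# and a camera detection cues a steered/selected radar beam toward that bearing.
from dataclasses import dataclass


@dataclass
class RadarDetection:
    azimuth_deg: float   # bearing of the radar return relative to the vehicle centerline
    range_m: float       # estimated distance to the detected object


@dataclass
class CameraDetection:
    azimuth_deg: float   # bearing inferred from the object's image-plane position


class CrossCueingControl:
    """Each sensor's initial detection triggers a focused interrogation by the other."""

    def __init__(self, radar, camera):
        self.radar = radar      # assumed to expose steer_beam(azimuth_deg)
        self.camera = camera    # assumed to expose enhance_region(azimuth_deg)

    def on_radar_detection(self, det: RadarDetection) -> None:
        # Radar saw a "blip": ask the imager to apply enhanced processing to the
        # image-plane region spatially related to the radar bearing.
        self.camera.enhance_region(det.azimuth_deg)

    def on_camera_detection(self, det: CameraDetection) -> None:
        # Camera flagged an object: aim or select a radar beam toward that
        # bearing for a more focused interrogation (e.g., digital beam steering).
        self.radar.steer_beam(det.azimuth_deg)


if __name__ == "__main__":
    class _StubRadar:
        def steer_beam(self, az):
            print(f"radar beam steered toward {az:.1f} deg")

    class _StubCamera:
        def enhance_region(self, az):
            print(f"enhanced image processing around {az:.1f} deg")

    ctrl = CrossCueingControl(_StubRadar(), _StubCamera())
    ctrl.on_radar_detection(RadarDetection(azimuth_deg=4.0, range_m=35.0))
    ctrl.on_camera_detection(CameraDetection(azimuth_deg=-2.5))
```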

Optionally, and desirably, the forward facing image sensor and the radar sensor device may be commonly established on a semiconductor substrate. Optionally, the semiconductor substrate may comprise one of (i) a germanium substrate, (ii) a gallium arsenide substrate, and (iii) a silicon germanium substrate.

These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.


BRIEF DESCRIPTION OF THE DRAWINGS



FIG. 1 is a perspective view of a vehicle incorporating a forward facing radar sensing system in accordance with the present invention; and

FIG. 2 is a perspective view of a windshield and radar sensing system of the present invention.


DESCRIPTION OF THE PREFERRED EMBODIMENTS



Referring now to the drawings and the illustrative embodiments depicted therein, a sensing system or forward facing sensing system or radar sensing system 10 for a vehicle 12 includes a radar sensor device 14 at an upper region of the vehicle windshield 12a and with a forward transmitting and sensing direction forward of the vehicle and in the forward direction of travel of the vehicle (FIG. 1). The windshield glass 12a may be formed with a cutout or pocket 12b at the upper edge. The pocket may be cut from the glass (so as to provide a cut opening at the upper edge of the glass windshield) or the glass may be formed with an inward bulge or pocket that provides an opening for the sensing device. The radar sensor device 14 thus may be disposed at the pocket 12b and may have a clear or unobstructed view or sensing direction forward of the vehicle that does not pass through glass (and whereby the glass windshield will not attenuate the performance of the radar sensor device). Because the upper region of the vehicle windshield is typically not used, the radar sensor device 14 may be disposed thereat without being intrusive of other systems or elements and without adversely affecting the vehicle design and/or layout. The sensing system 10 is operable to detect objects or vehicles or the like in front of the vehicle as the vehicle is traveling along a road, such as in conjunction with an adaptive cruise control system or the like. Although shown and described as being a forward facing sensing system, aspects of the present invention may be suitable for other sensing systems, such as a rearward facing sensing system or the like.

Radar sensor device 14 thus may be disposed within a windshield electronics module 16 or accessory module or overhead console of the vehicle, and within the vehicle cabin, without experiencing the adverse performance caused by the attenuation of radio or radar frequency electromagnetic radiation wave transmission through the windshield glass. Optionally, the vehicle sheet metal may be adapted to receive and/or support the radar sensor device at the upper edge of the windshield, or to accommodate the radar sensor device as disposed in and/or supported by the windshield electronics module or the like.

In order to seal the upper edge of the windshield at the pocket 12b, a cover element or plate 18 may be provided that substantially or entirely spans the opening at the pocket and that is sealed at the glass windshield and vehicle around the perimeter of the pocket, so as to limit or substantially preclude water intrusion or the like into the vehicle at the radar sensor device. The cover element 18 preferably comprises a plastic or polymeric or polycarbonate material that is transmissive to radar waves so as to limit or substantially preclude an adverse effect on the performance of the radar sensor device and system. Optionally, and desirably, the cover element may be colored to match or substantially match the shade band along the upper region of the windshield or to match or substantially match the windshield electronics module or other interior or exterior component of the vehicle. Because the radar sensor device does not require a transparent cover, the cover element may be opaque or substantially opaque and/or may function to substantially camouflage or render covert the sensor device and/or the windshield electronics module or the like.

The radar sensor device may utilize known transmitting and receiving technology and may utilize a sweeping beam or a phased array or the like for scanning or sensing or interrogating the area in front of the vehicle. Optionally, the forward facing radar sensing system may include or may be associated with a forward facing camera or imaging sensor 20 (which may be disposed at or in the windshield electronics module or accessory module or overhead console, or at the interior rearview mirror assembly 22, or the like), which has a forward field of view in the forward direction of travel of the vehicle. The sensing system may function to perform a "sweep" of the area in front of the vehicle and, if an object or the like is detected (e.g., the radar sensing system detects a "blip"), the radar sensor device and system may home in on or focus on or further interrogate the region where the object is detected and may perform a more focused or enhanced interrogation of the area at which the object was detected to determine if the object is an object of interest. Optionally, for example, the system may control enhanced interrogation of a detected object by the radar sensor device (such as by a beam aiming or beam selection technique, such as digital beam forming in a phased array antenna system or digital beam steering). Such enhanced interrogation by the radar sensor device may be in response to the forward facing image sensor detecting an object in its forward field of view.
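The digital beam forming mentioned above can be pictured with a short numerical sketch of a uniform linear array whose complex element weights point the digitally formed beam toward the bearing of a detected object. The carrier frequency, element count and half-wavelength spacing below are illustrative assumptions for the example, not values specified by the patent.

```python
# Illustrative digital beam forming for a uniform linear array: choosing the
# complex element weights steers the formed beam toward a chosen bearing.
import numpy as np

C = 3.0e8                    # speed of light, m/s
FREQ = 77e9                  # assumed 77 GHz automotive radar carrier
LAMBDA = C / FREQ            # wavelength (~3.9 mm)
N_ELEMENTS = 8               # assumed array size
SPACING = LAMBDA / 2         # assumed half-wavelength element spacing


def steering_weights(steer_deg: float) -> np.ndarray:
    """Complex weights that point the digitally formed beam at steer_deg."""
    n = np.arange(N_ELEMENTS)
    phase = 2 * np.pi * SPACING * n * np.sin(np.radians(steer_deg)) / LAMBDA
    return np.exp(-1j * phase) / N_ELEMENTS


def array_response(arrival_deg: float) -> np.ndarray:
    """Per-element phases seen for a plane wave arriving from arrival_deg."""
    n = np.arange(N_ELEMENTS)
    phase = 2 * np.pi * SPACING * n * np.sin(np.radians(arrival_deg)) / LAMBDA
    return np.exp(1j * phase)


if __name__ == "__main__":
    # Steer toward a detection at +10 degrees; the beamformer gain there is
    # near 1.0, while an off-beam direction is strongly suppressed.
    w = steering_weights(10.0)
    on_beam = abs(np.dot(w, array_response(10.0)))
    off_beam = abs(np.dot(w, array_response(-30.0)))
    print(f"gain at +10 deg: {on_beam:.2f}, gain at -30 deg: {off_beam:.2f}")
```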

Optionally, and desirably, the forward facing camera may guide or initiate or control the more focused interrogation of the suspected object of interest (such as further or enhanced interrogation by the camera and imaging system) in response to the initial detection by the radar sensing system. For example, the radar sensing system may initially detect an object and the forward facing camera may be directed toward the detected object or otherwise controlled or processed to further interrogate the detected object (or area at which the object is detected) via the camera and image processing, or, alternately, the forward facing camera may initially detect an object and the system may select or aim a radar beam in a direction of a detected object. The enhanced interrogation of the object area by the forward facing camera may be accomplished via control of the camera's field of view or degree of zoom [for example, the camera may zoom into the area (via adjustment of a lens of the camera to enlarge an area of the field of view for enhanced processing) at which the object is detected] or via control of the image processing techniques. For example, the image processor may provide enhanced processing of the captured image data at the area or zone at which the object is detected, such as by enhanced or intensified algorithmic processing of a portion of the image plane of the image sensor that is spatially related to the location of the detected object in the forward field of view of the image sensor, such as by enhanced processing of pixel outputs of pixels within a zone or sub-array of a pixelated imaging array sensor, such as by utilizing aspects of the imaging systems described in U.S. Pat. Nos. 7,123,168; 7,038,577; 7,004,606; 6,690,268; 6,396,397; 5,550,677; 5,670,935; 5,796,094; 5,877,897 and 6,498,620, and/or U.S. patent application Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496; and/or Ser. No. 11/315,675, filed Dec. 22, 2005, now U.S. Pat. No. 7,720,580, which are all hereby incorporated herein by reference in their entireties.
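As a simple illustration of "enhanced processing of pixel outputs of pixels within a zone or sub-array", the sketch below maps a radar bearing to an image column using a pinhole camera model and extracts a strip of the pixelated imaging array for the heavier processing pass. The camera intrinsics, the window size and the assumption that the camera is boresighted with the radar are all illustrative; they are not parameters given in the patent.

```python
# Hypothetical mapping from a radar detection bearing to an image-plane
# sub-array on which enhanced/intensified processing would be run.
import numpy as np

IMG_W, IMG_H = 1280, 960       # assumed pixelated imaging array dimensions
FOCAL_PX = 1000.0              # assumed focal length, in pixels
CX = IMG_W / 2                 # assumed principal point (image center)


def bearing_to_column(azimuth_deg: float) -> int:
    """Column where a bearing projects, with the camera assumed boresighted
    with the radar along the vehicle's forward axis."""
    u = CX + FOCAL_PX * np.tan(np.radians(azimuth_deg))
    return int(np.clip(u, 0, IMG_W - 1))


def roi_sub_array(frame: np.ndarray, azimuth_deg: float,
                  half_width: int = 96) -> np.ndarray:
    """Vertical strip of pixels around the projected bearing; the heavier
    object-detection pass runs on this strip instead of the full frame."""
    col = bearing_to_column(azimuth_deg)
    left = max(col - half_width, 0)
    right = min(col + half_width, IMG_W)
    return frame[:, left:right]


if __name__ == "__main__":
    frame = np.zeros((IMG_H, IMG_W), dtype=np.uint8)   # stand-in captured frame
    strip = roi_sub_array(frame, azimuth_deg=5.0)
    print("sub-array shape for enhanced processing:", strip.shape)
```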

Thus, the sensing system of the present invention provides for cooperation or collaboration between the radar sensor device and the forward facing camera or image sensor in a way that benefits or enhances the sensing capabilities of the forward facing sensing system. The sensing system may thus operate with reduced processing until an object is initially detected, and then may provide further processing to determine if the object is an object of interest to the forward facing sensing system.
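One way to picture this reduced-until-triggered processing is as a two-stage pipeline in which a cheap pass runs on every frame and the heavier pass runs only when either sensor flags a candidate. The sketch below uses hypothetical placeholder functions and stand-in data, not any method specified by the patent.

```python
# Hypothetical two-stage processing: reduced processing until something is
# detected, then further processing to decide if it is an object of interest.

def coarse_pass(frame) -> bool:
    """Lightweight check for any candidate object (stand-in implementation)."""
    return frame.get("candidate", False)


def enhanced_pass(frame) -> str:
    """Heavier object-of-interest classification (stand-in implementation)."""
    return frame.get("label", "unknown")


def process_stream(frames, radar_hits):
    """Run the cheap pass always; escalate only when either sensor flags something."""
    results = []
    for frame, radar_hit in zip(frames, radar_hits):
        if radar_hit or coarse_pass(frame):
            results.append(enhanced_pass(frame))   # further, enhanced processing
        else:
            results.append(None)                   # nothing of interest this frame
    return results


if __name__ == "__main__":
    frames = [{"candidate": False}, {"candidate": True, "label": "pedestrian"}]
    print(process_stream(frames, radar_hits=[False, False]))
```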

Optionally, and desirably, the radar sensor device and forward facing camera may be commonly established on a semiconductor substrate, such as a substrate comprising a germanium substrate, a gallium arsenide substrate or a silicon germanium substrate or the like. The substrate may include or may incorporate at least some of the control circuitry for the radar sensor device and camera and/or may include or incorporate common circuitry for the radar sensor device and camera.

Because the radar sensor device and camera may be disposed on a common substrate and/or may be disposed within a windshield electronics module or the like, the forward facing sensing system may be removably installed at the vehicle and may be removed therefrom, such as for service or replacement. Thus, the sensing system (including the radar sensor device and camera) may comprise a self-contained unit or system that is disposed at the upper region of the windshield. Optionally, the radar sensor device and/or camera may be disposed within a windshield electronics module or the like, such as by utilizing aspects of the modules described in U.S. patent application Ser. No. 10/958,087, filed Oct. 4, 2004, now U.S. Pat. No. 7,188,963; and/or Ser. No. 11/201,661, filed Aug. 11, 2005, now U.S. Pat. No. 7,480,149, and/or U.S. Pat. Nos. 7,004,593; 6,824,281; 6,690,268; 6,250,148; 6,341,523; 6,593,565; 6,428,172; 6,501,387; 6,329,925 and 6,326,613, and/or in PCT Application No. PCT/US03/40611, filed Dec. 19, 2003 and published Jul. 15, 2004 as International Publication No. WO 2004/058540, and/or Ireland pat. applications, Ser. No. S2004/0614, filed Sep. 15, 2004; Ser. No. S2004/0838, filed Dec. 14, 2004; and Ser. No. S2004/0840, filed Dec. 15, 2004, which are all hereby incorporated herein by reference in their entireties.

Optionally, the mirror assembly and/or windshield electronics module may include or incorporate a display, such as a static display, such as a static video display screen (such as a display utilizing aspects of the displays described in U.S. Pat. Nos. 5,530,240 and/or 6,329,925, which are hereby incorporated herein by reference in their entireties, or a display-on-demand or transflective type display or other display utilizing aspects of the displays described in U.S. Pat. Nos. 6,690,268; 5,668,663 and/or 5,724,187, and/or U.S. patent application Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; Ser. No. 10/528,269, filed Mar. 17, 2005, now U.S. Pat. No. 7,274,501; Ser. No. 10/533,762, filed May 4, 2005, now U.S. Pat. No. 7,184,190; Ser. No. 10/538,724, filed Jun. 13, 2005 and published on Mar. 9, 2006 as U.S. Publication No. US 2006/0050018; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US 2006/0061008; Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, and/or PCT Patent Application No. PCT/US2006/018567, filed May 15, 2006 and published Nov. 23, 2006 as International Publication No. WO 2006/124682; and/or PCT Application No. PCT/US2006/042718, filed Oct. 31, 2006, published May 10, 2007 as International Publication No. WO 07/053710; and U.S. provisional applications, Ser. No. 60/836,219, filed Aug. 8, 2006; Ser. No. 60/759,992, filed Jan. 18, 2006; and Ser. No. 60/732,245, filed Nov. 1, 2005, and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003, and published Jul. 15, 2004 as International Publication No. WO 2004/058540, which are all hereby incorporated herein by reference in their entireties). Alternately, the display screen may comprise a display (such as a backlit LCD video display) that is movable to extend from the mirror casing when activated, such as a slide-out display of the types described in U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published on Mar. 9, 2006 as U.S. Publication No. US 2006/0050018; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, and/or PCT Patent Application No. PCT/US2006/018567, filed May 15, 2006 and published Nov. 23, 2006 as International Publication No. WO 2006/124682; and/or PCT Application No. PCT/US2006/042718, filed Oct. 31, 2006, and published May 10, 2007 as International Publication No. WO 07/053710; and U.S. provisional applications, Ser. No. 60/836,219, filed Aug. 8, 2006; Ser. No. 60/759,992, filed Jan. 18, 2006; and Ser. No. 60/732,245, filed Nov. 1, 2005, which are all hereby incorporated herein by reference in their entireties. Optionally, and preferably, the display is episodically extended and/or actuated, such as to display driving instructions to the driver as the vehicle approaches a waypoint or turn along the selected route, and then retracted after the vehicle has passed the waypoint and continues along the selected route.

Optionally, the display on the video screen may be operable to display an alert to the driver of a potential hazardous condition detected ahead of or in the forward path of the vehicle. For example, an output of a forward-viewing active night vision system incorporating an imaging sensor or camera device and near-IR floodlighting (such as those described in U.S. Pat. No. 5,877,897 and U.S. patent application Ser. No. 11/651,726, filed Jan. 10, 2007, now U.S. Pat. No. 7,311,406, which are hereby incorporated herein by reference in their entireties), or an output of another suitable forward facing sensor or system such as a passive far-IR thermal imaging night vision sensor/camera, may be processed by an image processor, such as, for example, an EyeQ™ image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel. Such image processors include object detection software (such as the types described in U.S. Pat. No. 7,038,577; and/or Ser. No. 11/315,675, filed Dec. 22, 2005, now U.S. Pat. No. 7,720,580, which are hereby incorporated herein by reference in their entireties), and they analyze image data to detect objects. The image processor or control may determine whether a potentially hazardous condition (such as an object or vehicle or person or deer or the like) may exist in the vehicle path and may provide an alert signal (such as by actuation of a visual indicator or an audible indicator or by an enhancement/overlay on a video display screen that is showing a video image to the driver of what the night vision sensor/camera is seeing) to prompt/alert the driver of a potential hazard (such as a deer or a pedestrian or a fallen rock or the like) as needed or appropriate. The display thus may provide an episodic alert so that the driver's attention is drawn to the display alert only when a potential hazard is detected. Such a system spares the driver from having to divide attention between looking forward out the windshield and watching a monitor that continuously runs video of the camera's output, which is not particularly consumer-friendly and simply loads the driver with yet another task.
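A minimal sketch of such episodic alerting, assuming hypothetical display and chime interfaces, is shown below: the alert is raised only while a potential hazard is detected in the forward path and is cleared once the hazard passes.

```python
# Hypothetical episodic alert: drawn to the driver's attention only while a
# potentially hazardous object is detected in the forward path of the vehicle.
from dataclasses import dataclass


@dataclass
class DetectedObject:
    kind: str            # e.g. "pedestrian", "deer", "vehicle", "rock"
    in_forward_path: bool


class EpisodicAlert:
    HAZARD_KINDS = {"pedestrian", "deer", "vehicle", "rock"}

    def __init__(self, display, chime):
        self.display = display   # assumed to expose show_overlay()/clear_overlay()
        self.chime = chime       # assumed to expose sound()
        self.active = False

    def update(self, detections) -> None:
        hazard = any(d.in_forward_path and d.kind in self.HAZARD_KINDS
                     for d in detections)
        if hazard and not self.active:
            self.display.show_overlay()   # visual alert/overlay on the video display
            self.chime.sound()            # audible alert
            self.active = True
        elif not hazard and self.active:
            self.display.clear_overlay()  # retract the alert once the hazard passes
            self.active = False


if __name__ == "__main__":
    class _Stub:
        def show_overlay(self): print("ALERT overlay shown")
        def clear_overlay(self): print("alert cleared")
        def sound(self): print("chime")

    alert = EpisodicAlert(display=_Stub(), chime=_Stub())
    alert.update([DetectedObject("pedestrian", in_forward_path=True)])
    alert.update([])
```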

Optionally, the mirror reflective element of the mirror assembly may comprise a prismatic mirror reflector or an electrically variable reflectance mirror reflector, such as an electro-optic reflective element assembly or cell, such as an electrochromic reflective element assembly or cell. For example, the rearview mirror assembly may comprise an electro-optic or electrochromic reflective element or cell, such as an electrochromic mirror assembly and electrochromic reflective element utilizing principles disclosed in commonly assigned U.S. Pat. Nos. 6,690,268; 5,140,455; 5,151,816; 6,178,034; 6,154,306; 6,002,544; 5,567,360; 5,525,264; 5,610,756; 5,406,414; 5,253,109; 5,076,673; 5,073,012; 5,117,346; 5,724,187; 5,668,663; 5,910,854; 5,142,407; 4,824,221; 5,818,636; 6,166,847; 6,111,685; 6,392,783; 6,710,906; 6,798,556; 6,554,843; 6,420,036; 5,142,406; 5,442,478 and/or 4,712,879, and/or U.S. patent application Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; Ser. No. 10/528,269, filed Mar. 17, 2005, now U.S. Pat. No. 7,274,501; Ser. No. 10/533,762, filed May 4, 2005, now U.S. Pat. No. 7,184,190; Ser. No. 10/538,724, filed Jun. 13, 2005, and published on Mar. 9, 2006 as U.S. Publication No. US 2006/0050018; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US 2006/0061008; Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, and/or International Pat. Publication Nos. WO 2004/098953, published Nov. 18, 2004; WO 2004/042457, published May 21, 2004; WO 2003/084780, published Oct. 16, 2003; and/or WO 2004/026633, published Apr. 1, 2004, which are all hereby incorporated herein by reference in their entireties, and/or such as disclosed in the following publications: N. R. Lynam, "Electrochromic Automotive Day/Night Mirrors", SAE Technical Paper Series 870636 (1987); N. R. Lynam, "Smart Windows for Automobiles", SAE Technical Paper Series 900419 (1990); N. R. Lynam and A. Agrawal, "Automotive Applications of Chromogenic Materials", Large Area Chromogenics: Materials and Devices for Transmittance Control, C. M. Lampert and C. G. Granquist, EDS., Optical Engineering Press, Wash. (1990), which are hereby incorporated herein by reference in their entireties.

Optionally, and preferably, the mirror reflective element may comprise a frameless reflective element, such as by utilizing aspects of the reflective elements described in PCT Application No. PCT/US2006/018567, filed May 15, 2006 and published Nov. 23, 2006 as International Publication No. WO 2006/124682; PCT Application No. PCT/US2004/015424, filed May 18, 2004 and published on Dec. 2, 2004, as International Publication No. WO 2004/103772; and/or U.S. patent application Ser. No. 11/140,396, filed May 27, 2005, now U.S. Pat. No. 7,360,932; Ser. No. 11/226,628, filed Sep. 14, 2005, and published Mar. 23, 2006 as U.S. Publication No. US 2006/0061008; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; Ser. No. 10/528,269, filed Mar. 17, 2005, now U.S. Pat. No. 7,274,501; Ser. No. 10/533,762, filed May 4, 2005, now U.S. Pat. No. 7,184,190; and/or Ser. No. 10/538,724, filed Jun. 13, 2005, and published on Mar. 9, 2006 as U.S. Publication No. US 2006/0050018, which are hereby incorporated herein by reference in their entireties. Optionally, the reflective element may include a metallic perimeter band around the perimeter of the reflective element, such as by utilizing aspects of the reflective elements described in PCT Application No. PCT/US2006/018567, filed May 15, 2006 and published Nov. 23, 2006 as International Publication No. WO 2006/124682; PCT Application No. PCT/US03/29776, filed Sep. 19, 2003 and published Apr. 1, 2004 as International Publication No. WO 2004/026633; and/or PCT Application No. PCT/US03/35381, filed Nov. 5, 2003 and published May 21, 2004 as International Publication No. WO 2004/042457; and/or U.S. patent application Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; and/or Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US 2006/0061008, which is hereby incorporated herein by reference in their entireties. The frameless reflective element thus is aesthetically pleasing to a person viewing the mirror assembly, since the reflective element (as recessed or partially recessed in the opening of the bezel portion of the mirror casing) does not include a separate frame or bezel portion around its perimeter edge. The metallic perimeter band may be selected to have a desired color or tint to match or contrast a color scheme or the like of the vehicle, such as described in PCT Application No. PCT/US2006/018567, filed May 15, 2006 and published Nov. 23, 2006 as International Publication No. WO 2006/124682; and/or PCT Application No. PCT/US2004/015424, filed May 18, 2004 and published Dec. 2, 2004 as International Publication No. WO 2004/103772, which are hereby incorporated herein by reference in their entireties.

Optionally, use of an elemental semiconductor mirror, such as a silicon metal mirror, such as disclosed in U.S. Pat. Nos. 6,286,965; 6,196,688; 5,535,056; 5,751,489 and 6,065,840, and/or in U.S. patent application Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177, which are all hereby incorporated herein by reference in their entireties, can be advantageous because such elemental semiconductor mirrors (such as can be formed by depositing a thin film of silicon) can be greater than 50 percent reflective in the photopic region (measured per SAE J964a), while also being substantially transmissive of light (up to 20 percent or even more). Such silicon mirrors also have the advantage that they can be deposited onto a flat glass substrate and then bent to a curved (such as a convex or aspheric) shape, which is advantageous since many passenger-side exterior rearview mirrors are bent or curved.

Optionally, the mirror assembly may comprise a prismatic mirror assembly, such as a prismatic mirror assembly utilizing aspects described in U.S. Pat. Nos. 6,318,870; 6,598,980; 5,327,288; 4,948,242; 4,826,289; 4,436,371 and 4,435,042, and PCT Application No. PCT/US04/015424, filed May 18, 2004 and published Dec. 2, 2004 as International Publication No. WO 2004/103772; and U.S. patent application Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860, which are hereby incorporated herein by reference in their entireties. Optionally, the prismatic reflective element may comprise a conventional prismatic reflective element or prism, or may comprise a prismatic reflective element of the types described in PCT Application No. PCT/US03/29776, filed Sep. 19, 2003 and published Apr. 1, 2004 as International Publication No. WO 2004/026633; and/or U.S. patent application Ser. No. 10/709,434, filed May 5, 2004, now U.S. Pat. No. 7,420,756; Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; Ser. No. 10/528,269, filed Mar. 17, 2005, now U.S. Pat. No. 7,274,501; and/or Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177, and/or PCT Application No. PCT/US2004/015424, filed May 18, 2004 and published Dec. 2, 2004 as International Publication No. WO 2004/103772, which are all hereby incorporated herein by reference in their entireties, without affecting the scope of the present invention.

Optionally, the reflective element may comprise a bent, wide-angle mirror reflector rather than a flat mirror reflector. If a bent, wide-angle mirror reflector is used, it is preferable that the mirror reflector comprise a glass substrate coated with a bendable reflector coating (such as of silicon, as described in U.S. Pat. Nos. 6,065,840; 5,959,792; 5,535,056 and 5,751,489, which are hereby incorporated by reference herein in their entireties).

Optionally, the mirror casing and/or windshield electronics module may be suitable for supporting larger or heavier components or circuitry that otherwise may not have been suitable for mounting or locating at or in a mirror casing. For example, the mirror casing or module may house or support a battery or power pack for various electronic features or components, and/or may support a docking station for docking and/or holding a cellular telephone or hand-held personal data device or the like, such as by utilizing aspects of the systems described in U.S. Pat. No. 6,824,281, and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003 and published Jul. 15, 2004 as International Publication No. WO 2004/058540, and/or U.S. patent application Ser. No. 10/510,813, filed Aug. 23, 2002, now U.S. Pat. No. 7,306,276, and/or U.S. patent application Ser. No. 11/842,328, filed Aug. 21, 2007, now U.S. Pat. No. 7,722,199, and Ser. No. 11/861,904, filed Sep. 26, 2007, now U.S. Pat. No. 7,937,667, and/or U.S. provisional applications, Ser. No. 60/839,446, filed Aug. 23, 2006; Ser. No. 60/879,619, filed Jan. 10, 2007; Ser. No. 60/850,700, filed Oct. 10, 2006; and/or Ser. No. 60/847,502, filed Sep. 27, 2006, which are hereby incorporated herein by reference in their entireties.

Optionally, the mirror assembly and/or windshield electronics module may include or incorporate a navigation device that may include navigational circuitry and a GPS antenna to determine the geographical location of the vehicle and to provide routes to targeted or selected destinations, such as by utilizing aspects of known navigational devices and/or the devices of the types described in U.S. Pat. Nos. 4,862,594; 4,937,945; 5,131,154; 5,255,442; 5,632,092; 5,798,688; 5,971,552; 5,924,212; 6,243,003; 6,278,377; 6,420,975; 6,946,978; 6,477,464; 6,678,614 and/or 7,004,593, and/or U.S. patent application Ser. No. 10/645,762, filed Aug. 20, 2003, now U.S. Pat. No. 7,167,796; Ser. No. 10/529,715, filed Mar. 30, 2005, now U.S. Pat. No. 7,657,052; Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. 2006/0050018; Ser. No. 11/861,904, filed Sep. 26, 2007, now U.S. Pat. No. 7,937,667; and/or Ser. No. 10/964,512, filed Oct. 13, 2004, now U.S. Pat. No. 7,308,341, and/or U.S. provisional applications, Ser. No. 60/879,619, filed Jan. 10, 2007; Ser. No. 60/850,700, filed Oct. 10, 2006; and/or Ser. No. 60/847,502, filed Sep. 27, 2006, which are all hereby incorporated herein by reference in their entireties. Optionally, the mirror or navigation device may include a microphone to provide voice activated control of the navigation device.
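
As one non-limiting illustration of the kind of computation such navigational circuitry may perform, the standard haversine relations give the great-circle distance and initial bearing from the vehicle's current GPS position to a selected destination (the function name and the use of a mean Earth radius are illustrative choices, not details of the referenced navigation devices):

    import math

    EARTH_RADIUS_M = 6371000.0  # mean Earth radius

    def distance_and_bearing(lat1_deg, lon1_deg, lat2_deg, lon2_deg):
        """Great-circle distance (m) and initial bearing (deg) from point 1 to point 2."""
        lat1, lon1, lat2, lon2 = map(math.radians, (lat1_deg, lon1_deg, lat2_deg, lon2_deg))
        dlat, dlon = lat2 - lat1, lon2 - lon1
        # Haversine formula for the central angle between the two points.
        a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
        dist = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
        # Initial bearing, normalized to 0..360 degrees.
        y = math.sin(dlon) * math.cos(lat2)
        x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
        bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
        return dist, bearing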

Optionally, for example, the mounting structure and/or mirror casing and/or windshield electronics module may support compass sensors, such as compass sensors that utilize aspects of the compass systems described in U.S. patent application Ser. No. 11/305,637, filed Dec. 16, 2005, now U.S. Pat. No. 7,329,013; Ser. No. 10/352,691, filed Jan. 28, 2003, now U.S. Pat. No. 6,922,902; Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983; Ser. No. 11/226,628, filed Sep. 14, 2005, and published on Mar. 23, 2006 as U.S. Publication No. 2006/0061008; and/or Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860; and/or U.S. Pat. Nos. 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and 6,642,851, and/or PCT Application No. PCT/US2004/015424, filed May 18, 2004 and published Dec. 2, 2004 as International Publication No. WO 2004/103772, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, which are all hereby incorporated herein by reference in their entireties. The compass circuitry may include the compass sensor, such as a magneto-responsive sensor, such as a magneto-resistive sensor, such as the types disclosed in U.S. Pat. Nos. 5,255,442; 5,632,092; 5,802,727; 6,173,501; 6,427,349 and 6,513,252 (which are hereby incorporated herein by reference in their entireties), a magneto-capacitive sensor, a Hall-effect sensor, such as the types described in U.S. Pat. Nos. 6,278,271; 5,942,895 and 6,184,679 (which are hereby incorporated herein by reference in their entireties), a magneto-inductive sensor, such as described in U.S. Pat. No. 5,878,370 (which is hereby incorporated herein by reference in its entirety), a magneto-impedance sensor, such as the types described in PCT Publication No. WO 2004/076971, published Sep. 10, 2004 (which is hereby incorporated herein by reference in its entirety), or a flux-gate sensor or the like, and/or may comprise a compass chip, such as described in U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005, and published on Mar. 23, 2006 as U.S. Publication No. 2006/0061008; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, which are hereby incorporated herein by reference in their entireties. By positioning the compass sensors at a fixed location, further processing and calibration of the sensors to accommodate adjustment or movement of the sensors is not necessary.
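
Purely as an illustrative sketch, a fixed-mounted two-axis magneto-responsive sensor may derive a heading roughly as follows (the axis convention, the single installation offset and the eight-point display labels are assumptions for illustration, not details taken from the referenced compass systems):

    import math

    def compass_heading_deg(mag_x, mag_y, installation_offset_deg=0.0):
        """Heading (0-360 deg, 0 = magnetic north) from two horizontal magnetometer axes.

        Because the sensor is fixed relative to the vehicle, a single installation
        offset (measured once at assembly) can replace ongoing compensation for
        mirror-head adjustment or movement.
        """
        heading = math.degrees(math.atan2(mag_y, mag_x))
        return (heading + installation_offset_deg) % 360.0

    def heading_to_cardinal(heading_deg):
        """Map a heading to the eight-point label typically shown on a compass display."""
        labels = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
        return labels[int((heading_deg + 22.5) // 45) % 8]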

Optionally, the mounting structure and/or mirror casing and/or windshield electronics module may support one or more imaging sensors or cameras, and may fixedly support them so that the cameras are set with a desired or appropriate forward and/or rearward field of view. For example, the camera may be operable in conjunction with a forward facing imaging system, such as a rain sensing system, such as described in U.S. Pat. Nos. 6,968,736; 6,806,452; 6,516,664; 6,353,392; 6,313,454; 6,250,148; 6,341,523 and 6,824,281, and in U.S. patent application Ser. No. 10/958,087, filed Oct. 4, 2004, now U.S. Pat. No. 7,188,963; and/or Ser. No. 11/201,661, filed Aug. 11, 2005, now U.S. Pat. No. 7,480,149, which are all hereby incorporated herein by reference in their entireties. The mounting structure and/or mirror casing may be pressed or loaded against the interior surface of the windshield to position or locate the image sensor in close proximity to the windshield and/or to optically couple the image sensor at the windshield. The mounting structure and/or mirror casing may include an aperture or apertures at its forward facing or mounting surface, and the windshield may include apertures through the opaque frit layer (typically disposed at a mirror mounting location of a windshield), or the windshield may not include such a frit layer, depending on the particular application.

Optionally, the image sensor may be operable in conjunction with a forward or rearward vision system, such as an automatic headlamp control system and/or a lane departure warning system or object detection system and/or other forward vision or imaging systems, such as imaging or vision systems of the types described in U.S. Pat. Nos. 7,038,577; 7,005,974; 7,004,606; 6,690,268; 6,946,978; 6,757,109; 6,717,610; 6,396,397; 6,201,642; 6,353,392; 6,313,454; 5,550,677; 5,670,935; 5,796,094; 5,715,093; 5,877,897; 6,097,023 and 6,498,620, and/or U.S. patent application Ser. No. 09/441,341, filed Nov. 16, 1999, now U.S. Pat. No. 7,339,149; Ser. No. 10/422,512, filed Apr. 24, 2003, now U.S. Pat. No. 7,123,168; Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496; Ser. No. 11/672,070, filed Feb. 7, 2007, now U.S. Pat. No. 8,698,894; and/or Ser. No. 11/315,675, filed Dec. 22, 2005, now U.S. Pat. No. 7,720,580, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/731,183, filed Oct. 28, 2005; and/or Ser. No. 60/765,797, filed Feb. 7, 2006, and/or International PCT Application No. PCT/US2006/041709, filed Oct. 27, 2006, and published May 10, 2007 as International Publication No. WO 07/053404, which are hereby incorporated herein by reference in their entireties. The mirror casing thus may support one or more rearward facing imaging sensors or cameras, such as for rearward vision or imaging systems, such as for a rear vision system or back up aid of the types described in U.S. Pat. Nos. 6,717,610 and/or 6,201,642 (which are hereby incorporated herein by reference in their entireties), and/or a cabin monitoring system or baby view system of the types described in U.S. Pat. No. 6,690,268 (which is hereby incorporated herein by reference in its entirety), and/or the like.
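
As a non-limiting illustration, the decision layer of a lane departure warning function may be sketched as follows (the lane-marking detector that supplies the lateral offsets, and the speed and offset thresholds, are assumed placeholders rather than details of the referenced vision systems):

    # Sketch of lane-departure warning logic layered on a lane-marking detector.

    LATERAL_WARN_THRESHOLD_M = 0.3   # warn when this close to a lane marking
    MIN_SPEED_MPS = 16.7             # suppress warnings below roughly 60 km/h

    def lane_departure_warning(left_offset_m, right_offset_m, speed_mps, turn_signal_on):
        """Return 'left', 'right', or None depending on which marking is being approached."""
        if turn_signal_on or speed_mps < MIN_SPEED_MPS:
            return None                      # intentional lane change or low-speed maneuvering
        if left_offset_m is not None and left_offset_m < LATERAL_WARN_THRESHOLD_M:
            return "left"
        if right_offset_m is not None and right_offset_m < LATERAL_WARN_THRESHOLD_M:
            return "right"
        return None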

Optionally, the fixed mounting structure and/or mirror casing and/or windshield electronics module may house or support a display device, such as a heads up display device (such as the types described in U.S. patent application Ser. No. 11/105,757, filed Apr. 14, 2005, now U.S. Pat. No. 7,526,103; and Ser. No. 11/029,695, filed Jan. 5, 2005, now U.S. Pat. No. 7,253,723, which are hereby incorporated herein by reference in their entireties) that is operable to project a display at the area in front of the driver to enhance viewing of the display information without adversely affecting the driver's forward field of view. For example, the mirror casing may support a heads up display (HUD), such as a MicroHUD.TM. head-up display system available from MicroVision Inc. of Bothell, Wash., and/or such as a HUD that utilizes aspects described in U.S. patent application Ser. No. 11/105,757, filed Apr. 14, 2005, now U.S. Pat. No. 7,526,103; and Ser. No. 11/029,695, filed Jan. 5, 2005, now U.S. Pat. No. 7,253,723, which are hereby incorporated herein by reference in their entireties. For example, MicroVision's MicroHUD.TM. combines a MEMS-based micro display with an optical package of lenses and mirrors to achieve a compact high-performance HUD module that reflects a virtual image off the windscreen that appears to the driver to be close to the front of the car. This laser-scanning display can outperform many miniature flat panel LCD display screens because it can be clearly viewed in the brightest conditions and also dimmed to the very low brightness levels required for safe night-time driving. For example, such a display device may be located at or in the mirror casing/mounting structure/windshield electronics module and may be non-movably mounted at the mirror casing or mounting structure or windshield electronics module, and may be operable to project the display information at the windshield of the vehicle so as to be readily viewed by the driver of the vehicle in the driver's forward field of view.
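
By way of illustration only, the dimming behavior described above may be sketched as a mapping from ambient light to display luminance (the logarithmic mapping and the numeric limits are illustrative assumptions, not values from the referenced HUD products):

    import math

    def hud_luminance_cd_m2(ambient_lux, day_max=12000.0, night_min=1.5):
        """Scale HUD luminance with ambient light: bright in sunlight, very dim at night."""
        ambient_lux = max(ambient_lux, 0.1)               # avoid log of zero
        # Map roughly 0.1 lux (dark night) .. 100,000 lux (direct sun) onto night_min .. day_max.
        fraction = (math.log10(ambient_lux) + 1.0) / 6.0
        fraction = min(max(fraction, 0.0), 1.0)
        return night_min + fraction * (day_max - night_min)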

The mounting structure and/or mirror casing and/or windshield electronics module may be fixedly attached to or supported at the vehicle windshield and may extend upward toward the headliner of the vehicle. Thus, the mirror assembly of the present invention may have enhanced wire management and may substantially conceal the wiring of the electronic components/accessories between the circuitry within the mirror casing and the headliner at the upper portion of the vehicle windshield. Optionally, the mirror assembly may include wire management elements, such as the types described in U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005, and published Mar. 23, 2006 as U.S. Publication No. 2006/0061008; and/or Ser. No. 11/584,697, filed Oct. 20, 2006, now U.S. Pat. No. 7,510,287; and/or U.S. provisional application Ser. No. 60/729,430, filed Oct. 21, 2005, which are hereby incorporated herein by reference in their entireties, to conceal the wires extending between an upper portion of the mirror casing and the vehicle headliner (or overhead console). Optionally, the mirror casing and/or mounting structure and/or windshield electronics module may abut the headliner and/or may be an extension of an overhead console of the vehicle (such as by utilizing aspects described in U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. 2006/0050018, and/or U.S. patent application Ser. No. 10/510,813, filed Aug. 23, 2002, now U.S. Pat. No. 7,306,276, which are hereby incorporated herein by reference in their entireties). The mirror assembly of the present invention thus may allow for utilization of the area above the mirror reflective element for additional mirror content, such as additional electronic accessories or circuitry, and thus may provide for or accommodate additional mirror and/or vehicle content/circuitry.

Optionally, the mirror assembly and/or reflective element assembly may include one or more displays, such as for the accessories or circuitry described herein. The displays may comprise any suitable display, such as displays of the types described in U.S. Pat. Nos. 5,530,240 and/or 6,329,925, which are hereby incorporated herein by reference in their entireties, or may be display-on-demand or transflective type displays or other displays, such as the types described in U.S. Pat. Nos. 6,690,268; 5,668,663 and/or 5,724,187, and/or U.S. patent application Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; Ser. No. 10/528,269, filed Mar. 17, 2005, now U.S. Pat. No. 7,274,501; Ser. No. 10/533,762, filed May 4, 2005, now U.S. Pat. No. 7,184,190; Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. 2006/0050018; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. 2006/0061008; Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, and/or PCT Patent Application No. PCT/US2006/018567, filed May 15, 2006 and published Nov. 23, 2006 as International Publication No. WO 2006/124682; and/or PCT Application No. PCT/US2006/042718, filed Oct. 31, 2006, and published May 10, 2007 as International Publication No. WO 07/053710; and/or U.S. provisional applications, Ser. No. 60/836,219, filed Aug. 8, 2006; Ser. No. 60/759,992, filed Jan. 18, 2006; and Ser. No. 60/732,245, filed Nov. 1, 2005, and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003 and published Jul. 15, 2004 as International Publication No. WO 2004/058540, which are all hereby incorporated herein by reference in their entireties, or may include or incorporate video displays or the like, such as the types described in U.S. Pat. No. 6,690,268 and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003 and published Jul. 15, 2004 as International Publication No. WO 2004/058540, U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005, and published Mar. 9, 2006 as U.S. Publication No. 2006/0050018; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, which are hereby incorporated herein by reference in their entireties. Optionally, the mirror assembly may include a video display that is selectively positionable, such as extendable/retractable or pivotable or foldable so as to be selectively positioned at a side or below the mirror casing when in use and storable within or at least partially within the mirror casing when not in use. The display may automatically extend/pivot to the in-use position in response to an actuating event, such as when the vehicle is shifted into its reverse gear for a rear vision system or back up aid.
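
As a non-limiting illustration, the extend-on-reverse behavior of such a stowable video display may be sketched as follows (the actuator callables and the gear encoding are hypothetical placeholders, not details of any particular display mechanism):

    # Sketch of the "extend on reverse, stow afterwards" behavior of a stowable
    # video mirror display.

    class StowableDisplay:
        def __init__(self, extend_actuator, retract_actuator):
            self.extend_actuator = extend_actuator
            self.retract_actuator = retract_actuator
            self.extended = False

        def on_gear_change(self, gear):
            """Extend the display when reverse is selected; stow it when reverse is left."""
            if gear == "R" and not self.extended:
                self.extend_actuator()
                self.extended = True
            elif gear != "R" and self.extended:
                self.retract_actuator()
                self.extended = False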

Such a video mirror display (or other display) may be associated with a rearward facing camera at a rear of the vehicle and having a rearward field of view, such as at the license plate holder of the vehicle or at a rear trim portion (such as described in U.S. patent application Ser. No. 11/672,070, filed Feb. 7, 2007, now U.S. Pat. No. 8,698,894, and U.S. provisional application Ser. No. 60/765,797, filed Feb. 7, 2006, which are hereby incorporated herein by reference in their entireties). The image data captured by the rearward facing camera may be communicated to the control or video display at the rearview mirror assembly (or elsewhere in the vehicle, such as at an overhead console or accessory module or the like) via any suitable communication means or protocol. For example, the image data may be communicated via a fiber optic cable or a twisted pair of wires, or may be communicated wirelessly, such as via a BLUETOOTH.RTM. communication link or protocol or the like, or may be superimposed on a power line, such as a 12 volt power line of the vehicle, such as by utilizing aspects of the systems described in U.S. patent application Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, which is hereby incorporated herein by reference in its entirety.
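
Purely as an illustrative sketch of keeping the camera and display independent of the particular physical link, a simple transport abstraction might look as follows (the transport classes and callables shown are placeholders for whatever wired, wireless or power-line link the vehicle actually provides):

    from abc import ABC, abstractmethod

    class FrameTransport(ABC):
        @abstractmethod
        def send(self, frame_bytes: bytes) -> None:
            """Deliver one encoded video frame to the display side."""

    class LoopbackTransport(FrameTransport):
        """Stand-in transport for bench testing: hands frames straight to a callback."""
        def __init__(self, on_frame):
            self.on_frame = on_frame

        def send(self, frame_bytes: bytes) -> None:
            self.on_frame(frame_bytes)

    def stream_reverse_camera(capture_frame, transport: FrameTransport, in_reverse):
        """Push frames to the mirror display only while the vehicle remains in reverse."""
        while in_reverse():
            transport.send(capture_frame())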

Optionally, the mirror assembly may include one or more user inputs for controlling or activating/deactivating one or more electrical accessories or devices of or associated with the mirror assembly. For example, the mirror assembly may comprise any type of switches or buttons, such as touch or proximity sensing switches, such as touch or proximity switches of the types described in PCT Application No. PCT/US03/40611, filed Dec. 19, 2003 and published Jul. 15, 2004 as International Publication No. WO 2004/058540; and/or PCT Application No. PCT/US2004/015424, filed May 18, 2004 and published Dec. 2, 2004 as International Publication No. WO 2004/103772, and/or U.S. Pat. Nos. 6,001,486; 6,310,611; 6,320,282 and 6,627,918, and/or U.S. patent application Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; and/or U.S. patent application Ser. No. 09/817,874, filed Mar. 26, 2001, now U.S. Pat. No. 7,224,324; Ser. No. 10/956,749, filed Oct. 1, 2004, now U.S. Pat. No. 7,446,924; Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; and/or Ser. No. 11/140,396, filed May 27, 2005, now U.S. Pat. No. 7,360,932, which are hereby incorporated herein by reference in their entireties, or the inputs may comprise other types of buttons or switches, such as those described in U.S. Pat. No. 6,501,387, and/or U.S. patent application Ser. No. 11/029,695, filed Jan. 5, 2005, now U.S. Pat. No. 7,253,723; and/or Ser. No. 11/451,639, filed Jun. 13, 2006, now U.S. Pat. No. 7,527,403, which are hereby incorporated herein by reference in their entireties, or such as fabric-made position detectors, such as those described in U.S. Pat. Nos. 6,504,531; 6,501,465; 6,492,980; 6,452,479; 6,437,258 and 6,369,804, which are hereby incorporated herein by reference in their entireties. Other types of switches or buttons or inputs or sensors may be incorporated to provide the desired function, without affecting the scope of the present invention. The manual inputs or user actuatable inputs or actuators may control or adjust or activate/deactivate one or more accessories or elements or features. For touch sensitive inputs or applications or switches, the mirror assembly or accessory module or input may, when activated, provide a positive feedback (such as activation of an illumination source or the like, or such as via an audible signal, such as a chime or the like, or a tactile or haptic signal, or a rumble device or signal or the like) to the user so that the user is made aware that the input was successfully activated.
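
By way of illustration only, the positive-feedback behavior of such a touch or proximity input may be sketched as follows (the accessory toggle, chime and backlight callables are hypothetical placeholders, not details of the referenced switch designs):

    def handle_touch(accessory_toggle, chime, backlight_pulse=None):
        """Toggle the bound accessory and confirm the touch so the user knows it registered."""
        new_state = accessory_toggle()   # returns the accessory's new on/off state
        chime()                          # audible confirmation of a successful press
        if backlight_pulse is not None:
            backlight_pulse()            # optional visual confirmation
        return new_state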

Optionally, the user inputs or buttons may comprise user inputs for a garage door opening system, such as a vehicle based garage door opening system of the types described in U.S. Pat. Nos. 7,023,322; 6,396,408; 6,362,771 and 5,798,688, which are hereby incorporated herein by reference in their entireties. The user inputs may also or otherwise function to activate and deactivate a display or function or accessory, and/or may activate/deactivate and/or commence a calibration of a compass system of the mirror assembly and/or vehicle. Optionally, the user inputs may also or otherwise comprise user inputs for a telematics system of the vehicle, such as, for example, an ONSTAR.RTM. system as found in General Motors vehicles and/or such as described in U.S. Pat. Nos. 4,862,594; 4,937,945; 5,131,154; 5,255,442; 5,632,092; 5,798,688; 5,971,552; 5,924,212; 6,243,003; 6,278,377; 6,420,975; 6,946,978; 6,477,464; 6,678,614 and/or 7,004,593, and/or U.S. patent application Ser. No. 10/645,762, filed Aug. 20, 2003, now U.S. Pat. No. 7,167,796; Ser. No. 10/529,715, filed Mar. 30, 2005, now U.S. Pat. No. 7,657,052; Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. 2006/0050018; and/or Ser. No. 10/964,512, filed Oct. 13, 2004, now U.S. Pat. No. 7,308,341, which are all hereby incorporated herein by reference in their entireties.

Optionally, the display and inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 6,877,888; 6,690,268; 6,824,281; 6,672,744; 6,386,742 and 6,124,886, and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003 and published Jul. 15, 2004 as International Publication No. WO 2004/058540, and/or PCT Application No. PCT/US04/15424, filed May 18, 2004 and published Dec. 2, 2004 as International Publication No. WO 2004/103772, and/or U.S. patent application Ser. No. 10/510,813, filed Aug. 23, 2002, now U.S. Pat. No. 7,306,276, which are hereby incorporated herein by reference in their entireties.

Optionally, the mirror assembly or accessory module may fixedly or non-movably support one or more other accessories or features, such as one or more electrical or electronic devices or accessories. For example, illumination sources or lights, such as map reading lights or one or more other lights or illumination sources, such as illumination sources of the types disclosed in U.S. Pat. Nos. 6,690,268; 5,938,321; 5,813,745; 5,820,245; 5,673,994; 5,649,756; 5,178,448; 5,671,996; 4,646,210; 4,733,336; 4,807,096; 6,042,253; 6,971,775 and/or 5,669,698, and/or U.S. patent application Ser. No. 10/054,633, filed Jan. 22, 2002, now U.S. Pat. No. 7,195,381; and/or Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860, which are hereby incorporated herein by reference in their entireties, may be included in the mirror assembly. The illumination sources and/or the circuit board may be connected to one or more buttons or inputs for activating and deactivating the illumination sources.

Optionally, the mirror assembly may also or otherwise include other accessories, such as microphones, such as analog microphones or digital microphones or the like, such as microphones of the types disclosed in U.S. Pat. Nos. 6,243,003; 6,278,377 and/or 6,420,975, and/or in U.S. patent application Ser. No. 10/529,715, filed Mar. 30, 2005, now U.S. Pat. No. 7,657,052. Optionally, the mirror assembly may also or otherwise include other accessories, such as a telematics system, speakers, antennas, including global positioning system (GPS) or cellular phone antennas, such as disclosed in U.S. Pat. No. 5,971,552, a communication module, such as disclosed in U.S. Pat. No. 5,798,688, a voice recorder, a blind spot detection and/or indication system, such as disclosed in U.S. Pat. Nos. 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. patent application Ser. No. 11/315,675, filed Dec. 22, 2005, now U.S. Pat. No. 7,720,580; and/or PCT Application No. PCT/US2006/026148, filed Jul. 5, 2006 and published Jan. 11, 2007 as International Publication No. WO 2007/005942, transmitters and/or receivers, such as for a garage door opener or a vehicle door unlocking system or the like (such as a remote keyless entry system), a digital network, such as described in U.S. Pat. No. 5,798,575, a hands-free phone attachment, an imaging system or components or circuitry or display thereof, such as an imaging and/or display system of the types described in U.S. Pat. Nos. 6,690,268 and 6,847,487; and/or U.S. provisional applications, Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; and/or Ser. No. 60/628,709, filed Nov. 17, 2004; and/or U.S. patent application Ser. No. 11/105,757, filed Apr. 14, 2005, now U.S. Pat. No. 7,526,103; Ser. No. 11/334,139, filed Jan. 18, 2006, now U.S. Pat. No. 7,400,435; and/or Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, a video device for internal cabin surveillance (such as for sleep detection or driver drowsiness detection or the like) and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962 and/or 5,877,897, an occupant detection system and/or interior cabin monitoring system (such as the types described in U.S. Pat. Nos. 6,019,411 and/or 6,690,268, and/or PCT Application No. PCT/US2005/042504, filed Nov. 22, 2005 and published Jun. 1, 2006 as International Publication No. WO 2006/058098; and/or PCT Application No. PCT/US94/01954, filed Feb. 25, 1994, a heating element, particularly for an exterior mirror application, such as the types described in U.S. patent application Ser. No. 11/334,139, filed Jan. 18, 2006, now U.S. Pat. No. 7,400,435, a remote keyless entry receiver, a seat occupancy detector, a remote starter control, a yaw sensor, a clock, a carbon monoxide detector, status displays, such as displays that display a status of a door of the vehicle, a transmission selection (4wd/2wd or traction control (TCS) or the like), an antilock braking system, a road condition (that may warn the driver of icy road conditions) and/or the like, a trip computer, a tire pressure monitoring system (TPMS) receiver (such as described in U.S. Pat. Nos. 6,124,647; 6,294,989; 6,445,287; 6,472,979; 6,731,205, and/or U.S. patent application Ser. No. 11/232,324, filed Sep. 21, 2005, now U.S. Pat. No. 7,423,522, and/or an ONSTAR.RTM. system and/or any other accessory or circuitry or the like (with all of the above-referenced U.S. patents and PCT applications and U.S. patent applications and U.S. 
provisional applications being commonly assigned, and with the disclosures of the referenced U.S. patents and PCT applications and U.S. patent applications and U.S. provisional applications being hereby incorporated herein by reference in their entireties).

Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law.

* * * * *

