

United States Patent No.

9939526

Inventor(s)

Jinkins et al.

Date of Patent

April 10, 2018


Display system and method using weather radar sensing



ABSTRACT

An enhanced vision method uses or an enhanced vision system includes an onboard weather radar system configured to improve angular resolution and/or resolution in range. The onboard weather radar system generates image data representative of the external scene topography of a runway environment associated with radar returns received by the onboard weather radar system. The radar returns are in an X-band or a C-band. The enhanced vision system also includes a display in communication with the onboard weather radar system and is configured to display an image associated with the image data that is generated by the onboard weather radar system. The enhanced vision system can also be used as an enhanced flight vision system.


Inventors:

Richard D. Jinkins (Rewey, WI), Richard M. Rademaker (Rijswijk, NL), Daniel L. Woodell (Cedar Rapids, IA), Sergey B. Shishlov (Melbourne, IA)

Applicant:

Name                  City          State  Country
Richard D. Jinkins    Rewey         WI     US
Richard M. Rademaker  Rijswijk      N/A    NL
Daniel L. Woodell     Cedar Rapids  IA     US
Sergey B. Shishlov    Melbourne     IA     US

Assignee:

Rockwell Collins, Inc. (Cedar Rapids, IA)

Family ID

55919514

Appl. No.:

14/536,330

Filed:

November 7, 2014

Prior Publication Data

Document Identifier    Publication Date
US 20160131739 A1      May 12, 2016

Current U.S. Class:

1/1

Current CPC Class:

G01S 13/953 (20130101); G01S 7/12 (20130101); Y02A 90/18 (20180101)

Current International Class:

G01S 13/95 (20060101); G01S 7/12 (20060101)

Field of Search:

342/26B

References Cited


U.S. Patent Documents

2416155February 1947Chubb
2849184August 1958Arden et al.
2929059March 1960Parker
2930035March 1960Altekruse
2948892August 1960White
2965894December 1960Sweeney
2994966August 1961Senitsky et al.
3031660April 1962Young
3049702August 1962Schreitmueller
3064252November 1962Varela
3070795December 1962Chambers
3071766January 1963Fenn
3072903January 1963Meyer
3089801May 1963Tierney et al.
3107351October 1963Milam
3113310December 1963Standing
3129425April 1964Sanner
3153234October 1964Begeman et al.
3175215March 1965Blasberg et al.
3212088October 1965Alexander et al.
3221328November 1965Walter
3241141March 1966Wall
3274593September 1966Varela et al.
3325807June 1967Burns et al.
3334344August 1967Colby, Jr.
3339199August 1967Pichafroy
3373423March 1968Levy
3397397August 1968Barney
3448450June 1969Alfandari et al.
3618090November 1971Garrison
3680094July 1972Bayle et al.
3716855February 1973Asam
3739380June 1973Burdic et al.
3781878December 1973Kirkpatrick
3810175May 1974Bell
3815132June 1974Case et al.
3816718June 1974Hall et al.
3851758December 1974Makhijani et al.
3866222February 1975Young
3885237May 1975Kirkpatrick
3956749May 1976Magorian
4024537May 1977Hart
4058701November 1977Gruber et al.
4058710November 1977Altmann
4063218December 1977Basov et al.
4235951November 1980Swarovski
4277845July 1981Smith et al.
4405986September 1983Gray
4435707March 1984Clark
4481519November 1984Margerum
4532515July 1985Cantrell et al.
4594676June 1986Breiholz et al.
4595925June 1986Hansen
4598292July 1986Devino
4628318December 1986Alitz
4646244February 1987Bateman et al.
4649388March 1987Atlas
4654665March 1987Kiuchi et al.
4685149August 1987Smith et al.
4760396July 1988Barney et al.
4828382May 1989Vermilion
4843398June 1989Houston et al.
4912477March 1990Lory et al.
4914436April 1990Bateman et al.
4924401May 1990Bice et al.
4939513July 1990Paterson et al.
4951059August 1990Taylor, Jr.
4953972September 1990Zuk
4965573October 1990Gallagher et al.
4987419January 1991Salkeld
5045855September 1991Moreira
5047779September 1991Hager
5047781September 1991Bleakney
5049886September 1991Seitz et al.
5166688November 1992Moreira
5173703December 1992Mangiapane et al.
5175554December 1992Mangiapane et al.
5198819March 1993Susnjara
5202690April 1993Frederick
5247303September 1993Cornelius et al.
5273553December 1993Hoshi et al.
5311183May 1994Mathews et al.
5329391July 1994Miyamoto et al.
5332998July 1994Avignon et al.
5345241September 1994Huddle
5365356November 1994McFadden
5383457January 1995Cohen
5442364August 1995Lee et al.
5539409July 1996Mathews et al.
5559515September 1996Alimena et al.
5559518September 1996Didomizio
5566840October 1996Waldner et al.
5592178January 1997Chang et al.
5678303October 1997Wichmann
5736957April 1998Raney
5820080October 1998Eschenbach
5828332October 1998Frederick
5831570November 1998Ammar
5839080November 1998Muller et al.
5867119February 1999Corrubia et al.
5894286April 1999Morand et al.
5918517July 1999Malapert et al.
5920276July 1999Frederick
5923279July 1999Bamler et al.
5936575August 1999Azzarelli et al.
5942062August 1999Hassall et al.
5945926August 1999Ammar et al.
5950512September 1999Fields
5959762September 1999Bandettini et al.
5978715November 1999Briffe et al.
6002347December 1999Daly et al.
6023240February 2000Sutton
6061016May 2000Lupinski et al.
6061022May 2000Menegozzi et al.
6064942May 2000Johnson et al.
6075484June 2000Daniel et al.
6092009July 2000Glover
6112141August 2000Briffe et al.
6112570September 2000Hruschak
6122570September 2000Muller et al.
6127944October 2000Daly et al.
6128066October 2000Yokozeki
6128553October 2000Gordon et al.
6138060October 2000Conner et al.
6150901November 2000Auken
6154151November 2000McElreath et al.
6154169November 2000Kuntman
6157339December 2000Sato et al.
6157891December 2000Lin
6163021December 2000Mickelson
6166661December 2000Anderson et al.
6169770January 2001Henely
6178391January 2001Anderson et al.
6184816February 2001Zheng et al.
6188330February 2001Glover
6194980February 2001Thon
6199008March 2001Aratow et al.
6201494March 2001Kronfeld
6204806March 2001Hoech
6205400March 2001Lin
6208284March 2001Woodell et al.
6219592April 2001Muller et al.
6233522May 2001Morici
6236351May 2001Conner et al.
6259400July 2001Higgins et al.
6266114July 2001Skarohlid
6278799August 2001Hoffman
6281832August 2001McElreath
6285298September 2001Gordon
6285337September 2001West et al.
6285926September 2001Weiler et al.
6289277September 2001Feyereisen et al.
6311108October 2001Ammar et al.
6317468November 2001Meyer
6317690November 2001Gia
6317872November 2001Gee et al.
6340946January 2002Wolfson et al.
6345127February 2002Mitchell
6359585March 2002Bechman et al.
6366013April 2002Leenders et al.
6373418April 2002Abbey
6374286April 2002Gee et al.
6377202April 2002Kropfli et al.
6377892April 2002Johnson et al.
6388607May 2002Woodell
6388608May 2002Woodell et al.
6388724May 2002Campbell et al.
6389354May 2002Hicks et al.
6401038June 2002Gia
6411890June 2002Zimmerman
6421000July 2002McDowell
6421603July 2002Pratt et al.
6424288July 2002Woodell
6426717July 2002Maloratsky
6426720July 2002Ross et al.
6427122July 2002Lin
6441773August 2002Kelly et al.
6445310September 2002Bateman et al.
6448922September 2002Kelly
6452511September 2002Kelly et al.
6456236September 2002Hauck et al.
6456238September 2002Posey
6462703October 2002Hedrick
6473026October 2002Ali-Mehenni et al.
6473037October 2002Vail et al.
6473240October 2002Dehmlow
6481482November 2002Shimotomai
6492934December 2002Hwang et al.
6501424December 2002Haendel et al.
6512476January 2003Woodell
6512527January 2003Barber et al.
6516272February 2003Lin
6516283February 2003McCall et al.
6520056February 2003Nemeth et al.
6525674February 2003Kelly et al.
6531669March 2003Miller et al.
6549161April 2003Woodell
6567728May 2003Kelly et al.
6574030June 2003Mosier
6577947June 2003Kronfeld et al.
6590528July 2003Dewulf
6591171July 2003Ammar et al.
6593875July 2003Bergin et al.
6600443July 2003Landt
6603425August 2003Woodell
6614057September 2003Silvernail et al.
6650275November 2003Kelly et al.
6650291November 2003West et al.
6653947November 2003Dwyer et al.
6667710December 2003Cornell et al.
6681668January 2004Smirle
6690298February 2004Barber et al.
6690299February 2004Suiter
6690317February 2004Szeto et al.
6697008February 2004Sternowski
6697012February 2004Lodwig et al.
6710663March 2004Berquist
6714186March 2004Mosier et al.
6724344April 2004Stockmaster et al.
6731236May 2004Hager et al.
6738011May 2004Evans
6739929May 2004Furukawa et al.
6741203May 2004Woodell
6741208May 2004West
6744382June 2004Lapis et al.
6744408June 2004Stockmaster
6757624June 2004Hwang et al.
6760155July 2004Murayama et al.
6771626August 2004Golubiewski et al.
6782392August 2004Weinberger et al.
6799095September 2004Owen et al.
6803245October 2004Auch et al.
6804614October 2004McGraw et al.
6806846October 2004West
6807538October 2004Weinberger et al.
6813777November 2004Weinberger et al.
6819983November 2004McGraw
6822617November 2004Mather et al.
6825804November 2004Doty
6832538December 2004Hwang
6839017January 2005Dillman
6842288January 2005Liu et al.
6850185February 2005Woodell
6862323March 2005Loper
6862501March 2005He
6865452March 2005Burdon
6879280April 2005Bull et al.
6879886April 2005Wilkins et al.
6882302April 2005Woodell et al.
6908202June 2005Graf et al.
6917396July 2005Hiraishi et al.
6918134July 2005Sherlock et al.
6933885August 2005Stockmaster et al.
6938258August 2005Weinberger et al.
6950062September 2005Mather et al.
6959057October 2005Tuohino
6972727December 2005West et al.
6977608December 2005Anderson et al.
6984545January 2006Grigg et al.
6990022January 2006Morikawa et al.
6992614January 2006Joyce
6995726February 2006West et al.
6998648February 2006Silvernail
6998908February 2006Sternowski
6999022February 2006Vesel et al.
6999027February 2006Stockmaster
7002546February 2006Stuppi et al.
7010398March 2006Wilkins et al.
7023375April 2006Klausing et al.
7026956April 2006Wenger et al.
7028304April 2006Weinberger et al.
7030945April 2006Umemoto et al.
7034753April 2006Elsallal et al.
7042387May 2006Ridenour et al.
7053796May 2006Barber
7057549June 2006Block
7064680June 2006Reynolds et al.
7069120June 2006Koenck et al.
7089092August 2006Wood et al.
7092645August 2006Sternowski
7098913August 2006Etherington et al.
7109912September 2006Paramore et al.
7109913September 2006Paramore et al.
7123260October 2006Brust
7129885October 2006Woodell et al.
7145501December 2006Manfred et al.
7148816December 2006Carrico
7151507December 2006Herting
7158072January 2007Venkatachalam et al.
7161525January 2007Finley et al.
7170446January 2007West et al.
7170959January 2007Abbey
7180476February 2007Guell et al.
7191406March 2007Barber et al.
7196329March 2007Wood et al.
7205933April 2007Snodgrass
7209070April 2007Gilliland et al.
7212216May 2007He et al.
7218268May 2007Vandenberg
7219011May 2007Barber
7242343July 2007Woodell
7242345July 2007Raestad et al.
7250903July 2007McDowell
7265710September 2007Deagro
7269657September 2007Alexander et al.
7272472September 2007McElreath
7273403September 2007Yokota et al.
7280068October 2007Lee et al.
7289058October 2007Shima
7292178November 2007Woodell et al.
7292180November 2007Schober
7295150November 2007Burlet et al.
7295901November 2007Little et al.
7301496November 2007Honda et al.
7307576December 2007Koenigs
7307577December 2007Kronfeld et al.
7307583December 2007Woodell et al.
7312725December 2007Berson et al.
7312743December 2007Ridenour et al.
7317427January 2008Pauplis et al.
7321332January 2008Focke et al.
7337043February 2008Bull
7349154March 2008Aiura et al.
7352292April 2008Alter et al.
7361240April 2008Kim
7372394May 2008Woodell et al.
7373223May 2008Murphy
7375678May 2008Feyereisen et al.
7379014May 2008Woodell et al.
7379796May 2008Walsdorf et al.
7381110June 2008Sampica et al.
7417578August 2008Woodell et al.
7417579August 2008Woodell
7423578September 2008Tietjen
7446697November 2008Burlet et al.
7446938November 2008Miyatake et al.
7452258November 2008Marzen et al.
7474262January 2009Alland
7479920January 2009Niv
7486220February 2009Kronfeld et al.
7486291February 2009Berson et al.
7492304February 2009Woodell et al.
7492305February 2009Woodell et al.
7515087April 2009Woodell et al.
7515088April 2009Woodell et al.
7525448April 2009Wilson et al.
7528765May 2009Woodell et al.
7528915May 2009Choi et al.
7541970June 2009Godfrey et al.
7541971June 2009Woodell et al.
7551451June 2009Kim et al.
7557735July 2009Woodell et al.
7566254July 2009Sampica et al.
7570177August 2009Reynolds et al.
7576680August 2009Woodell
7603209October 2009Dwyer et al.
7609200October 2009Woodell et al.
7612706November 2009Honda et al.
7616150November 2009Woodell
7633428December 2009McCusker et al.
7633430December 2009Wichgers et al.
7633584December 2009Umemoto et al.
7639175December 2009Woodell
7664601February 2010Daly, Jr.
7675461March 2010McCusker et al.
7693621April 2010Chamas
7696921April 2010Finley et al.
7714767May 2010Kronfeld et al.
7733264June 2010Woodell et al.
7783427August 2010Woodell et al.
7783429August 2010Walden et al.
7791529September 2010Filias et al.
7808422October 2010Woodell et al.
7814676October 2010Sampica et al.
7843380November 2010Woodell
7859448December 2010Woodell et al.
7859449December 2010Woodell et al.
7864103January 2011Weber et al.
7868811January 2011Woodell et al.
7872594January 2011Vesel
7889117February 2011Woodell et al.
7889118February 2011Finley et al.
7927440April 2011Matsuhira
7929086April 2011Toyama et al.
7965223June 2011McCusker
7965225June 2011Dickerson et al.
8035547October 2011Flanigan et al.
8038498October 2011Miyauchi et al.
8045098October 2011Kitagawa et al.
8059025November 2011D'Addio
8068984November 2011Smith et al.
8072368December 2011Woodell
8077078December 2011Woodell et al.
8102487January 2012Kitagawa et al.
8118075February 2012Sampica et al.
8137498March 2012Sampica et al.
8140223March 2012Whitehead et al.
8159464April 2012Gribble et al.
8232917July 2012Scherzinger et al.
8296065October 2012Haynie et al.
8373580February 2013Bunch et al.
8410975April 2013Bell et al.
8477062July 2013Kanellis
8486535July 2013Nemeth et al.
8493241July 2013He
8515600August 2013McCusker
8540002September 2013Sampica et al.
8558731October 2013Woodell
8576112November 2013Garrec et al.
8583315November 2013Whitehead et al.
8594879November 2013Roberge et al.
8603288December 2013Sampica et al.
8634993January 2014McClure et al.
8639416January 2014Jones et al.
8643533February 2014Woodell et al.
8691043April 2014Sampica et al.
8717226May 2014Bon et al.
8773301July 2014Woodell
8896480November 2014Wilson et al.
8909471December 2014Jinkins et al.
8917191December 2014Tiana et al.
8936057January 2015Sampica et al.
9354633May 2016McCusker et al.
2001/0023390September 2001Gia
2001/0050372December 2001Krijn et al.
2001/0053648December 2001Furukawa et al.
2002/0039070April 2002Ververs et al.
2002/0111717August 2002Scherzinger et al.
2002/0116125August 2002Lin
2002/0116126August 2002Lin
2002/0158256October 2002Yamada et al.
2002/0179229December 2002Chuzles
2002/0185600December 2002Kerr
2002/0187284December 2002Kinoshita et al.
2003/0021491January 2003Brust
2003/0038916February 2003Nakano et al.
2003/0043315March 2003Umemoto et al.
2003/0071828April 2003Wilkins et al.
2003/0089214May 2003Fukuta et al.
2003/0093187May 2003Walker
2003/0102999June 2003Bergin et al.
2003/0156238August 2003Hiraishi et al.
2003/0160718August 2003Nagasaku
2003/0174396September 2003Murayama et al.
2003/0180528September 2003Flosenzier et al.
2003/0189606October 2003Moon et al.
2003/0195672October 2003He
2003/0216859November 2003Martell et al.
2003/0222887December 2003Wilkins et al.
2004/0044445March 2004Burdon
2004/0059473March 2004He
2004/0066645April 2004Graf et al.
2004/0072575April 2004Young et al.
2004/0083038April 2004He
2004/0145499July 2004Schmidt et al.
2004/0160341August 2004Feyereisen et al.
2004/0160364August 2004Regev
2004/0181318September 2004Redmond et al.
2004/0264549December 2004Hoole
2005/0004748January 2005Pinto et al.
2005/0052451March 2005Servantie
2005/0126679June 2005Kim
2005/0136625June 2005Henseler et al.
2005/0150289July 2005Osborne
2005/0174350August 2005Ridenour et al.
2005/0200502September 2005Reusser et al.
2005/0225481October 2005Bonthron
2005/0230563October 2005Corcoran, III
2006/0004497January 2006Bull
2006/0097895May 2006Reynolds et al.
2006/0098452May 2006Choi et al.
2006/0164284July 2006Pauplis et al.
2006/0207967September 2006Bocko et al.
2006/0215265September 2006Miyatake et al.
2006/0227012October 2006He
2006/0244636November 2006Rye et al.
2006/0245171November 2006Kim et al.
2006/0290253December 2006Yeo et al.
2006/0290531December 2006Reynolds et al.
2007/0001897January 2007Alland
2007/0002078January 2007He et al.
2007/0008214January 2007Wasiewicz
2007/0013575January 2007Lee et al.
2007/0018887January 2007Feyereisen et al.
2007/0032951February 2007Tanenhaus et al.
2007/0060063March 2007Wright et al.
2007/0146364June 2007Aspen
2007/0171094July 2007Alter et al.
2007/0176794August 2007Feyereisen et al.
2007/0179684August 2007He
2007/0228586October 2007Merrill et al.
2007/0247350October 2007Ryan
2007/0279253December 2007Priest
2007/0297736December 2007Sherman et al.
2008/0018524January 2008Christianson
2008/0051947February 2008Kemp
2008/0074308March 2008Becker et al.
2008/0111731May 2008Hubbard et al.
2008/0145610June 2008Muller et al.
2008/0180351July 2008He
2008/0305721December 2008Ohashi et al.
2009/0021397January 2009Wipf et al.
2009/0040070February 2009Alter et al.
2009/0040772February 2009Laney
2009/0046229February 2009Umemoto et al.
2009/0148682June 2009Higuchi
2009/0152391June 2009McWhirk
2009/0153783June 2009Umemoto
2009/0164067June 2009Whitehead et al.
2009/0207048August 2009He et al.
2009/0279030November 2009Toyama et al.
2009/0279175November 2009Laney et al.
2010/0033499February 2010Gannon et al.
2010/0103353April 2010Yamada
2010/0297406November 2010Schaffer et al.
2010/0312428December 2010Roberge et al.
2010/0312461December 2010Haynie et al.
2011/0037616February 2011Leutelt et al.
2011/0054729March 2011Whitehead et al.
2011/0075070March 2011Kitagawa et al.
2011/0141405June 2011Kitagawa et al.
2011/0165361July 2011Sherman et al.
2011/0184594July 2011Manfred et al.
2011/0273325November 2011Goldman
2011/0282580November 2011Mohan
2011/0304479December 2011Chen et al.
2012/0053831March 2012Halder
2012/0150426June 2012Conway
2012/0174445July 2012Jones et al.
2012/0215410August 2012McClure et al.
2013/0041529February 2013He et al.
2013/0234884September 2013Bunch
2014/0009324January 2014Ranney
2015/0211883July 2015He
2016/0131739May 2016Jinkins et al.

Foreign Patent Documents

196 49 838Apr 1998DE
19949737Apr 2001DE
0 556 351Jun 1995EP
0 962 752Dec 1999EP
0 814 744Jun 1959GB
1 092 821Nov 1967GB
01-210328Aug 1989JP
05-200880Aug 1993JP
05-293895Nov 1993JP
06-051484Feb 1994JP
H08-220547Aug 1996JP
09-057779Mar 1997JP
10-156853Jun 1998JP
10-244589Sep 1998JP
2000-141388May 2000JP
2004-233590Aug 2004JP
2004-354645Dec 2004JP
2006-218658Aug 2006JP
2006-334912Dec 2006JP
2006-348208Dec 2006JP
2007-206559Aug 2007JP
2007-302398Nov 2007JP
2008-238607Jan 2008JP
WO-93/05634Mar 1993WO
WO-2009/133102Nov 2009WO
WO-2011/089474Jul 2011WO

Other References


U.S. Appl. No. 11/851,323, filed Sep. 6, 2007, McCusker. cited by applicant .
U.S. Appl. No. 11/863,219, filed Sep. 27, 2007, Woodell. cited by applicant .
U.S. Appl. No. 11/863,221, filed Sep. 27, 2007, Woodell. cited by applicant .
U.S. Appl. No. 11/899,801, filed Sep. 6, 2007, Woodell et al. cited by applicant .
U.S. Appl. No. 11/900,002, filed Sep. 6, 2007, Woodell et al. cited by applicant .
U.S. Appl. No. 12/167,200, filed Jul. 2, 2008, Woodell et al. cited by applicant .
U.S. Appl. No. 12/167,203, filed Jul. 2, 2008, Woodell. cited by applicant .
U.S. Appl. No. 12/167,208, filed Jul. 2, 2008, Dickerson et al. cited by applicant .
U.S. Appl. No. 12/180,293, filed Jul. 25, 2008, Woodell et al. cited by applicant .
U.S. Appl. No. 12/786,169, filed May 24, 2010, Nemeth et al. cited by applicant .
U.S. Appl. No. 13/224,992, filed Sep. 2, 2011, Hufnagel et al. cited by applicant .
U.S. Appl. No. 13/250,307, filed Mar. 30, 2011, Jinkins. cited by applicant .
U.S. Appl. No. 13/250,798, filed Sep. 30, 2011, Jinkins. cited by applicant .
"MountainScope.TM. on a TabletPC," PCAvionics.TM., printed from website www.pcavionics.com on Aug. 28, 2007, 1 page. cited by applicant .
TAWS Class A and Class B, Terrain Awareness and Warning Systems, Universal.RTM. Avionics Systems Corporation, Sep. 2007, 6 pages. cited by applicant .
"TAWS Terrain Awareness and Warning System," Universal.RTM. Avionics, printed from website www.uasc.com on Aug. 28, 2007, 2 pages. cited by applicant .
Adams, Charlotte, "Synthetic Vision: Picturing the Future," Avionics magazine, Oct. 1, 2006, printed from website www.aviationtoday.com, 4 pages. cited by applicant .
Adams, Charlotte, "Synthetic Vision: Picturing the Future," Avionics magazine, Solutions for Global Airspace Electronics, Oct. 2006, cover and pp. 22-29. cited by applicant .
Advisory Action for U.S. Appl. No. 12/009,472, dated Feb. 25, 2013, 3 pages. cited by applicant .
Advisory Action for U.S. Appl. No. 13/538,957, dated Jun. 14, 2013, 6 pages. cited by applicant .
Blue Mountain Avionics' Products, printed from website www.bluemountainavionics.com on Aug. 28, 2007, 4 pages. cited by applicant .
Carter, S. P., D. D. Blankenship, M. E. Peters, D. A. Young, J. W. Holt, and D. L. Morse (2007), Radar-based subglacial lake classification in Antarctica, Geochem. Geophys. Geosyst., 8, 003016, doi:10.1029/2006GC001408, 20 pages. cited by applicant .
Final Office Action on U.S. Appl. No. 13/250,798 dated Sep. 4, 2014, 22 pages. cited by applicant .
Final Office Action on U.S. Appl. No. 13/867,556 dated Jul. 3, 2014, 11 pages. cited by applicant .
Final Office Action on U.S. Appl. No. 13/250,307 dated Jun. 11, 2014, 8 pages. cited by applicant .
Final Office Action on U.S. Appl. No. 13/250,798 dated Aug. 7, 2015, 21 pages. cited by applicant .
G2000, Garmin, printed from website https://buy.garmin.com/shop/shop.do?cID=153&pID=97668 on Jun. 28, 2011, 2 pages. cited by applicant .
G3000, Garmin, printed from website https://buy.garmin.com/shop/shop.do?cID=153&pID=66916 on Jun. 28, 2011, 2 pages. cited by applicant .
G5000, Garmin, printed from website https://buy.garmin.com/shop/shop.do?cID=153&pID=90821&ra=true on Apr. 20, 2011, 2 pages. cited by applicant .
Non-Final Office Action on U.S. Appl. No. 13/250,798 dated Mar. 18, 2015, 21 pages. cited by applicant .
Non-Final Office Action on U.S. Appl. No. 14/301,199 dated Sep. 9, 2015, 18 pages. cited by applicant .
Notice of Allowance for U.S. Appl. No. 12/009,372, dated Oct. 13, 2011, 8 pages. cited by applicant .
Notice of Allowance for U.S. Appl. No. 12/009,373, dated Jun. 16, 2010, 4 pages. cited by applicant .
Notice of Allowance for U.S. Appl. No. 12/009,472, dated Sep. 5, 2013, 8 pages. cited by applicant .
Notice of Allowance for U.S. Appl. No. 12/786,169, dated Mar. 28, 2013, 6 pages. cited by applicant .
Notice of Allowance for U.S. Appl. No. 13/538,957, dated Oct. 3, 2013, 13 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/009,372, dated Dec. 20, 2010, 10 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/009,372, dated Jun. 13, 2011, 9 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/009,373, dated Dec. 30, 2009, 14 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/009,472, dated Apr. 16, 2012, 16 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/009,472, dated Jan. 14, 2011, 14 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/009,472, dated Mar. 20, 2013, 15 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/009,472, dated Nov. 3, 2011, 15 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/009,472, dated Nov. 9, 2012, 15 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/263,282, dated Jan. 5, 2012, 10 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/786,169, dated Jan. 18, 2013, 14 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/892,563, dated Feb. 19, 2013, 12 pages. cited by applicant .
Office Action for U.S. Appl. No. 13/224,992, dated Feb. 28, 2013, 10 pages. cited by applicant .
Office Action for U.S. Appl. No. 13/250,307, dated Nov. 5, 2013, 11 pages. cited by applicant .
Office Action for U.S. Appl. No. 13/538,957, dated Apr. 4, 2013, 19 pages. cited by applicant .
Office Action for U.S. Appl. No. 13/538,957, dated Oct. 5, 2012, 18 pages. cited by applicant .
Office Action for U.S. Appl. No. 13/743,182, dated Apr. 8, 2013, 10 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/786,169, dated Jul. 20, 2012, 8 pages. cited by applicant .
Office Action in Japanese Patent Application 2015-116688, dated Aug. 25, 2015, 4 pages. cited by applicant .
Office Action in Japanese Patent Application 2015-116716, dated Aug. 25, 2015, 3 pages. cited by applicant .
Office Action on U.S. Appl. No. 12/236,464, dated Feb. 11, 2014, 21 pages. cited by applicant .
Office Action on U.S. Appl. No. 12/236,464, dated Jun. 22, 2011, 14 pages. cited by applicant .
Office Action on U.S. Appl. No. 13/250,798 dated Apr. 23, 2014, 15 pages. cited by applicant .
Office Action on U.S. Appl. No. 13/867,556 dated Feb. 7, 2014, 11 pages. cited by applicant .
Office Action U.S. Appl. No. 11/787,460, dated Aug. 31, 2010, 18 pages. cited by applicant .
Office Action with English Translation received in Korean Patent Application 10-2010-7017278, dated Aug. 26, 2015, 5 pages. cited by applicant .
Pictures of DELPHINS, printed from website www.tunnel-in-the-sky.tudelft.nl on Aug. 28, 2007, 4 pages. cited by applicant .
Restriction Requirement for U.S. Appl. No. 13/867,556, dated Dec. 26, 2013, 6 pages. cited by applicant .
Van Kasteren, Joost, "Tunnel-in-the-Sky, Synthetic vision simplifies the pilot's job and enhances safety," printed from website www.delftoutlook.tudelft.nl on Aug. 28, 2007, 13 pages. cited by applicant .
Walker, GD-Itronix Dynavue Technology, The Ultimate Outdoor-Readable Touch-Screen Display, Rugged PC Review, 4 pages. cited by applicant .
U.S. Appl. No. 12/236,464, filed Sep. 23, 2008, Rockwell Collins. cited by applicant .
U.S. Appl. No. 13/627,788, filed Sep. 26, 2012, Rockwell Collins. cited by applicant .
U.S. Appl. No. 13/857,955, filed Apr. 5, 2013 Barber et al. cited by applicant .
U.S. Appl. No. 13/250,798, filed Sep. 30, 2011, Rockwell Collins. cited by applicant .
U.S. Appl. No. 14/301,199, filed Jun. 10, 2014, Rockwell Collins. cited by applicant .
U.S. Appl. No. 14/482,681, filed Sep. 10, 2014, Rockwell Collins. cited by applicant .
Airports Authority of India, Chapter 7: Visual AIDS for Navigation--Lights, available prior to Jan. 1, 2005, retrieved from the internet at: http://www.aai.aero/aai_employees/chapter_7.pdf on Sep. 26, 2014, 33 pages. cited by applicant .
Brailovsky et al., REVS122: A Radar-Based Enhanced Vision System for Degraded Visual Environments, Proc. of SPIE vol. 9087 908708-1, retrieved from the internet at http://proceedings.spiedigitallibrary.org on Jun. 25, 2014, 13 pages. cited by applicant .
Federal Aviation Administration, Advisory Circular AC 90-106, "Enhanced Flight Vision Systems", initiated by AFS-400, dated Jun. 2, 2010, 55 pages. cited by applicant .
Federal Aviation Administration, Aeronautical Information Manual (AIM) Basic Flight Information and ATC Procedures, dated Jul. 24, 2014, 2 pages. cited by applicant .
Fountain, J.R., Digital Terrain Systems, Airborne Navigation Systems Workshop (Digest No. 1997/169), IEE Colloquium, pp. 4/1-4/6, Feb. 21, 1997. cited by applicant .
Honeywell, RDR-4B Forward looking windshear detection / weather radar system user's manual with radar operating guidelines, Rev. 6, Jul. 2003, 106 pages. cited by applicant .
Johnson, A., et al., Vision Guided Landing of an Autonomous Helicopter in Hazardous Terrain, Robotics and Automation, 2005. ICRA 2005. Proceedings of the 2005 IEEE International Conference, pp. 3966-3971, Apr. 18-22, 2005. cited by applicant .
Kuntman, D., Airborne system to address leading cause of injuries in non-fatal airline accidents, ICAO Journal, Mar. 2000, 4 pages. cited by applicant .
Notice of Allowance for U.S. Appl. No. 11/863,221, dated Aug. 2, 2010, 9 pages. cited by applicant .
Notice of Allowance for U.S. Appl. No. 11/899,801, dated Aug. 19, 2010, 5 pages. cited by applicant .
Notice of Allowance for U.S. Appl. No. 11/900,002, dated Sep. 14, 2010, 5 pages. cited by applicant .
Notice of Allowance for U.S. Appl. No. 12/167,200, dated Oct. 28, 2010, 5 pages. cited by applicant .
Notice of Allowance for U.S. Appl. No. 12/167,203, dated Jun. 21, 2013, 7 pages. cited by applicant .
Notice of Allowance for U.S. Appl. No. 12/167,208, dated Mar. 21, 2011, 8 pages. cited by applicant .
Notice of Allowance for U.S. Appl. No. 12/180,293, dated Aug. 4, 2011, 8 pages. cited by applicant .
Notice of Allowance on U.S. Appl. No. 13/241,051 dated Aug. 28, 2014, 9 pages. cited by applicant .
Notice of Allowance on U.S. Appl. No. 13/247,742 dated Jul. 30, 2014, 9 pages. cited by applicant .
Office Action for U.S. Appl. No. 11/851,323, dated Aug. 6, 2009, 23 pages. cited by applicant .
Office Action for U.S. Appl. No. 11/851,323, dated Dec. 15, 2010, 13 pages. cited by applicant .
Office Action for U.S. Appl. No. 11/851,323, dated Jul. 5, 2012, 23 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/167,200, dated Jul. 21, 2010, 6 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/167,203, dated Aug. 26, 2010, 11 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/167,203, dated Sep. 21, 2012, 6 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/167,208, dated Dec. 30, 2009, 10 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/167,208, dated Jun. 3, 2010, 11 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/167,208, dated Oct. 19, 2010, 8 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/180,293, dated Jan. 4, 2011, 5 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/180,293, dated Jul. 28, 2010, 8 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/976,871, dated Feb. 15, 2012, 8 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/976,871, dated Jul. 10, 2012, 4 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/976,871, dated May 6, 2013, 5 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/976,871, dated Nov. 21, 2012, 5 pages. cited by applicant .
Office Action for U.S. Appl. No. 12/976,871, dated Oct. 9, 2013, 5 pages. cited by applicant .
Office Action for U.S. Appl. No. 13/183,314, dated Aug. 14, 2013, 11 pages. cited by applicant .
Office Action for U.S. Appl. No. 13/183,314, dated Mar. 28, 2013, 12 pages. cited by applicant .
Office Action for U.S. Appl. No. 13/474,559, dated Aug. 28, 2013, 10 pages. cited by applicant .
Office Action for U.S. Appl. No. 13/474,559, dated Dec. 28, 2012, 8 pages. cited by applicant .
Office Action on U.S. Appl. No. 13/241,051 dated Feb. 27, 2014, 21 pages. cited by applicant .
Office Action on U.S. Appl. No. 13/247,742 dated Dec. 3, 2013, 11 pages. cited by applicant .
REVS Product Information Sheet, Sierra Nevada Corporation, dated May 7, 2014, 2 pages. cited by applicant .
Skolnik, Introduction to Radar Systems, McGraw Hill Book Company, 2001, 3 pages. cited by applicant .
Skolnik, Radar Handbook (McGraw Hill Book Company), 1990, 23 pages. cited by applicant .
Synthetic Vision System, en.wikipedia.org/wiki/Synthetic_vision_system, retrieved Feb. 28, 2013, 4 pages. cited by applicant .
Technical Standard Order, TSO-C115b, Airborne Area Navigation Equipment Using Multi-Sensor Inputs, Department of Transportation, Federal Aviation Administration, Sep. 30, 1994, 11 pages. cited by applicant .
U.S. Office Action on U.S. Appl. No. 11/900,002 dated Jun. 8, 2010. cited by applicant .
U.S. Office Action on U.S. Appl. No. 13/247,742 dated Apr. 16, 2014, 15 pages. cited by applicant .
Vadlamani, A., et al., Improving the detection capability of spatial failure modes using downward-looking sensors in terrain database integrity monitors, Digital Avionics Systems Conference, 2003. DASC-03. The 22nd, vol. 2, pp. 9C.5-91-12 vol. 2, Oct. 12-16, 2003. cited by applicant .
Wang et al., A Simple Based on DSP Antenna Controller of Weather Radar, 2001 CIE International Conference, 4 pages. cited by applicant .
Non-Final Office Action on U.S. Appl. No. 13/250,798 dated Feb. 26, 2016, 9 pages. cited by applicant .
Notice of Allowance on U.S. Appl. No. 12/263,282 dated Jan. 29, 2016, 8 pages. cited by applicant .
Notice of Allowance on U.S. Appl. No. 14/301,199 dated Mar. 1, 2016, 11 pages. cited by applicant .
First Office Action on Korean Patent Application No. 10-2016-7013740, dated Sep. 19, 2016, 7 pages. cited by applicant .
U.S. Appl. No. 14/841,558, filed Aug. 31, 2015, Rockwell Collins, Inc. cited by applicant .
McGray et al., Air Operators, Airlines, Manufacturers and Interested Industry Stakeholders & Aero Chart Forum-Utilizing EFVS technology and incorporating it into FAA NextGen, Federal Aviation Administration, Apr. 23, 2014 & Apr. 30, 2014, 34 pages. cited by applicant .
Non-Final Office Action on U.S. Appl. No. 13/250,798, dated Sep. 9, 2016, 6 pages. cited by applicant .
Notice of Allowance on U.S. Appl. No. 13/250,798, dated Sep. 28, 2016, 10 pages. cited by applicant .
Non-Final Office Action on U.S. Appl. No. 14/482,681, dated Dec. 20, 2016, 9 pages. cited by applicant .
English Translation of Japanese Notice of Reasons for Rejection in Japanese Application No. 2016-001165, dated Apr. 25, 2017, 1 page. cited by applicant .
Non-Final Office Action on U.S. Appl. No. 14/270,587, dated May 8, 2017, 16 pages. cited by applicant .
First Office Action with English Translation of Chinese Application No. 201510005057.5, dated Apr. 25, 2017, 8 pages. cited by applicant.

Primary Examiner: Adams; Tashiana R
Assistant Examiner: Seraydaryan; Helena H
Attorney, Agent or Firm: Suchy; Donna P. Barbieri; Daniel M.


CLAIMS



What is claimed is:

1. An airborne weather radar system used for enhanced vision, comprising: an antenna; and a control circuit configured to provide radar beams via the antenna toward external surroundings and configured to receive radar returns, wherein the control circuit is configured to process the radar returns to provide image data associated with the external surroundings, wherein the weather radar system has increased range resolution and angular resolution when used for enhanced vision of external scene topography than when used for weather radar sensing functions, the control circuit configured to transmit successive pulses with discrete increasing frequency steps such that radar returns from clutter in an ambiguous range are different from radar returns from targets, the control circuit configured to reject the radar returns from clutter in the ambiguous range, the radar beams being in the X-band or the C-band, the image data being for providing a visual image of external scene topography to a pilot.
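For illustration only, the stepped-frequency behavior recited in claim 1 can be sketched numerically: each successive pulse is transmitted on a new frequency step, so a second-time-around echo from clutter beyond the unambiguous range (launched one step earlier) appears offset in frequency from an in-range target after mixing and can be filtered out. All parameters, function names, and the filtering choice below are assumptions made for the sketch, not values or methods taken from the patent.

```python
import numpy as np

# Illustrative sketch only (all parameters assumed). After mixing with the current
# frequency step, an in-range target sits near 0 Hz while ambiguous-range clutter,
# launched one step earlier, is offset by the step size and can be rejected.
fs = 10e6          # baseband sample rate, Hz
n_samples = 200    # samples per receive window
df = 1.0e6         # frequency step between successive pulses, Hz
n_pulses = 8

rng = np.random.default_rng(0)
t = np.arange(n_samples) / fs

def receive_window():
    target = np.ones(n_samples, dtype=complex)            # in-range echo: ~0 Hz after mixing
    clutter = 0.5 * np.exp(-2j * np.pi * df * t)          # ambiguous-range echo: offset by -df
    noise = 0.05 * (rng.standard_normal(n_samples) + 1j * rng.standard_normal(n_samples))
    return target + clutter + noise

def reject_ambiguous(x, cutoff_hz=0.1e6):
    """Keep energy near 0 Hz; frequency-offset (ambiguous-range) clutter is discarded."""
    spec = np.fft.fft(x)
    freqs = np.fft.fftfreq(x.size, 1 / fs)
    spec[np.abs(freqs) > cutoff_hz] = 0.0
    return np.fft.ifft(spec)

windows = [receive_window() for _ in range(n_pulses)]
filtered = [reject_ambiguous(w) for w in windows]
print("clutter + noise power before:", np.mean([np.var(w) for w in windows]))
print("clutter + noise power after: ", np.mean([np.var(w) for w in filtered]))
```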

2. The system of claim 1, wherein the angular resolution is increased using a beam sharpening technique and de-convolution of a beam point spread function.
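As a rough illustration of de-convolving a beam point spread function (not the patent's algorithm), the sketch below blurs a row of point-like reflectors with an assumed Gaussian beam pattern and then applies a Wiener de-convolution; the beamwidth, azimuth grid, and regularization level are assumptions.

```python
import numpy as np

# Illustrative sketch only: Wiener de-convolution of an assumed Gaussian beam
# point spread function to sharpen an azimuth reflectivity profile.
az = np.linspace(-10.0, 10.0, 401)                 # azimuth grid, degrees (assumed)
truth = np.zeros_like(az)
truth[[150, 200, 250]] = 1.0                       # three point-like reflectors

beamwidth_deg = 3.0                                # assumed 3-dB beamwidth
sigma = beamwidth_deg / 2.355
psf = np.exp(-0.5 * ((az - az.mean()) / sigma) ** 2)
psf /= psf.sum()

blurred = np.convolve(truth, psf, mode="same")     # what the real beam would measure

H = np.fft.fft(np.fft.ifftshift(psf))
snr = 1e3                                          # assumed regularization level
wiener = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
sharpened = np.real(np.fft.ifft(np.fft.fft(blurred) * wiener))

def width_at_half_max(profile):
    p = profile / profile.max()
    return np.count_nonzero(p > 0.5) * (az[1] - az[0])

print("3-dB width before de-convolution:", width_at_half_max(blurred), "deg")
print("3-dB width after de-convolution: ", width_at_half_max(sharpened), "deg")
```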

3. The system of claim 1, wherein the antenna is a switched aperture antenna, and the control circuit is further configured to process the radar returns to determine a high resolution estimate of a target angle relative to a bore site of the antenna based on switching between transmitting and receiving on a full aperture and transmitting on the full aperture while receiving on a partial aperture.
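A simplified numerical sketch of the switched-aperture idea, under assumed array geometry and wavelength (not the patent's implementation): receiving once on the full aperture and once on one half of it displaces the receive phase center by roughly a quarter of the aperture, and the resulting phase difference indicates the target angle off boresight.

```python
import numpy as np

# Illustrative sketch only (array geometry and wavelength assumed). Only the
# receive-side phase is modeled; the common transmit phase would cancel anyway.
wavelength = 0.032                                      # ~X-band, m
n_elem = 32
spacing = wavelength / 2
x = (np.arange(n_elem) - (n_elem - 1) / 2) * spacing    # element positions, centered
theta_true = np.deg2rad(1.5)                            # target angle off boresight

element_signals = np.exp(2j * np.pi * x * np.sin(theta_true) / wavelength)
full = element_signals.sum()                # full-aperture receive: phase center at array center
half = element_signals[n_elem // 2:].sum()  # partial-aperture receive: phase center at +aperture/4

aperture = n_elem * spacing
phase_center_shift = aperture / 4
dphi = np.angle(half * np.conj(full))
theta_est = np.arcsin(dphi * wavelength / (2 * np.pi * phase_center_shift))
print(f"true angle {np.rad2deg(theta_true):.2f} deg, estimate {np.rad2deg(theta_est):.2f} deg")
```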

4. The system of claim 1, wherein the control circuit is configured to provide a perspective enhanced image as the visual image on a display in response to the image data.

5. The system of claim 1, wherein the radar beams are provided as separate pulses with discrete increasing frequency steps.

6. The system of claim 5, wherein the frequency steps hop over restricted or undesired frequencies.

7. A method of providing a real time sensor image, the method comprising: receiving radar returns from an X-band or C-band airborne weather radar system, the X-band or C-band airborne weather radar system including a switched aperture antenna, wherein the radar returns can be processed to have increased range resolution and angular resolution when used for providing a real time sensor image of external scene topography than when used for weather radar sensing functions; processing the radar returns to determine a high resolution estimate of a target angle relative to a bore site of the antenna based on switching between transmitting and receiving on a full aperture and transmitting on the full aperture while receiving on a partial aperture; and filtering the radar returns to identify areas having a reflectivity lower than a predetermined value to provide a visual image of the external scene topography based on the radar returns.

8. The method of claim 7, wherein the radar returns of the X-band or C-band airborne weather radar system comprise a switched aperture, sequential lobing or monopulse weather radar system.

9. The method of claim 7, further comprising displaying an aircraft situation display image on a head down display using the visual image.

10. The method of claim 7, further comprising providing radar beams associated with the radar returns, the radar beams are provided using beam sharpening techniques.

11. The method of claim 10, wherein the beam sharpening techniques comprise a sub-aperture or split aperture technique, the method further comprising using a complex conjugate method to remove Doppler-induced phase changes by cancellation to determine the target angle based on a change in phase center.
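A toy illustration of the conjugate cancellation idea, with an assumed wavelength, phase-center separation, and Doppler frequency: the Doppler phase history is common to both sub-aperture channels, so it cancels in the conjugate product, leaving only the phase tied to the change in phase center, from which an angle estimate follows.

```python
import numpy as np

# Illustrative sketch only (wavelength, phase-center separation, Doppler all assumed).
wavelength = 0.032          # m
d = 0.15                    # separation of the two sub-aperture phase centers, m
theta_true = np.deg2rad(2.0)
fd = 850.0                  # Doppler frequency of the return, Hz
t = np.arange(64) * 1e-3    # slow-time samples, s

doppler = np.exp(2j * np.pi * fd * t)                                 # common to both channels
geometry = np.exp(2j * np.pi * d * np.sin(theta_true) / wavelength)   # differs between channels
fore = doppler * geometry
aft = doppler

product = fore * np.conj(aft)          # Doppler-induced phase cancels by conjugation
dphi = np.angle(product.mean())
theta_est = np.arcsin(dphi * wavelength / (2 * np.pi * d))
print(f"true angle {np.rad2deg(theta_true):.2f} deg, estimate {np.rad2deg(theta_est):.2f} deg")
```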

12. The method of claim 10, wherein the radar beams are provided using ultra wide band pulses, intra pulse compression, or inter pulse compression.

13. The method of claim 12, wherein the radar beams are provided using stepped frequency compression.

14. An enhanced vision system, comprising: a weather radar system configured to enhance resolution in range when used for providing an enhanced image of a runway environment than when used for weather radar sensing functions, wherein the weather radar system generates image data representative of the runway environment associated radar returns received by the weather radar system, the radar returns being in an X-band or a C-band, wherein the weather radar system is configured to transmit successive pulses with discrete increasing frequency steps such that radar returns from clutter in an ambiguous range are different from radar returns from targets, the weather radar system configured to reject the radar returns from clutter in the ambiguous range; and a display in communication with the weather radar system and configured to display an image associated with the radar returns.

15. The system of claim 14, wherein the weather radar system includes processing electronics that are configured to provide increased angular resolution using beam sharpening.

16. The system of claim 14, wherein the display is a head down display or head up display.

17. The system of claim 14, wherein the enhanced vision system is used as an enhanced flight vision system.


DESCRIPTION



CROSS-REFERENCE TO RELATED PATENT APPLICATIONS

The present application is related to U.S. patent application Ser. No. 14/482,681 filed Sep. 10, 2014 by Wood et al., U.S. patent application Ser. No. 14/301,199 filed on Jun. 10, 2014 by McCusker et al., U.S. patent application Ser. No. 13/627,788 filed on Sep. 26, 2012, U.S. patent application Ser. No. 12/892,563 filed on Sep. 28, 2010, U.S. patent application Ser. No. 13/250,798 filed on Sep. 30, 2011, U.S. patent application Ser. No. 12/236,464 filed on Sep. 28, 2008, U.S. patent application Ser. No. 12/167,200 filed on Jul. 2, 2008, U.S. patent application Ser. No. 12/180,293 filed on Jul. 25, 2008, U.S. patent application Ser. No. 13/247,742 filed on Sep. 28, 2011, U.S. patent application Ser. No. 11/851,323 filed on Sep. 6, 2007, U.S. patent application Ser. No. 11/904,491 filed on Sep. 26, 2007, U.S. patent application Ser. No. 13/241,051 filed on Sep. 22, 2011, U.S. patent application Ser. No. 12/263,282 filed on Oct. 31, 2008 and U.S. patent application Ser. No. 12/180,293 filed on Jul. 25, 2008, all of which are herein incorporated by reference in their entireties and assigned to the assignee of the present application.


BACKGROUND



An aircraft uses an enhanced vision system (EVS) to provide imagery to an aircraft crew. The imagery can include an airport terminal area and runway environment when meteorological conditions prevent a clear natural view of the external surroundings of the aircraft through the windscreen. For example, the EVS may overlay an image of an airport terminal area and runway environment over the pilot's natural unaided view of the external surroundings of the aircraft through the aircraft's cockpit windscreen. Such imagery can improve the situational awareness of the flight crew during instrument approach procedures in low visibility conditions such as fog. That same enhanced vision system can be used as an FAA-certified enhanced flight vision system (EFVS), which can allow pilots landing under instrument flight rules to operate below certain specified altitudes during instrument approaches even when the airport environment is not visible. For example, under Title 14 of the Code of Federal Regulations, part 91, a pilot may not descend below decision altitude (DA) or minimum descent altitude (MDA) to 100 feet above the touchdown zone elevation (TDZE) from a straight-in instrument approach procedure (IAP), other than Category II or Category III, unless the pilot can see certain required visual references. Such visual references include, for example, the approach lighting system, the threshold lighting system, and the runway edge lighting system. The pilot may, however, use an EFVS to identify the required visual references in low visibility conditions where the pilot's natural unaided vision is unable to identify these visual references. Accordingly, the use of an EFVS may minimize losses due to the inability of the pilot to land the plane and deliver cargo and/or passengers on time in low visibility conditions.

EVS imagery is typically presented to the pilot flying (PF) on a head up display (HUD). The HUD is typically a transparent display device that allows the PF to view EVS imagery while looking at the external surroundings of the aircraft through the cockpit windscreen. As long as visibility conditions outside of the aircraft permit the PF to see the external surroundings of the aircraft through the cockpit windscreen, the PF can verify that the EVS is functioning properly such that the imagery on the HUD is in alignment with the PF's view of the external surroundings of the aircraft.

EVS imagery is sometimes also presented to the pilot monitoring (PM) on a head down display (HDD). For example, in some countries, the system must present the EVS imagery to the PM for confirmation that the EVS information is a reliable and accurate indicator of the required visual references. The PM may also use the EVS imagery to determine whether the PF is taking appropriate action during approach and landing procedures. The HDD is typically a non-transparent display device mounted adjacent to or within a console or instrument panel of the aircraft.

An EVS typically uses either a passive or active sensing system to acquire data used to generate imagery of the airport terminal area and runway environment. A typical passive sensor, such as a forward looking infrared (FLIR) camera or visible light spectrum camera, receives electromagnetic energy from the environment and outputs data that may be used by the system to generate video images from the point of view of the camera. The camera is installed in an appropriate position, such as in the nose of an aircraft, so that the PF may be presented with an appropriately scaled and positioned video image on the HUD having nearly the same point of view as the PF when viewing the external surroundings of the aircraft through the HUD. However, while passive sensors provide higher quality video imagery, they may be unable to identify required visual references in certain low visibility conditions such as heavy fog.

Active sensing systems, such as millimeter wavelength (MMW) radar systems (e.g., 94 GHz), transmit electromagnetic energy into the environment and then receive return electromagnetic energy reflected from the environment. The active sensing system is typically installed in an appropriate position, such as in the nose of an aircraft. Active sensing systems are expensive and require space on board the aircraft that is needed for other types of equipment. In addition, MMW radar systems require expensive radome technology.

Additionally, both passive FLIR cameras and active millimeter wavelength radar systems may have limited range in certain low visibility conditions such as heavy fog.

Thus, there is a need for real time or near real time sensing systems for and methods of providing enhanced vision at longer ranges and in inclement weather. Further, there is a need for real time or near real time sensing systems for and methods of providing enhanced vision imagery that is less expensive and does not require additional space on the aircraft. There is also a need for display systems for and methods of providing images of the external scene topography using radar data from a weather radar system. There is still a further need for systems for and methods of providing images of the runway environment derived from weather radar data where such images enable operation below certain specified altitudes during instrument approaches. Further still, there is a need for systems and methods that achieve higher resolution imaging using X-band and C-band radar data.

It would be desirable to provide a system and/or method that provides one or more of these or other advantageous features. Other features and advantages will be made apparent from the present specification. The teachings disclosed extend to those embodiments which fall within the scope of the appended claims, regardless of whether they accomplish one or more of the aforementioned needs.


SUMMARY



In one aspect, embodiments of the inventive concepts disclosed herein are directed to an image processing system for enhanced vision including a processor and memory coupled to the processor. The memory contains program instructions that, when executed, cause the processor to provide radar beams and receive radar returns with improved angular and/or range resolution for deriving image data of the external scene topography.

In a further aspect, embodiments of the inventive concepts disclosed herein are directed to a vision method which uses or a vision system which includes a weather radar system configured to enhance resolution in range and in azimuth. The weather radar system generates image data associated with radar returns received by the weather radar system. The radar returns are in an X-band or a C-band. The vision system also includes a display in communication with the weather radar system configured to display an image associated with the image data.

In a further aspect, embodiments of the inventive concepts disclosed herein are directed to an airborne weather radar system which provides enhanced vision. The weather radar system includes an antenna, and a control circuit configured to provide radar beams via the antenna toward external surroundings and configured to receive radar returns. The control circuit is configured to process the radar returns to provide image data associated with the external surroundings. The weather radar system provides increased range resolution and increased angular resolution compared to weather radar sensing functions for the radar returns used to provide the image data. The radar beams are in the X-band or the C-band, and the image data is for providing a visual image of the external scene topography to a pilot.

In a further aspect, embodiments of the inventive concepts disclosed herein are directed to a method that provides a real time sensor image. The method includes receiving radar returns from an X-band or C-band airborne weather radar system. The radar returns can be processed to have increased range resolution and angular resolution and are received from external surroundings. The method also includes providing a visual image of the external scene topography based on the radar returns.


BRIEF DESCRIPTION OF THE DRAWINGS



Implementations of the inventive concepts disclosed herein may be better understood when consideration is given to the following detailed description thereof. Such description makes reference to the annexed drawings, which are not necessarily to scale, and in which some features may be exaggerated and some features may be omitted or may be represented schematically in the interest of clarity. Like reference numerals in the figures may represent and refer to the same or similar element, feature, or function. In the drawings:

FIG. 1 is a schematic illustration of an aircraft control center or cockpit, according to an exemplary embodiment of the inventive concepts disclosed herein;

FIG. 2 is a schematic general block diagram of a display system for displaying an image derived from radar data, according to an exemplary embodiment of the inventive concepts disclosed herein;

FIG. 3 is a flow diagram showing an exemplary process used by the system illustrated in FIG. 2, according to a further exemplary embodiment of the inventive concepts disclosed herein;

FIG. 4 is an illustration of an image derived from radar data provided by the display system illustrated in FIG. 2 according to an exemplary embodiment of the inventive concepts disclosed herein; and

FIG. 5 is an illustration of an image derived from radar data and merged with HUD symbology provided by the display system illustrated in FIG. 2 according to yet another exemplary embodiment of the inventive concepts disclosed herein.


DETAILED DESCRIPTION



Before describing in detail the inventive concepts disclosed herein, it should be observed that the inventive concepts disclosed herein include, but are not limited to, a novel structural combination of data/signal processing components and communications circuits, and not in the particular detailed configurations thereof. Accordingly, the structure, methods, functions, control and arrangement of components, software, and circuits have, for the most part, been illustrated in the drawings by readily understandable block representations and schematic diagrams, in order not to obscure the disclosure with structural details which will be readily apparent to those skilled in the art, having the benefit of the description herein. Further, the inventive concepts disclosed herein are not limited to the particular embodiments depicted in the exemplary diagrams, but should be construed in accordance with the language in the claims.

According to various exemplary embodiments, an EVS or display system may be provided with radar sensing and imagery displayable to a pilot or co-pilot on an aircraft display, such as an HDD or HUD. For example, the display system may include or use a weather radar system to display an image based upon radar return data. In some embodiments, a Doppler weather radar system may be configured to have enhanced resolution (e.g., angular resolution and/or range resolution). Reflectivity of radar returns from runway structures in an airport terminal or runway environment, such as an approach lighting system, a threshold lighting system, and/or a runway edge lighting system, can be sensed. As will be appreciated, using a weather radar system configured according to the various exemplary embodiments provides greater range than millimeter wavelength radar sensing systems or passive FLIR or visible light camera systems in low visibility conditions, such as heavy fog, given the weather radar system's superior ability to penetrate heavy fog.

Using the weather radar system configured according to the various exemplary embodiments may also provide EVS imagery having sufficient accuracy in low visibility conditions (given that many of the visual references required under Title 14 of the Code of Federal Regulations, part 91, such as approach lighting systems, threshold lighting systems, runway edge lighting systems, and other runway structures, are metallic structures that exhibit high radar reflectivity). The imagery may allow lower landing minima (e.g., 100 feet or less) in some embodiments. In some embodiments, the lack of radar returns from the runway surface combined with runway structures and lights can provide a suitable image for runway identification by the pilot.
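A minimal sketch of this idea, using made-up reflectivity values and thresholds: cells returning less than a predetermined value are flagged as candidate runway surface, while strong point returns are flagged as candidate lighting or other runway structures.

```python
import numpy as np

# Illustrative sketch only: reflectivity values and thresholds are made up.
reflectivity = np.array([
    [12.0, 14.0, 35.0, 13.0, 11.0],
    [10.0,  2.0,  1.0,  2.0, 38.0],   # low-return strip flanked by bright point returns
    [11.0,  1.5,  0.8,  1.9, 36.0],
    [13.0, 15.0, 34.0, 14.0, 12.0],
])

surface_threshold = 5.0      # "reflectivity lower than a predetermined value"
structure_threshold = 30.0

candidate_runway_surface = reflectivity < surface_threshold
candidate_lighting = reflectivity > structure_threshold
print(candidate_runway_surface.astype(int))
print(candidate_lighting.astype(int))
```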

The display system includes a radar processing module in communication with the radar system and configured to generate high resolution radar image data for display in some embodiments. The image data is processed to provide a two-dimensional aircraft situation display (e.g., vertical profile display or plan view display) or three dimensional or perspective aircraft situation display image representative of the 3-D positions of runway structures in an airport terminal or runway environment based on the radar returns as described in U.S. patent application Ser. Nos. 14/301,199 and 14/482,681, incorporated herein by reference in their entireties, in some embodiments. For example, the radar processing module can be embodied as a processor and a non-transitory memory containing program instructions that, when executed, cause the processor to provide radar beams and receive radar returns and generate image data from the radar returns. In some embodiments, program instructions stored on the non-transitory medium can cause the processor to filter the radar return data to remove noise.
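As one hypothetical way such a module might turn returns into an image frame (the function name, grid size, and geometry below are assumptions, not the patent's design), the following sketch scan-converts (range, azimuth, amplitude) samples into a plan-view image array.

```python
import numpy as np

# Illustrative sketch only: convert (range, azimuth, amplitude) returns into a
# plan-view image frame with ownship at the bottom center of the array.
def render_plan_view(returns, max_range_m=3000.0, size_px=128):
    """returns: iterable of (range_m, azimuth_rad, amplitude)."""
    img = np.zeros((size_px, size_px), dtype=float)
    scale = (size_px / 2 - 1) / max_range_m
    for rng_m, az_rad, amp in returns:
        x = int(size_px / 2 + rng_m * np.sin(az_rad) * scale)    # cross-track
        y = int(size_px - 1 - rng_m * np.cos(az_rad) * scale)    # along-track
        if 0 <= x < size_px and 0 <= y < size_px:
            img[y, x] = max(img[y, x], amp)
    return img

frame = render_plan_view([(1500.0, 0.0, 1.0), (1800.0, 0.02, 0.8), (2100.0, -0.02, 0.7)])
print(frame.shape, frame.max())
```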

According to certain exemplary embodiments, a radar system, such as a weather radar system, can be used to detect features of a runway environment. Utilizing the high radar cross section associated with the metal content of runway lighting advantageously allows detection to be achieved by day or night, regardless of whether the runway lights are on or off, in some embodiments. In one embodiment, the regular, periodic, equal spacing of visual aids, such as approach lighting systems, runway edge lights, taxiway lights, and center line lights, can be identified from the image generated from the radar data. In certain embodiments, the systems and methods can be utilized as an extension to a combined vision system (CVS).
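A small sketch of one way such regular spacing could be checked, with made-up positions and tolerance: take the along-track positions of detected bright returns and test whether their spacings are nearly equal.

```python
import numpy as np

# Illustrative sketch only: positions and tolerance are assumed, not from the patent.
positions_m = np.array([0.0, 60.5, 119.8, 180.2, 240.1, 299.7])   # along-track detections
spacings = np.diff(np.sort(positions_m))
mean_spacing = spacings.mean()
regular = np.all(np.abs(spacings - mean_spacing) < 0.1 * mean_spacing)
print(f"mean spacing {mean_spacing:.1f} m, regular pattern: {regular}")
```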

Referring to FIG. 1, a display system 10 is provided in an aircraft having an aircraft control center 11 or cockpit. The aircraft control center 11 includes flight displays 20 embodied as head down displays (HDDs). The aircraft control center 11 can also include a combiner 21 associated with a head up display (HUD) system. In some embodiments, the combiner 21 is provided as part of a wearable HUD. Conformal images are provided on the combiner 21 in some embodiments.

The flight displays 20 and the combiner 21 can be used to provide information to the flight crew, thereby increasing visual range and enhancing decision-making abilities. In an exemplary embodiment, the flight displays 20 and the combiner 21 can include a weather display, a joint display, a weather radar map and a terrain display. Further, the flight displays 20 may include images from a synthetic vision system (SVS) or an enhanced vision system (EVS) (e.g., an EFVS). For example, the flight displays 20 can include a display configured to display a perspective image of terrain and/or weather information. Other views of terrain and/or weather information may also be provided (e.g., plan view, horizontal view, vertical view, or combinations thereof). Additionally, the flight displays 20 can be implemented using any of a variety of display technologies, including CRT, LCD, organic LED, dot matrix display, and others.

According to some embodiments, the display system 10 is configured to provide an image based upon radar data to at least one of the displays 20 or the combiner 21. In FIG. 1, the image on the combiner 21 includes a runway 23 or features 29 associated with the runway 23 as viewed from the aircraft (e.g., during approach and/or landing). In some embodiments, at least one of the displays 20 or the combiner 21 displays a merged image of terrain derived from two or more of enhanced vision data, radar data, and SVS data. Advantageously, real time radar data can be used to provide real time, all weather detection of the runway features 29 associated with the runway 23 in one embodiment. Advantageously, the radar data allows the runway 23 and its orientation to be viewed by one or more pilots in challenging weather conditions in some embodiments.

In some embodiments, a symbol or icon for the runway 23 and extended centerline 27 can be provided on the displays 20 or the combiner 21. In some embodiments, the runway 23 and extended centerline 27 can be associated with SVS data. A set of runway features 29, such as approach lighting system lights or other runway and taxiway lights, can be indicated on the displays 20 or the combiner 21 in some embodiments. The runway features 29 can be associated with radar data in some embodiments.

Referring to FIG. 2, the display system 10 can be utilized for providing an image to any of the displays 20 or the combiner 21. The display system 10 is in communication with or includes a radar system 102, a synthetic vision system (SVS) 111 and an enhanced vision system (EVS) 112. The EVS 112 and the SVS 111 are optional in some embodiments. The display system 10 can include an HDD computer 132 and a HUD computer 134. The display system 10 includes a memory 153 for storing an enhanced vision frame from the EVS 112, a memory 152 for storing a synthetic vision frame from the SVS 111, a filter 154, an image renderer 155, a memory 156 for storing the radar image from the image renderer 155, an image merge function module 160, and an image merge control/configuration module 162.

The filter 154, the image renderer 155, the image merge function module 160, and the image merge control/configuration module 162 can be embodied as software modules operating on a computing platform or a processor 175 and can be stored on a non-transitory medium. The processor 175 can be part of or integrated with the radar system 102, the SVS 111, the EVS 112, the HDD display computer 132, or the HUD computer 134 in certain embodiments. In one embodiment, the processor 175 is an independent platform.

The radar system 102 is a weather radar system generally located inside the nose of the aircraft, inside a cockpit of the aircraft, on the top of the aircraft or on the tail of the aircraft in some embodiments. The radar system 102 can include a radar data storage unit 180, a radar antenna 182 and a processor 185. The radar system 102 can be a weather radar system, such as a Multiscan™ radar system from Rockwell Collins, Inc. configured as described herein. The radar system 102 can utilize a split, half or sub-aperture or other technique for obtaining radar data associated with external surroundings in some embodiments. The radar system 102 can use the split or sub-aperture techniques of the radar systems described in U.S. application Ser. Nos. 13/627,788, 12/892,563, 13/250,798, 12/236,464, and 12/167,200 and U.S. Pat. No. 8,077,078, incorporated herein by reference and assigned to the assignee of the present application. The type of the radar system 102 and data gathering techniques are not discussed in the specification in a limiting fashion.

The processor 185 receives radar returns (e.g., weather radar returns data) from the radar antenna 182, processes the radar returns and provides the radar data in radar data storage unit 180. In certain embodiments, the data stored in radar data storage unit 180 can be stored as an image frame representing the data from a radar scan of the external surroundings (e.g., a runway environment).

The radar system 102 provides radar data (e.g., weather radar data) in the storage unit 180 to a filter 154 in one embodiment. In one embodiment, the image renderer 155 or other image generator can generate an image frame from the data stored in the radar data storage unit 180 or filtered by the filter 154 and provide the frame to the memory 156. Alternatively, the processor 185 can build the frame or image based upon radar return data from the radar system 102. Similarly, the SVS 111 can provide data or a frame for an SVS image, which is received by the memory 152. Alternatively, the display system 10 can provide the data or image frame to the memory 152 in response to data from the SVS 111. Similarly, the EVS 112 can provide data or a frame for an EVS image, which is received by the memory 153. Alternatively, the display system 10 can provide the data or image frame to the memory 153 in response to data from the EVS 112.

The radar data associated with the external surroundings can represent detected targets and the location of the detected targets. Targets include terrain, man-made features, objects, runways, etc. Improved angular resolution and range resolution techniques allow the location of the targets to be more accurately determined and represented in image data in some embodiments. The radar system 102 scans the external surroundings in front of the aircraft to sense the location of targets. The radar system 102 can utilize clutter suppression and Doppler filtering to improve performance in some embodiments.

In some embodiments, the radar system 102 provides data representing a 120 degree field of view in accordance with a weather radar sweep which takes approximately five seconds to complete in one embodiment. The sweep can be limited during approach to be a 30 degree sweep which requires five seconds before new data is available for display in certain embodiments. The sweep is directed toward the surface of the Earth so that returns are obtained which allow runway environment features to be detected. Various types of sweeps, scans and timings of sweeps and scans can be utilized without departing from the scope of the invention.

The radar system 102 embodied as a weather radar allows existing avionic equipment to be used as a real-time sensor for providing a radar-derived enhanced image of the external scene topography to the pilot in some embodiments. The image or representation generated by the radar system 102 and provided on the displays 20 or the combiner 21 can function as an EVS to provide situation awareness to the pilot in some embodiments. In other embodiments, the image or representation generated by the radar system 102 and provided on the displays 20 or the combiner 21 can function as an EFVS to allow lower landing minima.

The radar system 102 includes a range resolution module 190 and an angle resolution module 192 in some embodiments. The range resolution module 190 advantageously increases the range resolution of the radar system 102 when compared to conventional weather sensing operations in some embodiments. The angle resolution module 192 advantageously increases the angle resolution of the radar system 102 when compared to conventional weather sensing operations in some embodiments. The increased resolution in range and angle allows a higher resolution image to be provided on the displays 20 and the combiner 21 in some embodiments. The range resolution module 190 and the angle resolution module 192 can be software modules executed by the processor 185.

According to some embodiments, the radar system 102 under control of the angle resolution module 192 can use a beam sharpening method to achieve increased angular resolution. In some embodiments, the radar system 102 can utilize techniques such as beam sharpening (e.g., horizontal beam sharpening) and de-convolution of the beam point spread function for improved angular resolution. In some embodiments, the radar system 102 can use beam sharpening as a process that improves the antenna-induced poor angular resolution (e.g., due to the beam width). There are many methods that can be used such as: Doppler Beam Sharpening, Synthetic Aperture Radar (SAR), Monopulse Radar, Sub-Aperture Radar or Split-Aperture Radar, etc. Mathematical methods can be utilized to determine a center of the radar echo for identifying runway features. Techniques for beam sharpening are discussed in U.S. patent application Ser. Nos. 13/627,788, 12/892,563, 13/250,798, 12/236,464, and 12/167,200 and U.S. Pat. No. 8,077,078 incorporated herein by reference in their entireties.
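
One of the approaches named above, de-convolution of the beam point spread function, could be sketched as a Wiener deconvolution across azimuth; the Gaussian beam shape, the signal-to-noise value, and the scene below are assumptions for illustration only, not the patent's algorithm:

import numpy as np

def wiener_deconvolve(azimuth_profile: np.ndarray, beam_psf: np.ndarray, snr: float = 100.0) -> np.ndarray:
    """Sharpen an azimuth cut of radar data by inverting the beam response in the frequency domain."""
    H = np.fft.fft(beam_psf, azimuth_profile.size)
    Y = np.fft.fft(azimuth_profile)
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)  # Wiener inverse filter
    return np.real(np.fft.ifft(W * Y))

# Example: a broad beam smears two closely spaced point targets; deconvolution separates them.
n = 128
scene = np.zeros(n); scene[60] = 1.0; scene[68] = 0.8
dist = np.minimum(np.arange(n), n - np.arange(n))          # circular distance from index 0
psf = np.exp(-0.5 * (dist / 4.0) ** 2); psf /= psf.sum()   # beam response centered at index 0
blurred = np.real(np.fft.ifft(np.fft.fft(scene) * np.fft.fft(psf)))
sharpened = wiener_deconvolve(blurred, psf)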

The radar system 102 can use the radar antenna 182 configured as a switched aperture antenna for beam sharpening. The radar system 102 can also be configured for sequential lobing or monopulse operation to accurately estimate at which angle the target was located within the radar beam. In some embodiments, the radar beams provided by the radar antenna 182 and returns received by the radar antenna 182 associated with the radar system 102 can be separated into two or more portions and can be used to determine an angle from the radar antenna 182 to a target or a vector from the radar antenna 182 to a target such as a runway feature. The vector can be represented as an angle (boresight angle) and range to the target. Various processes can be utilized to calculate the angle or vector to the target.

The radar system 102 uses the radar antenna 182 that toggles between transmitting and receiving on the full aperture and transmitting on the full aperture while receiving on a partial aperture in some embodiments. These techniques can be used to accurately estimate at which angle the target was located within the radar beam and can be used to improve the accuracy of the Doppler calculations by correcting for those angles. The received returns can be processed to determine a high resolution estimate of a target angle relative to the boresight of the antenna beam. According to some embodiments, the returns can be processed using a complex conjugate multiplication method to determine the target angle. The processing can be related to sequential lobing processing but is executed in the phase domain as opposed to the common amplitude domain in some embodiments.

In some embodiments, the radar system 102 uses sequential lobing techniques in which two antennas located close together are used, alternating back and forth between the two antennas. An amplitude signature or phase signature that varies between the two halves of the antennas may be used to obtain data about target position for detected targets (e.g., an object such as another aircraft, terrain, or a tower). Sequential lobing generally does not use phase comparisons with moving targets because Doppler-induced phase changes contaminate the phase center measurement. However, using a complex conjugate multiply method allows the Doppler-induced phase changes to be removed by cancellation. Therefore, a change in phase center between multiple different sub-apertures may be determined and used to determine the angle to the target.
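
A minimal numerical sketch of this phase-domain idea follows; the wavelength, phase-center baseline, and Doppler values are illustrative assumptions and not taken from the patent:

import numpy as np

def off_boresight_angle(sub_left: np.ndarray, sub_right: np.ndarray,
                        baseline_m: float, wavelength_m: float) -> float:
    """Estimate target angle (radians) from returns of two sub-apertures."""
    z = np.sum(sub_left * np.conj(sub_right))  # complex conjugate multiply, coherently summed
    phase_diff = np.angle(z)                   # Doppler phase common to both channels has cancelled
    return float(np.arcsin(phase_diff * wavelength_m / (2.0 * np.pi * baseline_m)))

# Example: X-band (3.2 cm wavelength), 0.3 m baseline, target 2 degrees off boresight,
# with the same arbitrary Doppler phase progression on both channels.
wl, d, theta = 0.032, 0.3, np.deg2rad(2.0)
doppler = np.exp(1j * 2.0 * np.pi * 0.05 * np.arange(64))
left = doppler * np.exp(1j * np.pi * d * np.sin(theta) / wl)
right = doppler * np.exp(-1j * np.pi * d * np.sin(theta) / wl)
print(np.rad2deg(off_boresight_angle(left, right, d, wl)))  # approximately 2.0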

In some embodiments, the range resolution module 190 provides higher resolution by increasing the effective waveform bandwidth of the radar system 102. The range resolution module 190 can use stepped-frequency compression in some embodiments. To provide higher range resolution, the range resolution module 190 can control the radar system 102 to provide ultra-wideband (UWB) radar beams (e.g., extremely narrow pulses with high power), or to provide intra-pulse compression (frequency or phase modulation of the transmitted pulse) in some embodiments. Frequency coding techniques, including the common linear frequency modulation (LFM) or chirp method and discrete coded segments within the pulse, can be utilized in some embodiments. Phase coding techniques, including binary phase codes as well as various polyphase codes, can be utilized in some embodiments. To provide higher range resolution, the range resolution module 190 can control the radar system 102 to provide interpulse pulse compression or stepped frequency compression (e.g., successive pulses with discrete increasing frequency steps) in some embodiments. In some embodiments, stepped frequency compression advantageously achieves high effective bandwidth with narrow instantaneous bandwidth. The receive bandwidth is smaller, has lower noise bandwidth, and has a higher signal to noise ratio in some embodiments. Analog-to-digital sampling rates are lower (vs. pulse compression) in some embodiments. In addition, stepped frequency compression also has a smaller peak power (e.g., when compared to an impulse), provides flexible transmit frequency control, can "hop" over restricted or undesired transmit frequencies, enables adaptive/cognitive frequency use, and rejects later received clutter from earlier transmit pulses in some embodiments. Further, stepped frequency compression provides returns from clutter in ambiguous ranges that have frequencies different from returns from targets and rejects ambiguous clutter returns in the receiver IF filter of the radar system 102 in some embodiments. Stepped frequency compression generally does not achieve range resolution with a single pulse, requires transmit, receive and processing of a group of pulses for any one bin, and has more pronounced range-Doppler coupling (e.g., different Doppler shifts for each frequency) in some embodiments.
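
As one hedged illustration of stepped frequency compression (not necessarily the patent's exact processing), successive pulses at discrete frequency steps can be combined with an inverse FFT to synthesize a high resolution range profile; the step size and pulse count below are assumptions:

import numpy as np

C = 3.0e8  # speed of light, m/s

def stepped_frequency_profile(samples: np.ndarray, delta_f_hz: float):
    """Return (ranges_m, magnitude) of the synthetic high-resolution range profile."""
    n = samples.size
    profile = np.abs(np.fft.ifft(samples))
    unambiguous_window_m = C / (2.0 * delta_f_hz)      # range window covered by the profile
    ranges = np.arange(n) / n * unambiguous_window_m   # resolution is about C / (2 * n * delta_f)
    return ranges, profile

# Example: 64 steps of 1 MHz (64 MHz effective bandwidth, roughly 2.3 m resolution),
# for a point target at 37 m within the coarse range gate.
n, df, r = 64, 1.0e6, 37.0
phases = -4.0 * np.pi * (df * np.arange(n)) * r / C
ranges, prof = stepped_frequency_profile(np.exp(1j * phases), df)
print(ranges[np.argmax(prof)])  # about 37.5 m, close to the true 37 m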

According to one embodiment, the SVS 111 can be any electronic system or device for providing a computer generated image of the external scene topography. The image can be from the perspective of the aircraft flight deck as derived from aircraft attitude, high-precision navigation solutions, and a database of terrain, obstacles and runway features. Generally, only those terrain, obstacle, and runway features which are contained in the current version of the SVS database are displayed in a conventional system. In some embodiments, the pilot uses the synthetic vision images as enhancements to available visual cues.

According to one embodiment, the EVS 112 can be any electronic system or device for providing a sensed image of the external scene topography. The EVS 112 can be an infrared camera in one embodiment.

In some embodiments, the display system 10 combines or fuses images from the HUD computer 134, the SVS 111 and/or the EVS 112 with the image derived from radar data from the radar system 102 to provide an overall image to the pilot. In some embodiments, the image derived from the radar data is fused with HUD symbology for the displays 20 or the combiner 21.

The SVS 111 can include a terrain database and a processor according to one exemplary embodiment. The terrain database can be used to create a perspective image of the scene in front of the aircraft on a two-dimensional display or a three dimensional display. The terrain database can employ topographical colors similar to those depicted on standard aeronautical charts.

The SVS 111 can also receive aircraft position data from an aircraft data source. The aircraft data source can include any system or sensor (or combination thereof) that provides navigation data or aircraft flight parameters. For example, a typical navigation system in an aircraft has numerous sub-systems. Sub-systems which provide aircraft position data and flight parameter data could include, but are not limited to, an inertial navigation system (INS), a global navigation satellite system (e.g., global positioning system (GPS)), air data sensors, compasses, and a flight management computer (FMC).

In some embodiments, the filter 154 processes the radar data for better image quality. The filter 154 can be located in the radar system 102. The filter 154 can reduce noise and employ anti-speckling filtering, Kalman filtering, Chebyshev filtering, adaptive filtering, smoothing, etc. The filter 154 can also perform anti-aliasing in some embodiments. Techniques for increasing image quality and identifying runway features are discussed in U.S. patent application Ser. No. 14/482,681, incorporated herein by reference.

In order to facilitate generation of clearer images, the processor 185 and/or the filter 154 may be configured to filter the radar return data to identify areas having a reflectivity lower than a predetermined value. In some embodiments, low energy areas may be zeroed out based on their corresponding reflectivity values, such that the area will be rendered transparent. Such filtering may result in a final image with only highly reflective structures in an airport terminal area or runway environment, such as an approach lighting system, a threshold lighting system, and/or a runway edge lighting system.
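
A possible reading of this filtering step, sketched with assumed 8-bit intensities and an assumed threshold value, is a simple threshold that zeroes weak cells so they render transparent:

import numpy as np

def keep_high_reflectivity(intensity: np.ndarray, threshold: int = 64) -> np.ndarray:
    """Zero intensities below the threshold; zero can then be rendered as transparent."""
    return np.where(intensity >= threshold, intensity, 0)

# Example: only the bright returns (e.g., lighting structures) survive the cut.
patch = np.array([[12, 200], [45, 180]], dtype=np.uint8)
print(keep_high_reflectivity(patch))  # [[  0 200] [  0 180]]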

In some embodiments, the radar data from the radar data storage unit 180 is provided to the filter 154 and the image renderer 155, and then provided as image data to the memory 156 and to the HUD computer 134 or the HDD display computer 132 for providing images on the displays 20 or the combiner 21. In another embodiment, the radar data can be provided as image data to the image merge function module 160. The image merge function module 160 receives an EVS frame from the memory 153 or an SVS frame from the memory 152 and merges the data to appropriately display an EVS image or an SVS image with the image derived from the radar data.

The processor 175 executes a fusion processing algorithm for fusing the frames from the memory 152, the memory 153, and the memory 156 provided as video signals in some embodiments. This fusion process may include special formatting (positioning, sizing, cropping, etc.) of specific features or of the entire image from a specific image source based on other sensor inputs or aircraft data. After the combined or fused image has been completed, the entire image is sized to fit appropriately within the total HUD field-of-view (e.g., with HUD symbology) and to conformally overlay the outside scene, which is viewed through the combiner 21 of the HUD. In addition, the overall fused image contrast is standardized in brightness/contrast to support the brightness/contrast controls of the HUD.
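
For illustration only (the weights, frame sizes, and blending rule are assumptions rather than the patent's fusion algorithm), such a fusion step could be approximated as a weighted blend of same-sized frames:

import numpy as np

def fuse_frames(svs: np.ndarray, evs: np.ndarray, radar: np.ndarray,
                weights=(0.3, 0.3, 0.4)) -> np.ndarray:
    """Blend three same-sized 8-bit frames into one fused frame."""
    stack = np.stack([svs, evs, radar]).astype(np.float32)
    w = np.asarray(weights, dtype=np.float32).reshape(3, 1, 1)
    fused = np.clip((stack * w).sum(axis=0), 0, 255)
    return fused.astype(np.uint8)

# Example: three 480 x 640 frames fused before sizing and contrast adjustment for the HUD.
frames = [np.random.randint(0, 256, (480, 640), dtype=np.uint8) for _ in range(3)]
hud_image = fuse_frames(*frames)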

The processors 175 and 185 can be any hardware and/or software processor or processing architecture capable of executing instructions and operating on navigational and radar data. The processors 175 and 185 can be capable of determining navigational information such as altitude, heading, bearing, and location based on data from aircraft sensors. Applicants note that flow 300 can be performed in various equipment on the aircraft including in the HUD computer 134, a display processor, the weather radar system 102, a navigation system, the SVS 111, etc. in accordance with an exemplary embodiment. The processors 175 and 185 may be, or may include, one or more microprocessors, an application specific integrated circuit (ASIC), a circuit containing one or more processing components, a group of distributed processing components, circuitry for supporting a microprocessor, or other hardware configured for processing.

The image merge control/configuration module 162 can provide format adjustments to data. The SVS 111 and the radar system 102 can have their own specific interface type and format. Also, each display of the displays 20 and the combiner 21 may require specific formatting. A standard format can be a format used in HUD processing functions. The image merge control/configuration module 162 can be implemented in hardware, software, or combinations thereof.

Real time images derived from radar data allow the pilot to confirm exactly and reliably the presence of a runway in some embodiments. In one embodiment, localization of the pattern of runway environment features, such as the runway approach lights or the runway edge lights, allows easy recognition of the location of the runway with respect to the aircraft. In some embodiments, the image data can be processed to provide a two-dimensional aircraft situation display (e.g., a vertical profile display or plan view display). In other embodiments, the image data can be processed to provide a three-dimensional or perspective aircraft situation display image representative of the 3-D positions of runway environment features.

With reference to FIG. 3, a flow 300 can be performed by the display system 10 in some embodiments. At an operation 302, the weather radar system 102 provides a weather radar scan comprised of multiple radar beams. The radar beams are provided according to beam sharpening techniques and stepped frequency compression techniques to increase the angular and range resolution in some embodiments. At an operation 304, radar returns are received according to the beam sharpening techniques and the stepped frequency compression techniques to increase the angular and range resolution in some embodiments. At an operation 306, the radar returns are processed to obtain image data in some embodiments. As discussed above, filtering or related techniques can be performed in an optional step by the filter 154. At an operation 308, the image data is merged with other image data, such as overlay symbology or SVS or other EVS images. Operation 308 is optional.

At an operation 310, the image associated with the image data is displayed on a display via a display computer such as the HDD display computer 132 or the HUD computer 134. After operation 310, the flow 300 returns to the operation 302 in some embodiments.
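
The flow 300 can be summarized, purely as hypothetical pseudocode with assumed object and method names, as the following loop:

def flow_300(radar, image_filter, merger, display, merge_enabled=True):
    # Hypothetical object and method names; they do not appear in the patent.
    while True:
        beams = radar.transmit_sharpened_stepped_scan()  # operation 302: sharpened, stepped-frequency beams
        returns = radar.receive(beams)                   # operation 304: receive radar returns
        image_data = image_filter.process(returns)       # operation 306: process returns into image data
        if merge_enabled:                                 # operation 308 is optional
            image_data = merger.merge(image_data)         # merge with symbology, SVS or EVS imagery
        display.show(image_data)                          # operation 310: display, then repeat from 302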

With reference to FIG. 4, an image 400 of the external scene topography derived from radar data associated with the storage unit 180 includes features 402 associated with runway approach lights.

With reference to FIG. 5, an image 600 derived from the radar data is provided with HUD symbols 602. The HUD symbols 602 are shown in static format as a representative example only.

The radar system 102 generally operates by sweeping a radar beam horizontally and/or vertically along the sky for weather detection. For example, the radar system 102 may conduct a first horizontal sweep directly in front of the aircraft and a second horizontal sweep downward at some tilt angle (e.g., 20 degrees down). Returns from different tilt angles may be electronically merged to form a composite image for display on an electronic display, such as the displays 20 and the combiner 21 in the aircraft control center 11. Sensing of the external surroundings can be performed at higher resolutions than the weather sensing and can use one or more beams directed toward the external surroundings. Sensing of the external surroundings can be performed in a more forward looking direction with smaller azimuthal sweeps than are used for weather detection in some embodiments. GPS and/or other navigation information can be used to point the radar beam toward the external surroundings associated with an airport in some embodiments.
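
As a sketch only, the electronic merge of returns from different tilt angles could be as simple as keeping the strongest return per cell; this per-cell maximum is an assumed choice, not necessarily the patent's method:

import numpy as np

def composite_from_tilts(tilt_images: list) -> np.ndarray:
    """Combine images from several tilt angles by keeping the strongest return per cell."""
    return np.maximum.reduce(tilt_images)

# Example: one level sweep and one sweep tilted 20 degrees down, merged for display.
level = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
down20 = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
merged = composite_from_tilts([level, down20])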

In some embodiments, the weather radar system 102 may operate in a weather sense mode until approach or landing. During approach or landing, the weather radar system 102 alternately performs radar data gathering for sensing of the external surroundings, radar data gathering for weather sensing, and radar data gathering for wind shear detection. In some embodiments, during approach or landing, the weather radar system 102 alternately performs radar data gathering for sensing of external surroundings and radar data gathering for wind shear detection or other hazard detection. During approach or landing, the weather radar system 102 alternately performs radar data gathering for sensing of external surroundings and radar data gathering for weather sensing in some embodiments. In some embodiments, weather sensing operations are suspended during approach and landing.
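
The alternation between data-gathering modes during approach could be modeled, as an assumption-laden sketch, by a simple round-robin schedule:

from itertools import cycle

def approach_schedule(modes=("surface_sensing", "weather_sensing", "windshear_detection")):
    """Yield the next radar data-gathering mode in a repeating sequence (illustrative order)."""
    yield from cycle(modes)

# Example: six consecutive scan slots during approach or landing.
slots = approach_schedule()
print([next(slots) for _ in range(6)])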

The scope of this disclosure should be determined by the claims, their legal equivalents and the fact that it fully encompasses other embodiments which may become apparent to those skilled in the art. All structural, electrical and functional equivalents to the elements of the above-described disclosure that are known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the present claims. A reference to an element in the singular is not intended to mean one and only one, unless explicitly so stated, but rather it should be construed to mean at least one. No claim element herein is to be construed under the provisions of 35 U.S.C. § 112, sixth paragraph, unless the element is expressly recited using the phrase "means for." Furthermore, no element, component or method step in the present disclosure is intended to be dedicated to the public, regardless of whether the element, component or method step is explicitly recited in the claims.

Embodiments of the inventive concepts disclosed herein have been described with reference to drawings. The drawings illustrate certain details of specific embodiments that implement the systems and methods and programs of the present disclosure. However, describing the embodiments with drawings should not be construed as imposing any limitations that may be present in the drawings. The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing its operations. Embodiments of the inventive concepts disclosed herein may be implemented using an existing computer processor, or by a special purpose computer processor incorporated for this or another purpose or by a hardwired system.

As noted above, embodiments within the scope of the inventive concepts disclosed herein include program products comprising non-transitory machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media may be any available media that may be accessed by a computer or other machine with a processor. By way of example, such machine-readable media may comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to carry or store desired program code in the form of machine-executable instructions or data structures and which may be accessed by a computer or other machine with a processor. Thus, any such connection is properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions comprise, for example, instructions and data which cause a processor to perform a certain function or group of functions.

Embodiments of the inventive concepts disclosed herein have been described in the general context of method steps which may be implemented in one embodiment by a program product including machine-executable instructions, such as program code, for example in the form of program modules executed by machines in networked environments. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Machine-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.

As previously indicated, embodiments in the present disclosure may be practiced in a networked environment using logical connections to one or more remote computers having processors. Those skilled in the art will appreciate that such network computing environments may encompass many types of computers, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and so on. Embodiments in the disclosure may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.

The database or system memory may include read only memory (ROM) and random access memory (RAM). The database may also include a magnetic hard disk drive for reading from and writing to a magnetic hard disk, a magnetic disk drive for reading from or writing to a removable magnetic disk, and an optical disk drive for reading from or writing to a removable optical disk such as a CD-ROM or other optical media. The drives and their associated machine-readable media provide nonvolatile storage of machine-executable instructions, data structures, program modules and other data for the computer. User interfaces, as described herein, may include a computer with a monitor, a keyboard, a keypad, a mouse, a joystick or other input devices performing a similar function.

It should be noted that although the diagrams herein may show a specific order and composition of method steps, it is understood that the order of these steps may differ from what is depicted. For example, two or more steps may be performed concurrently or with partial concurrence. Also, some method steps that are performed as discrete steps may be combined, steps being performed as a combined step may be separated into discrete steps, the sequence of certain processes may be reversed or otherwise varied, and the nature or number of discrete processes may be altered or varied. The order or sequence of any element or apparatus may be varied or substituted according to alternative embodiments. Accordingly, all such modifications are intended to be included within the scope of the present disclosure.

The foregoing description of embodiments has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the subject matter to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the subject matter disclosed herein. The embodiments were chosen and described in order to explain the principles of the disclosed subject matter and its practical application to enable one skilled in the art to utilize the disclosed subject matter in various embodiments and with various modifications as are suited to the particular use contemplated. Other substitutions, modifications, changes and omissions may be made in the design, operating conditions and arrangement of the embodiments without departing from the scope of the presently disclosed subject matter.

While the exemplary embodiments illustrated in the figures and described above are presently preferred, it should be understood that these embodiments are offered by way of example only. Other embodiments may include, for example, structures with different data mapping or different data. The disclosed subject matter is not limited to a particular embodiment, but extends to various modifications, combinations, and permutations that nevertheless fall within the scope and spirit of the appended claims.

* * * * *

