

United States Patent No.

10867282

Inventor(s)

Glunz

Date of Patent

December 15, 2020


Method and system for GPS enabled model and site interaction and collaboration for BIM and other design platforms



ABSTRACT

A method and system for GPS enabled model and site interaction for Building Information Modeling (BIM) and other design platforms. Collaboration information for actual physical objects at physical locations is automatically collected and associated with virtual objects in virtual object models in a three-dimensional (3D) object modeling program for a selected project, or new virtual objects that did not previously exist are created in the 3D modeling program and associated with the actual physical objects that have been physically added at a project site. The method and system allows two-way real-time and static collaboration between native and new composite XD (e.g., 3D, or lower or higher dimensional) object models from within existing 3D modeling BIM programs (e.g., AUTODESK REVIT, AUTOCAD, VECTORWORKS, etc.) and the actual physical objects at the actual physical locations.
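
The abstract above describes associating GPS-tagged physical objects at a project site with virtual objects in a 3D modeling program. As a purely illustrative aid, not part of the patent specification, the following Python sketch shows one possible data model for that association; every class and field name here is an assumption introduced for illustration.

    # Illustrative sketch only: a possible data model linking a GPS-tagged
    # physical component to a virtual BIM object. Names are assumptions.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class GPSFix:
        latitude: float                      # decimal degrees
        longitude: float                     # decimal degrees
        elevation: Optional[float] = None    # meters, if available

    @dataclass
    class PhysicalComponent:
        component_id: str                    # e.g., bar code or RFID tag value
        location: GPSFix                     # where the component was observed on site
        notes: list = field(default_factory=list)  # "missing", "damaged", etc.

    @dataclass
    class VirtualXDObject:
        model_id: str                        # identifier inside the 3D modeling program
        linked_component: Optional[PhysicalComponent] = None  # two-way association

    # Associate an observed physical component with its virtual counterpart.
    door = PhysicalComponent("RFID-0042", GPSFix(42.0372, -88.2812, 247.0))
    virtual_door = VirtualXDObject("revit:door:basic-36x80", linked_component=door)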


Inventors:

Benjamin F. Glunz (Elgin, IL)

Applicant:

Name                 City     State   Country   Type
Benjamin F. Glunz    Elgin    IL      US

Assignee:

Anguleris Technologies, LLC (Elgin, IL)

Family ID

58664098

Appl. No.:

14/934,478

Filed:

November 6, 2015

Prior Publication Data

Document Identifier        Publication Date
US 20170132568 A1          May 11, 2017

Current U.S. Class:

1/1

Current CPC Class:

G06Q 10/101 (20130101); H04L 67/12 (20130101); H04L 67/10 (20130101); G06T 19/00 (20130101); G06F 30/13 (20200101); G06F 2111/02 (20200101); G06T 2219/024 (20130101); H04L 65/403 (20130101)

Current International Class:

G06Q 10/10 (20120101); G06T 19/00 (20110101); H04L 29/08 (20060101); G06F 30/13 (20200101)

References Cited


U.S. Patent Documents

7295955    November 2007    Sit
7949690    May 2011    McArdle et al.
8427473    April 2013    Elsberg et al.
8463765    June 2013    Lesavich
8484231    July 2013    Li et al.
8558658    October 2013    Kumar et al.
8606554    December 2013    Zimmermann et al.
8793790    July 2014    Khurana et al.
9037564    May 2015    Lesavich et al.
9137250    September 2015    Lesavich et al.
9222771    December 2015    Rosengaus
9361479    June 2016    Lesavich et al.
9400902    July 2016    Schoner
9569771    February 2017    Lesavich et al.
9782936    October 2017    Glunz et al.
9817922    November 2017    Glunz et al.
2003/0117405    June 2003    Hubrecht
2005/0081161    April 2005    Macinnes et al.
2006/0044307    March 2006    Song
2006/0136179    June 2006    Sit
2008/0015823    January 2008    Arnold
2008/0059220    March 2008    Roth et al.
2008/0249756    October 2008    Chaisuparasmikul
2008/0281573    November 2008    Seletsky et al.
2009/0125283    May 2009    Conover
2009/0288012    November 2009    Hertel
2009/0292509    November 2009    Thompson et al.
2010/0070241    March 2010    Opdahl et al.
2010/0110071    May 2010    Elsberg et al.
2010/0223032    September 2010    Reghetti
2010/0280836    November 2010    Lu et al.
2011/0008754    January 2011    Bassett
2011/0050687    March 2011    Alyshev
2011/0054652    March 2011    Heil
2011/0071805    March 2011    Pendyala et al.
2011/0087350    April 2011    Fogel
2011/0093424    April 2011    Zimmermann et al.
2011/0133884    June 2011    Kumar et al.
2011/0166824    July 2011    Haisty
2011/0178831    July 2011    Ravichandran
2011/0208710    August 2011    Lesavich et al.
2011/0218777    September 2011    Chen et al.
2011/0246329    October 2011    Geisner
2011/0285851    November 2011    Plocher et al.
2011/0307281    December 2011    Creveling et al.
2012/0069131    March 2012    Abelow
2012/0105581    May 2012    Berestov
2012/0154389    June 2012    Bohan
2012/0167122    June 2012    Koskimies
2012/0169710    July 2012    Senescu
2012/0203806    August 2012    Panushev
2012/0215500    August 2012    Ciuti et al.
2012/0265707    October 2012    Bushnell
2012/0278622    November 2012    Lesavich et al.
2012/0284596    November 2012    Bushnell et al.
2012/0296609    November 2012    Khan et al.
2012/0296610    November 2012    Hailemariam et al.
2012/0310906    December 2012    Miller et al.
2013/0013265    January 2013    Narayan et al.
2013/0082101    April 2013    Omansky et al.
2013/0091539    April 2013    Khurana et al.
2013/0125029    May 2013    Trimbl
2013/0132466    May 2013    Miller
2013/0144566    June 2013    De Biswas
2013/0144746    June 2013    Phung
2013/0155058    June 2013    Golparvar-Fard
2013/0179207    July 2013    Perez Rodriguez
2013/0182103    July 2013    Lee et al.
2013/0185024    July 2013    Mahasenan et al.
2013/0235029    September 2013    Keough et al.
2013/0238125    September 2013    Suzuki
2013/0257850    October 2013    Chen et al.
2013/0268357    October 2013    Heath
2013/0278631    October 2013    Border
2013/0303193    November 2013    Pallavi
2013/0307682    November 2013    Jerhotova et al.
2013/0314210    November 2013    Schoner et al.
2013/0314232    November 2013    Jerhotova et al.
2013/0315437    November 2013    Kerschner
2013/0335413    December 2013    Wang et al.
2014/0013262    January 2014    Milostic
2014/0035726    February 2014    Schoner
2014/0039845    February 2014    Saban et al.
2014/0051037    February 2014    Fisker
2014/0089209    March 2014    Akcamete et al.
2014/0095122    April 2014    Appleman
2014/0189792    July 2014    Lesavich et al.
2014/0192159    July 2014    Chen et al.
2014/0195420    July 2014    Trickel
2014/0207774    July 2014    Walter et al.
2014/0210856    July 2014    Finn
2014/0214215    July 2014    Han et al.
2014/0304107    October 2014    Clarke
2015/0142152    May 2015    Rezayat
2015/0248502    September 2015    Glunz et al.
2015/0248503    September 2015    Glunz et al.
2015/0248504    September 2015    Glunz et al.
2015/0379301    December 2015    Lesavich et al.
2016/0321530    November 2016    Troy
2016/0321654    November 2016    Lesavich et al.
2016/0335727    November 2016    Jimenez
2017/0132567    May 2017    Glunz

Foreign Patent Documents

204028926    Dec 2014    CN
WO2011051639    Jan 2011    WO
WO2011051639    May 2011    WO
WO2014111587    Jul 2014    WO
PCT/US2015/047229    Aug 2015    WO
PCT/US2015/047229    Aug 2015    WO
WO2016/033345    Mar 2016    WO

Other References


Courtemanche, C. "Mobilizing Project Communications and Collaboration" Construction Executive: Tech Trends (2014) available from <http://enewsletters.constructionexec.com/techtrends/2014/10/mobilizing-project-communications-and-collaboration/>. cited by examiner .
Harty, J. & Laing, R. "Facilitating Meaningful Collaboration in Architectural Design through the adoption of BIM" Proceedings of 2013 IEEE 17th Int'l Conf. on Computer Supported Cooperative Work in Design, pp. 502-508 (2013). cited by examiner .
Bryde, D., et al. "The project benefits of Building Information Modelling (BIM)" Int'l J. Project Management, vol. 31, pp. 971-980 (2013). cited by examiner .
Kahn, S., et al. "Beyond 3D `as-Built` Information using Mobile AR enhancing the Building Lifecycle Management" IEEE 2012 International Conference on Cyberworlds, pp. 29-36 (2012). cited by examiner .
Wikipedia "Hyperlink" (Nov. 2014) available at <https://en.wikipedia.org/w/index.php?title=Hyperlink&oldid=635419813&- gt; (Year: 2014). cited by examiner .
Lkretschmar "Hyperlinks are automatically turning blue and I don't want that" Adobe Forum (Feb. 2014) available at <https://forums.adobe.com/thread/1395695> (Year: 2014). cited by examiner .
Bhardwaj, S., et al. "Cloud Computing: A Study of Infrastructure as a Service (IaaS)" Int'l J. Engineering & Information Tech., vol. 2, No. 1, pp. 60-63 (2010) (Year: 2010). cited by examiner .
Nov. 23, 2015, PCT/US2015/047229--Partial PCT International Search Report (1 page) Anguleris Technologies, LLC--Glunz, et al. cited by applicant .
Jul. 19, 2012, 4D BIM, available at https://en.wikipedia.org/wiki/4D_BIM. cited by applicant .
What is BIM?, available at https://www.autodesk.com/soluions/bim. cited by applicant.

Primary Examiner: Hann; Jay
Attorney, Agent or Firm: Lesavich High-Tech Law Group, S.C. Lesavich; Stephen


CLAIMS



I claim:

1. A method for automated modeling program collaboration, comprising: collecting electronic collaborative information from a camera component, bar code reader, radio frequency identifier (RFID) reader, location identifier component or other sensor component on a mobile target network device with one or more processors along a physical path at a physical location for one or more actual physical components at the physical location, the electronic collaboration information including: exact view points or exact physical location coordinates at physical locations of the one or more actual physical components along the physical path and further including electronic notations comprising: (1) any actual physical component that is missing and needs to be included at the physical location, (2) any physical component that is damaged and needs to be replaced at the physical location, (3) any physical component that is an incorrect component type and needs to be replaced at the physical location, (4) any physical component that is a new composite physical component that did not previously exist in the 3D modeling program comprising a plurality of individual components requiring a new composite virtual X-dimensional (XD) object model in the 3D modeling program, and (5) any physical component that is a new physical component that did not previously exist in the 3D modeling program requiring a new virtual XD object model in the 3D modeling program; and selected ones of the one or more actual physical components at the physical location in the collected collaborative information corresponding to new virtual components and new composite virtual components that do not yet exist as existing native virtual components in the 3D modeling program for which new and new composite XD object models require creation in the 3D modeling program for the physical location and are added thereto by the 3D modeling program; sending a first collaboration message from the target network device to a project management application on a three-dimensional (3D) modeling program executing on a server network device with one or more processors via a communications network including the collected electronic collaborative information; receiving the first collaboration message on the project management application on a three-dimensional (3D) modeling program executing on the server network device with one or more processors via the communications network from the mobile target network device with one or more processors, the first collaboration message including electronic collaborative information for the one or more actual physical components at the physical location associated with a set of one or more virtual components in one or more virtual X-dimensional (XD) object models for a selected project in the 3D modeling program; selecting on the project management application, one or more blank generic XD object model templates for creating one or more preliminary XD model objects for storing the selected ones of the one or more actual physical components that correspond to the new virtual components and the new composite virtual components that do not yet exist as existing virtual components in the 3D modeling program, wherein the one or more blank generic XD object model templates include information fields to create a final new and a final new composite XD object models that require less data storage and require less processing power than existing XD object models created from object model templates that already exist in the 3D 
modeling program; creating on a library application associated with the project management application, one or more final new and final new composite XD object models that correspond to the new and the new composite virtual components that do not yet exist as existing virtual components in the 3D modeling program with the selected one or more blank generic XD object model templates and information from the first collaboration message, wherein the library application dynamically changes the one or more final new and final new composite XD object models anytime the collected electronic collaboration information is changed; creating from the project management application on the server network device a collaboration object including: (1) the set of one or more virtual components included in the one or more virtual final new non-native, final new composite, and existing X-dimensional (XD) object models for the selected project in the 3D modeling program associated with the one or more actual physical components at the physical location (2) physical collaboration information comprising: an exact physical view point or exact physical location coordinates along the physical path for the one or more actual physical components at the physical location collected by the mobile network target device allowing the mobile target network device to return to the exact physical view point or exact physical location coordinates the electronic collaborative information was originally collected by the mobile target network device; and (3) virtual collaboration information comprising: collecting in the 3D modeling program an exact virtual view point or exact virtual location coordinates along a virtual path for the set of one or more virtual components at a virtual location in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models in the 3D modeling program, the virtual path at the virtual location in the 3D modeling program corresponding to the physical path at the physical location; storing from the project management application the received electronic collaboration information from the first collaboration message in the created collaboration object; creating from the project management application an electronic communications path link to the created collaboration object; inserting the electronic communications path link from the project management application into a second collaboration message to the created collaboration object changing one or more visual display characteristics of the electronic communications path link from the project management application to visually indicate collaboration information is available for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models for the selected project in the 3D modeling program; sending from the project management application to the mobile target network device via the communications network the second collaboration message indicating collaboration information is available for the one or more final new non-native, final new composite, and existing XD objects models on the 3D modeling program, the electronic communications path link in the second collaboration message providing collaboration information for the one or more actual physical components at the physical location associated with the set of one or more 
virtual components in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models for the selected project in the 3D modeling program, executing the electronic communications path link in the second collaboration message into the created collaboration object providing (1) the physical collaboration information comprising: the exact physical view point or exact physical location coordinates for the one or more actual physical components along the physical path at the physical location collected by the mobile target device allowing the mobile target network device to return to the exact physical view point or exact physical location coordinates the electronic collaborative information was originally collected by the mobile target network device; (2) the virtual collaboration information comprising the exact virtual view point or exact virtual location coordinates for the set of one or more virtual components in the one or more final new and existing virtual X-dimensional (XD) object models in the 3D modeling program collected along the virtual path at the virtual location allowing return to the exact virtual view point or exact virtual location coordinates along the virtual path at the virtual location the virtual electronic collaborative information was collected; and (3) comparison information determined in the 3D modeling program with the physical collaboration information and the virtual collaboration information by comparing the one or more actual physical components along the physical path at the physical location for the selected project with the set of one or more virtual components in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models along the virtual path at the virtual location in the 3D modeling program to determine any differences between locations, spacing and distances of the set of one or more virtual components and the one or more actual physical components and determining in the 3D modeling program with the physical collaboration information and the virtual collaboration information whether any of the actual physical components are missing and need to be included at the physical location, are damaged and need replacement at the physical location, and are an incorrect component type and need replacement at the physical location for the selected project, thereby providing with 3D modeling program a confirmation of a one-to-one correlation between all actual physical objects at the physical location and all virtual objects at the virtual location for the selected project; and sending from the project management application to one or more other network devices each with one or more processors via the communications network a third collaboration message indicating collaboration information is available via the electronic communications path link for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models for the selected project in the 3D modeling program, thereby improving collaboration and providing two-way, multimedia collaboration between the projection management application on the server network device and the one or more other network devices.
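
As an illustrative aid only, not part of the claims or the specification, the following Python sketch outlines the three-message flow recited in claim 1: a mobile site device sends collected collaboration information to a project management application, which stores it in a collaboration object, creates an electronic communications path link, and notifies the site device and other collaborators. Every function, field, and URL scheme below is a hypothetical stand-in.

    # Sketch of the claimed three-message collaboration flow; all names are assumptions.
    import json
    import uuid

    COLLABORATION_STORE = {}  # stands in for storage on the server network device

    def send_first_message(collected_info: dict) -> dict:
        """Mobile target device sends collected site data to the server."""
        return {"type": "collaboration-1", "payload": collected_info}

    def create_collaboration_object(message: dict) -> str:
        """Project management application stores the data and returns a path link."""
        object_id = str(uuid.uuid4())
        COLLABORATION_STORE[object_id] = message["payload"]
        # The electronic communications path link points back at the stored object.
        return f"collab://project/{object_id}"

    def send_second_and_third_messages(link: str, recipients: list) -> list:
        """Notify the site device and other collaborators that data is available."""
        # Visual display characteristics (e.g., color) travel with the link metadata.
        notice = {"type": "collaboration-2/3", "link": link, "display": {"color": "blue"}}
        return [dict(notice, to=r) for r in recipients]

    # Example run of the flow with fabricated sample data.
    site_data = {"component": "RFID-0042", "gps": [42.0372, -88.2812], "note": "damaged"}
    msg1 = send_first_message(site_data)
    link = create_collaboration_object(msg1)
    print(json.dumps(send_second_and_third_messages(link, ["tablet-1", "architect-pc"]), indent=2))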

2. The method of claim 1 wherein the 3D modeling program includes a Building Information Modeling (BIM) modeling program.

3. The method of claim 2 where the BIM modeling program includes a program for: (1) designing a building, its structure and its components with 3D models, annotating the 3D models with 2D drafting elements and accessing building information from a building models database, (2) creating 3D digital prototypes used in design, visualization and simulation of 2D and 3D products; (3) completing 2D and 3D computer-aided design (CAD); (4) providing drafting, technical drawing and 3D modeling; (5) providing design, construction and operation of infrastructure; (6) adding Graphic Information System (GIS) and spatial capabilities to CAD; (7) providing specialized solutions for handling common aspects of aesthetics and engineering during a design process of a built environment including buildings, interiors and urban areas; or (8) providing solid modeling, assembly modeling and drafting, finite element analysis, direct and parametric modeling, sub-divisional and non-uniform rational basis spline (NURBS) surfacing and numerical control (NC) and tooling functionality.

4. The method of claim 1 wherein the XD modeling object includes a modeling object for a Building Information Modeling (BIM) modeling program, the BIM modeling program including one or more programs for: (1) designing a building, its structure and its components with 3D models, annotating the 3D models with 2D drafting elements and accessing building information from a building models database, (2) creating 3D digital prototypes used in design, visualization and simulation of 2D and 3D products; (3) completing 2D and 3D computer-aided design (CAD); (4) providing drafting, technical drawing and 3D modeling; (5) providing design, construction and operation of infrastructure; (6) adding Graphic Information System (GIS) and spatial capabilities to CAD; (7) providing specialized solutions for handling common aspects of aesthetics and engineering during a design process of a built environment including buildings, interiors and urban areas; or (8) providing solid modeling, assembly modeling and drafting, finite element analysis, direct and parametric modeling, sub-divisional and non-uniform rational basis spline (NURBS) surfacing and numerical control (NC) and tooling functionality.

5. The method of claim 1 wherein the mobile target network device includes a tablet computer, smart phone, personal digital/data assistant (PDA), digital camera, portable game console, wearable network device, unmanned aerial vehicle (UAV), an unmanned ground vehicle (UGV) or a combination UGV with UAV.

6. The method of claim 1 wherein the bar code reader includes a QR bar code reader.

7. The method of claim 1 wherein the location identifier component includes a Global Positioning System (GPS) location identifier component.

8. The method of claim 7 wherein the GPS location identifier component provides longitude and latitude or longitude, latitude and elevation, location coordinate information.
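
As a hedged illustration of how the longitude/latitude coordinates collected under claims 7 and 8 might be compared against the coordinates stored with a virtual component (the comparison step recited in claim 1), the following Python sketch uses the haversine great-circle distance; the 0.5 m tolerance is an arbitrary assumption, not a value taken from the patent.

    # Compare an observed site coordinate with the coordinate expected by the model.
    from math import radians, sin, cos, asin, sqrt

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance between two lat/lon points, in meters."""
        earth_radius_m = 6371000.0
        dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
        return 2 * earth_radius_m * asin(sqrt(a))

    expected = (42.03720, -88.28120)   # coordinate stored with the virtual component
    observed = (42.03721, -88.28118)   # coordinate collected on site
    offset_m = haversine_m(*expected, *observed)
    print(f"offset: {offset_m:.2f} m", "OK" if offset_m < 0.5 else "flag for review")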

9. The method of claim 1 wherein the received collaborative electronic information includes electronic text, graphical information, digital pictures, audio information, video information, RFID information, bar code information or location coordinate information.

10. The method of claim 1 wherein the communications network includes a cloud communications network comprising: one or more public communications networks, one or more private communications networks, one or more community networks, or one or more hybrid networks; and the cloud communications network including: a cloud computing Infrastructure as a Service (IaaS), a cloud Platform as a Service (PaaS), and Specific cloud software services as a Service (SaaS), including project management services for multi-media collaboration available from the project management application available from existing 3D modeling programs.

11. The method of claim 1 further comprising: associating the one or more created final new and final new composite X-Dimensional BIM object models in the 3D modeling program with the one or more actual physical components and the electronic collaboration information from the received first message.

12. The method of claim 1 wherein the server network device and the mobile target network device include one or more wireless communications interfaces comprising: cellular telephone, 802.11a, 802.11b, 802.11g, 802.11n, 802.15.4 (ZigBee), Wireless Fidelity (Wi-Fi), Wi-Fi Aware, Worldwide Interoperability for Microwave Access (WiMAX), ETSI High Performance Radio Metropolitan Area Network (HIPERMAN), aviation communications, Near Field Communications (NFC), Machine-to-Machine (M2M), Bluetooth or infra data association (IrDA) wireless communication interfaces.

13. The method of claim 1 further comprising: creating automatically on the project management application a new set of architectural drawings, shop drawings, or manufacturing drawings including the received electronic collaboration information.

14. The method of claim 1 further comprising: displaying from management application on one or more other target network devices each with one or more processors via the communications network indicating collaboration information is available for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more virtual X-dimensional (XD) object models for a selected project in the 3D modeling program.

15. The method of claim 1 further comprising: collecting a plurality of information from the camera, bar code reader, radio frequency identifier (RFID) reader, location identifier component or other sensor component on the mobile target network device about the plurality of actual physical components at the physical location for the desired project, wherein the plurality of actual physical components correspond to the plurality of virtual components for one or more X-dimensional (XD) object models that already existed in the three-dimensional (3D) modeling program as existing XD object models and correspond to the new virtual components and new composite virtual components; and sending in real-time the collected plurality of information from the mobile network device via the communications network to the project management application executing from the 3D modeling program on the server network device.

16. The method of claim 1 further comprising: collecting actual collaborative information from an actual network device with a camera component including an actual camera point-of-view viewing cone and actual physical location information at an actual project site or virtual collaborative information from a virtual network device including a virtual camera point-of-view viewing cone and actual physical location information for the actual project site in a 3D modeling program; and sending the collected actual collaborative information or virtual collected collaborative information in real-time from the actual network device via the communications network or virtual network device to the project management application executing from the 3D modeling program on the server network device.

17. The method of claim 1 wherein the step of changing one or more visual display characteristics includes changing a font, color, line characteristic, shape characteristic, blinking or not blinking characteristic of the electronic communications path link.
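
As a purely illustrative sketch, not the patent's method, one way to change the visual display characteristics recited in claim 17 is to attach style attributes to the rendered path link; the HTML rendering and CSS values below are assumptions introduced here.

    # Render the electronic communications path link with changed display characteristics.
    def render_link(url: str, has_collaboration_info: bool) -> str:
        """Highlight the link when collaboration information is available."""
        style = ("color:red; font-weight:bold; text-decoration:underline"
                 if has_collaboration_info else "color:gray")
        return f'<a href="{url}" style="{style}">collaboration object</a>'

    print(render_link("collab://project/1234", has_collaboration_info=True))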

18. The method of claim 1 wherein the first, second and third messages include an instant message (IM), short message service (SMS) message, social media message, e-mail message, data message, audio message or video message.

19. A non-transitory computer readable medium having stored therein a plurality of instructions configured for causing one more processors on one more network devices connected to a communications network to execute the steps of: collecting electronic collaborative information from a camera component, bar code reader, radio frequency identifier (RFID) reader, location identifier component or other sensor component on a mobile target network device with one or more processors along a physical path at a physical location for one or more actual physical components at the physical location, the electronic collaboration information including: exact view points or exact physical location coordinates at physical locations of the one or more actual physical components along the physical path and further including electronic notations comprising: (1) any actual physical component that is missing and needs to be included at the physical location, (2) any physical component that is damaged and needs to be replaced at the physical location, (3) any physical component that is an incorrect component type and needs to be replaced at the physical location, (4) any physical component that is a new composite physical component that did not previously exist in the 3D modeling program comprising a plurality of individual components requiring a new composite virtual X-dimensional (XD) object model in the 3D modeling program, and (5) any physical component that is a new physical component that did not previously exist in the 3D modeling program requiring a new virtual XD object model in the 3D modeling program; and selected ones of the one or more actual physical components at the physical location in the collected collaborative information corresponding to new virtual components and new composite virtual components that do not yet exist as existing virtual components in the 3D modeling program for which new and new composite XD object models require creation in the 3D modeling program for the physical location and are added thereto by the 3D modeling program; sending a first collaboration message from the target network device to a project management application on a three-dimensional (3D) modeling program executing on a server network device with one or more processors via a communications network including the collected electronic collaborative information; receiving the first collaboration message on the project management application on a three-dimensional (3D) modeling program executing on the server network device with one or more processors via the communications network from the mobile target network device with one or more processors, the first collaboration message including electronic collaborative information for the one or more actual physical components at the physical location associated with a set of one or more virtual components in one or more virtual X-dimensional (XD) object models for a selected project in the 3D modeling program; selecting on the project management application, one or more blank generic XD object model templates for creating one or more preliminary XD model objects for storing the selected ones of the one or more actual physical components that correspond to the new virtual components and the new composite virtual components that do not yet exist as existing virtual components in the 3D modeling program, wherein the one or more blank generic XD object model templates include information fields to create a final new and a final new composite XD object models that require 
less data storage and require less processing power than existing XD object models created from object model templates that already exist in the 3D modeling program; creating on a library application associated with the project management application, one or more final new and final new composite XD object models that correspond to the new and the new composite virtual components that do not yet exist as existing virtual components in the 3D modeling program with the selected one or more blank generic XD object model templates and information from the first collaboration message, wherein the library application dynamically changes the one or more final new and final new composite XD object models anytime the collected electronic collaboration information is changed; creating from the project management application on the server network device a collaboration object including: (1) the set of one or more virtual components included in the one or more virtual final new non-native, final new composite, and existing X-dimensional (XD) object models for the selected project in the 3D modeling program associated with the one or more actual physical components at the physical location (2) physical collaboration information comprising: an exact physical view point or exact physical location coordinates along the physical path for the one or more actual physical components at the physical location collected by the mobile network target device allowing the mobile target network device to return to the exact physical view point or exact physical location coordinates the electronic collaborative information was originally collected by the mobile target network device; and (3) virtual collaboration information comprising: collecting in the 3D modeling program an exact virtual view point or exact virtual location coordinates along a virtual path for the set of one or more virtual components at a virtual location in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models in the 3D modeling program, the virtual path at the virtual location in the 3D modeling program corresponding to the physical path at the physical location; storing from the project management application the received electronic collaboration information from the first collaboration message in the created collaboration object; creating from the project management application an electronic communications path link to the created collaboration object; inserting the electronic communications path link from the project management application into a second collaboration message to the created collaboration object changing one or more visual display characteristics of the electronic communications path link from the project management application to visually indicate collaboration information is available for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models for the selected project in the 3D modeling program; sending from the project management application to the mobile target network device via the communications network the second collaboration message indicating collaboration information is available for the one or more final new non-native, final new composite, and existing XD objects models on the 3D modeling program, the electronic communications path link in the second collaboration message 
providing collaboration information for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models for the selected project in the 3D modeling program, executing the electronic communications path link in the second collaboration message into the created collaboration object providing (1) the physical collaboration information comprising: the exact physical view point or exact physical location coordinates for the one or more actual physical components along the physical path at the physical location collected by the mobile target device allowing the mobile target network device to return to the exact physical view point or exact physical location coordinates the electronic collaborative information was originally collected by the mobile target network device; (2) the virtual collaboration information comprising the exact virtual view point or exact virtual location coordinates for the set of one or more virtual components in the one or more final new and existing virtual X-dimensional (XD) object models in the 3D modeling program collected along the virtual path at the virtual location allowing return to the exact virtual view point or exact virtual location coordinates along the virtual path at the virtual location the virtual electronic collaborative information was collected; and (3) comparison information determined in the 3D modeling program with the physical collaboration information and the virtual collaboration information by comparing the one or more actual physical components along the physical path at the physical location for the selected project with the set of one or more virtual components in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models along the virtual path at the virtual location in the 3D modeling program to determine any differences between locations, spacing and distances of the set of one or more virtual components and the one or more actual physical components and determining in the 3D modeling program with the physical collaboration information and the virtual collaboration information whether any of the actual physical components are missing and need to be included at the physical location, are damaged and need replacement at the physical location, and are an incorrect component type and need replacement at the physical location for the selected project, thereby providing with 3D modeling program a confirmation of a one-to-one correlation between all actual physical objects at the physical location and all virtual objects at the virtual location for the selected project; and sending from the project management application to one or more other network devices each with one or more processors via the communications network a third collaboration message indicating collaboration information is available via the electronic communications path link for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models for the selected project in the 3D modeling program, thereby improving collaboration and providing two-way, multimedia collaboration between the projection management application on the server network device and the one or more other 
network devices.

20. A system for automated modeling program collaboration, comprising in combination: one or more network devices each with one or more processors connected to a communications network; one or more non-transitory computer readable mediums on the one or more network devices including a plurality of instructions in a configuration: for collecting electronic collaborative information from a camera component, bar code reader, radio frequency identifier (RFID) reader, location identifier component or other sensor component on a mobile target network device with one or more processors along a physical path at a physical location for one or more actual physical components at the physical location, the electronic collaboration information including: exact view points or exact physical location coordinates at physical locations of the one or more actual physical components along the physical path and further including electronic notations comprising: (1) any actual physical component that is missing and needs to be included at the physical location, (2) any physical component that is damaged and needs to be replaced at the physical location, (3) any physical component that is an incorrect component type and needs to be replaced at the physical location, (4) any physical component that is a new composite physical component that did not previously exist in the 3D modeling program comprising a plurality of individual components requiring a new composite virtual X-dimensional (XD) object model in the 3D modeling program, and (5) any physical component that is a new physical component that did not previously exist in the 3D modeling program requiring a new virtual XD object model in the 3D modeling program; and selected ones of the one or more actual physical components at the physical location in the collected collaborative information corresponding to new virtual components and new composite virtual components that do not yet exist as existing virtual components in the 3D modeling program for which new and new composite XD object models require creation in the 3D modeling program for the physical location and are added thereto by the 3D modeling program; for sending a first collaboration message from the target network device to a project management application on a three-dimensional (3D) modeling program executing on a server network device with one or more processors via a communications network including the collected electronic collaborative information; for receiving the first collaboration message on the project management application on a three-dimensional (3D) modeling program executing on the server network device with one or more processors via the communications network from the mobile target network device with one or more processors, the first collaboration message including electronic collaborative information for the one or more actual physical components at the physical location associated with a set of one or more virtual components in one or more virtual X-dimensional (XD) object models for a selected project in the 3D modeling program; for selecting on the project management application, one or more blank generic XD object model templates for creating one or more preliminary XD model objects for storing the selected ones of the one or more actual physical components that correspond to the new virtual components and the new composite virtual components that do not yet exist as existing virtual components in the 3D modeling program, wherein the one or more blank generic XD object 
model templates include information fields to create a final new and a final new composite XD object models that require less data storage and require less processing power than existing XD object models created from object model templates that already exist in the 3D modeling program; for creating on a library application associated with the project management application, one or more final new and final new composite XD object models that correspond to the new and the new composite virtual components that do not yet exist as existing virtual components in the 3D modeling program with the selected one or more blank generic XD object model templates and information from the first collaboration message, wherein the library application dynamically changes the one or more final new and final new composite XD object models anytime the collected electronic collaboration information is changed; for creating from the project management application on the server network device a collaboration object including: (1) the set of one or more virtual components included in the one or more virtual final new non-native, final new composite, and existing X-dimensional (XD) object models for the selected project in the 3D modeling program associated with the one or more actual physical components at the physical location (2) physical collaboration information comprising: an exact physical view point or exact physical location coordinates along the physical path for the one or more actual physical components at the physical location collected by the mobile network target device allowing the mobile target network device to return to the exact physical view point or exact physical location coordinates the electronic collaborative information was originally collected by the mobile target network device; and (3) virtual collaboration information comprising: collecting in the 3D modeling program an exact virtual view point or exact virtual location coordinates along a virtual path for the set of one or more virtual components at a virtual location in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models in the 3D modeling program, the virtual path at the virtual location in the 3D modeling program corresponding to the physical path at the physical location; for storing from the project management application the received electronic collaboration information from the first collaboration message in the created collaboration object; for creating from the project management application an electronic communications path link to the created collaboration object; for inserting the electronic communications path link from the project management application into a second collaboration message to the created collaboration object changing one or more visual display characteristics of the electronic communications path link from the project management application to visually indicate collaboration information is available for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models for the selected project in the 3D modeling program; for sending from the project management application to the mobile target network device via the communications network the second collaboration message indicating collaboration information is available for the one or more final new non-native, final new 
composite, and existing XD objects models on the 3D modeling program, the electronic communications path link in the second collaboration message providing collaboration information for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models for the selected project in the 3D modeling program, executing the electronic communications path link in the second collaboration message into the created collaboration object providing (1) the physical collaboration information comprising: the exact physical view point or exact physical location coordinates for the one or more actual physical components along the physical path at the physical location collected by the mobile target device allowing the mobile target network device to return to the exact physical view point or exact physical location coordinates the electronic collaborative information was originally collected by the mobile target network device; (2) the virtual collaboration information comprising the exact virtual view point or exact virtual location coordinates for the set of one or more virtual components in the one or more final new and existing virtual X-dimensional (XD) object models in the 3D modeling program collected along the virtual path at the virtual location allowing return to the exact virtual view point or exact virtual location coordinates along the virtual path at the virtual location the virtual electronic collaborative information was collected; and (3) comparison information determined in the 3D modeling program with the physical collaboration information and the virtual collaboration information by comparing the one or more actual physical components along the physical path at the physical location for the selected project with the set of one or more virtual components in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models along the virtual path at the virtual location in the 3D modeling program to determine any differences between locations, spacing and distances of the set of one or more virtual components and the one or more actual physical components and determining in the 3D modeling program with the physical collaboration information and the virtual collaboration information whether any of the actual physical components are missing and need to be included at the physical location, are damaged and need replacement at the physical location, and are an incorrect component type and need replacement at the physical location for the selected project, thereby providing with 3D modeling program a confirmation of a one-to-one correlation between all actual physical objects at the physical location and all virtual objects at the virtual location for the selected project; for sending from the project management application to one or more other network devices each with one or more processors via the communications network a third collaboration message indicating collaboration information is available via the electronic communications path link for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more final new non-native, final new composite, and existing virtual X-dimensional (XD) object models for the selected project in the 3D modeling program, thereby improving collaboration and 
providing two-way, multimedia collaboration between the projection management application on the server network device and the one or more other network devices; for creating automatically on the project management application a new set of architectural drawings, shop drawings, or manufacturing drawings including the received electronic collaboration information; for displaying from management application on one or more other target network devices each with one or more processors via the communications network indicating collaboration information is available for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more virtual X-dimensional (XD) object models for a selected project in the 3D modeling program; for associating the created new X-Dimensional BIM object model for the 3D modeling program with one or more actual physical components and the electronic collaboration information from the received first message; for collecting a plurality of information from the camera, bar code reader, radio frequency identifier (RFID) reader, location identifier component or other sensor component on the mobile target network device about the plurality of actual physical components at the physical location for the desired project, wherein the plurality of actual physical components correspond to the plurality of virtual components for one or more X-dimensional (XD) object models that already exist in the three-dimensional (3D) modeling program or correspond to new virtual components that do not yet exist in the 3D modeling program for which new XD object models require creation in the 3D modeling program; for sending in real-time the collected plurality of information from the mobile network device via the communications network to the project management application executing from the 3D modeling program on the server network device; for collecting actual collaborative information from an actual network device with a camera component including an actual camera point-of-view viewing cone and actual physical location information at an actual project site or virtual collaborative information from a virtual network device including a virtual camera point-of-view viewing cone and actual physical location information for the actual project site in a 3D modeling program; and for sending the collected actual collaborative information or virtual collected collaborative information in real-time from the actual network device via the communications network or virtual network device to the project management application executing from the 3D modeling program on the server network device.


DESCRIPTION




FIELD OF THE INVENTION



This invention relates to creating three dimensional models that are used to create actual products. More specifically, it relates to a method and system for Global Positioning System (GPS) enabled model and site interaction for Building Information Modeling (BIM) and other design platforms.


BACKGROUND OF THE INVENTION



Building Information Modeling (BIM) is a process including the generation and management of digital representations of physical and functional characteristics of physical spaces. Building Information Models (BIMs) are files (often but not always in proprietary formats and containing proprietary data) which can be exchanged or networked to support decision-making.

Current BIM software is used by individuals, businesses and government authorities who plan, design, construct, operate and maintain diverse physical infrastructures, from water, wastewater, electricity, gas, refuse and communication utilities to roads, bridges and ports, from houses, apartments, schools and shops to offices, factories, warehouses and prisons, etc.

However, there are a number of problems associated with creating three dimensional (3D) models from two dimensional (2D) data for building information modeling (BIM).

One problem is that poor BIM software interoperability has long been regarded as an obstacle to industry efficiency in general and to BIM adoption in particular. In August 2004 the US National Institute of Standards and Technology (NIST) issued a report which conservatively estimated that $15.8 billion was lost annually by the U.S. capital facilities industry due to inadequate interoperability arising from "the highly fragmented nature of the industry, the industry's continued paper based business practices, a lack of standardization, and inconsistent technology adoption among stakeholders".

Another problem is existing BIM 3D modeling programs cannot be easily extended for new products.

Another problem is that it is not easy to create new 3D models for existing 3D modeling programs.

Another problem is that new 3D models used on one 3D modeling program are not in a format that is directly compatible with a second 3D modeling program.

Another problem is that 3D modeling program file formats are proprietary, which makes it difficult to create new 3D models for them.

Another problem is that it is not easy to create composite 3D models from plural 3D object models from different sources.

Another problem is that with existing 3D modeling platforms it is hard to cooperate with other users.

Another problem is that with existing 3D modeling platforms it is hard to collect useful analytics.

Another problem is that many times actual physical objects are added to a job site that do not have corresponding virtual objects in a 3D modeling program.

Another problem is that physical objects may not exist at a job site or be damaged and need replacement. There is no way to determine this from a 3D modeling program.

There have been attempts to solve some of the problems associated with adding new models for BIM. For example, U.S. Published Application No. US 20100070241A1 published by Per-Olav Opdahl teaches "A computer-implemented method for producing a data representation of a specific building, in particular in the form of a house instance building information model (house instance BIM), by using a processor. A set of house product configuration data and a configuration rule instruction are retrieved, a first configuration input is received, the first configuration input triggers (a) the application of the at least one configuration rule instruction to the set of house product configuration data by using a configuration program that is executed in the processor to instantiate a configured house instance structure (HIS) formed of a plurality of HIS elements, or (b) the retrieval of a pre-instantiated house instance structure (HIS) and the application of the at least one configuration rule instruction to the set of house product configuration data and to the pre-instantiated house instance structure by using a configuration program that is executed in the processor to produce a configured house instance structure (HIS) formed of a plurality of HIS elements. From a product BIM (PBIM) element data store (160) a plurality of PBIM elements are retrieved, each of which corresponds to a respective HIS element among the plurality of HIS elements. A BIM element instance is created for each retrieved PBIM element by applying to the retrieved PBIM element a parameter value or a relational property on the basis of information carried by the respective HIS element. A house instance BIM is formed by assembling the created BIM element instances."

U.S. Published Patent Application US20120215500A1 published by Elisa Ciuti teaches "A method and associated device relate to the enhancement of a digital model of a building using a computer, wherein: a) the digital model is analyzed in order to identify the elements that make up the building and have specific characteristics in the building; b) construction products which have properties that match said characteristics are defined; and c) the digital model is enhanced by adding, for every element that makes up the model, data from at least one list of suitable construction products."

U.S. Published Patent Application US20120310906A1 published by Miller et al. teaches "Systems and methods are disclosed for tracking the content of building information models. Embodiments include various computer-implemented methods for the tagging of BIM information, and for the monitoring of modification events involving BIM information. In addition, activities involving physical elements associated with BIM information are tracked. Providers of BIM content can tag content prior to distribution, enabling an internet-based service to track usage, enabling improved service to consumers of that BIM content. Designers can tag BIM content and BIM elements for tracking during the useful life of a building. Internet based messaging protocols can be used for communication between web services, client services and client applications. Monitoring and communication services function unattended. The system includes integration with BIM design applications, as well as stand-alone, end user system applications and browser interfaces. Analytic tools can be used to report on the tracking data."

U.S. Published Patent Application US20120310906A1 published by Arnold et al. teaches "A building information management (BIM) system is provided with a library platform that supports a toolset with novel functionality. Embodiments of the invention provide a library of products that can be used in a BIM and provide a virtual product set with improved functionality and more detailed information about the products. The library of products includes virtual products that comprise parametrically described data objects. The toolset includes an editor with which the virtual products can be edited and modified. The library of virtual products can be configured for interoperability with multiple BIM systems."

U.S. Published Patent Application US 20130303193A1 published by Pallavi et al. teaches "Systems, methods, and computer-readable and executable instructions are provided for implementing a BIM-aware location based application on a mobile device. Implementing a BIM-aware location based application on a mobile device can include displaying a floor plan of a building on the mobile device. Implementing a BIM-aware location based application on a mobile device can also include implementing a number of BIM equipment representations throughout the floor plan of the building on the mobile device. Implementing a BIM-aware location based application on a mobile device can also include providing real-time status information for the number of BIM equipment representations on the mobile device. Furthermore, implementing a BIM-aware location based application on a mobile device can include updating the floor plan with the implanted number of BIM equipment representations and real-time status information based on a determined location of the mobile device."

U.S. Published Patent Application US20140304107A1 published by Clarke teaches "The present invention relates to systems, methods, and devices for using RFID-tagged items for omnichannel shopping and automatically reading and locating those items. Robots for automated RFID reading are disclosed. The present invention discloses Webrooming 2.0 (WR2.0) which will offer shoppers new views and tools. WR2.0 offers shoppers a bird's eye view of equivalent items in local retail stores. WR2.0 tools empower shoppers with preemptive purchasing power: the ability to redirect their online purchases from any online web store to a local retail store."

U.S. Published Patent Application US20150248503 published by Glunz et al. teaches "a method and system for creating three dimensional (3D) models from two dimensional (2D) data for building information modeling (BIM). The method and system allow new, 2D, 3D and higher dimensional models to be created for existing 3D modeling programs (e.g., AUTODESK REVIT, AUTOCAD, VECTORWORKS, MICROSTATION, ARCHICAD, etc.). The new models are used to enhance and extend existing 3D modeling programs. The new models can also be used to directly create physical objects (e.g., windows, doors, etc.) represented by the new models with robots, 3D printers and manufacturing machines."

U.S. Published Patent Application US20150248504 published by Glunz et al. teaches "a method and system for creating composite three dimensional (3D) models for building information modeling (BIM). The method and system provides the creation of new composite 3D and higher dimensional models from plural different 3D models from plural different manufacturers for existing 3D modeling (e.g., AUTODESK REVIT, AUTODESK INVENTOR, AUTOCAD, SKETCHUP, VECTORWORKS, MICROSTATION, ARCHICAD, SOLIDWORKS, PROE, etc.) The new composite 3D models are used to enhance and extend existing 3D modeling programs. The new models can also be used to directly create new physical objects (e.g., windows, doors, etc.) that never existed before with robots, 3D printers and manufacturing machines."

Chinese Patent Application CN204028926U published by Yinshi You and Dong Jie teaches "The utility model relates to an automatic personnel positioning system device, in particular to a BIM (Building Information Modeling)-based RFID (Radio Frequency Identification Technology) automatic site personnel positioning system device. The device comprises an RFID functional safety helmet provided with a chip, a BIM-integrated data memory, a network transmission module, a central processing unit, an operating terminal and a read receiver, and is characterized in that the RFID functional safety helmet provided with the chip is connected with the read receiver, wherein the read receiver is connected with the BIM-integrated data memory; the BIM-integrated data memory is connected with the central processing unit and the operating terminal through the network transmission module. According to the device, by using the interaction of data and a model, the positioning of construction personnel on site is simulated in the BIM system through presentation of the model and the data."

International Patent Application WO 2011051639 A1 published by Elisa Cuiti teaches "The present invention relates to the enhancement of a digital model of a building using computer means, wherein: a) the digital model is analysed (S1, S3) in order to identify the elements that make up the building and have specific characteristics in the building; b) construction products which have properties that match said characteristics are defined (S5); and c) the digital model is enhanced (S7) by adding, for every element that makes up the model, data from at least one list of suitable construction products."

U.S. Pat. No. 8,427,473, issued to Elsberg et al., teaches "An electronically implemented method of on-site visualization of building information model data, the method constituted of: loading a 3D scene comprising building information model data; receiving real time positional information comprising geographic coordinate information and orientation information; rendering a pseudo-realistic image of the loaded 3D scene responsive to the real time positional information; and displaying the rendered pseudo-realistic image. Preferably the method further provides for displaying engineering information of the building information model responsive to the real time positional information. Preferably, the pseudo-realistic image provides shadowing responsive to real time chronographic information and the received real time positional information."

U.S. Pat. No. 8,484,231, issued to Li et al., teaches "A process includes mapping a data format in an object in a source schema to a data format in an object in a destination schema. The process includes defining an attribute mapping, defining a relation between the data format in the object in the source and the data format in the object in the destination, mapping the data format in the object in the source to the data format in the object in the destination, and converting the data format in the object in the source to another data format within the source. When the object in the source has no analog in destination, a foreign object is introduced into the destination, and when the object in the destination refers to one or more dependent objects, one or more instances of referred objects are generated according to a predefined policy in the mapping."

However, none of these solutions solves all of the problems associated with actual physical objects used with BIM.

Thus, it is desirable to solve some of the problems associated with creating, using and collaborating on three-dimensional (3D) models for BIM.


SUMMARY OF THE INVENTION



In accordance with preferred embodiments of the present invention, some of the problems associated with creating X-dimensional (XD) models for building information modeling (BIM) are overcome. A method and system for Global Positioning System (GPS) enabled model and site interaction for Building Information Modeling (BIM) and other design platforms is presented.

Collaboration information for actual physical objects at physical locations is automatically collected and associated with virtual objects in virtual object models in a three-dimensional (3D) object modeling program for a selected project, or new virtual objects that did not previously exist are created in the 3D modeling program and associated with the actual physical objects that have been physically added at a project site. The method and system allows two-way, real-time and static collaboration between native and new composite XD (e.g., 3D, or lower or higher dimensional) object models from within existing 3D modeling BIM programs (e.g., AUTODESK REVIT, AUTOCAD, VECTORWORKS, etc.) and the actual physical objects at the actual physical locations.
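
For illustration only, the following Python sketch shows one hypothetical way collected collaboration information (here, a GPS fix and a note for a physical object installed on site) might be associated with a virtual object identifier in a 3D object model; the class names, fields and identifiers are invented for this example and are not the patented implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, List

@dataclass
class CollaborationRecord:
    """One piece of collected collaboration information for a physical object."""
    latitude: float          # GPS latitude of the physical object
    longitude: float         # GPS longitude of the physical object
    note: str                # free-form collaboration note (comment, photo URL, etc.)
    collected_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class ModelCollaborationIndex:
    """Hypothetical index linking virtual object IDs in a 3D model to site data."""
    def __init__(self) -> None:
        self._records: Dict[str, List[CollaborationRecord]] = {}

    def attach(self, virtual_object_id: str, record: CollaborationRecord) -> None:
        # Associate newly collected site information with an existing virtual object.
        self._records.setdefault(virtual_object_id, []).append(record)

    def records_for(self, virtual_object_id: str) -> List[CollaborationRecord]:
        # Retrieve everything collected so far for one virtual object.
        return self._records.get(virtual_object_id, [])

# Example: a door installed on site is linked back to its model element.
index = ModelCollaborationIndex()
index.attach("door-27A", CollaborationRecord(42.0354, -88.2826, "Installed, hinge reversed"))
print(len(index.records_for("door-27A")))  # -> 1
```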

The foregoing and other features and advantages of preferred embodiments of the present invention will be more readily apparent from the following detailed description. The detailed description proceeds with references to the accompanying drawings.


BRIEF DESCRIPTION OF THE DRAWINGS



Preferred embodiments of the present invention are described with reference to the following drawings, wherein:

FIG. 1 is a block diagram illustrating an exemplary electronic information display system;

FIG. 2 is a block diagram illustrating an exemplary electronic information display system;

FIG. 3 is a block diagram illustrating an exemplary networking protocol stack;

FIG. 4 is block diagram illustrating an exemplary cloud communications network;

FIG. 5 is a block diagram illustrating an exemplary cloud storage object;

FIG. 6 is a block diagram illustrating an exemplary QR bar code;

FIG. 7 is a block diagram illustrating wearable network devices;

FIGS. 8A and 8B are a flow diagram illustrating a method for creating three dimensional (3D) objects from two dimensional (2D) data;

FIG. 9 is a flow diagram illustrating a method for creating three dimensional (3D) objects from two dimensional (2D) data;

FIGS. 10A and 10B are a flow diagram illustrating a method for creating three dimensional (3D) objects from two dimensional (2D) data with cloud computing;

FIG. 11 is a block diagram illustrating a data flow for the method of FIG. 8;

FIGS. 12A and 12B are a flow diagram illustrating a method for creating composite three-dimensional (3D) object models;

FIG. 13 is a flow diagram illustrating a method for creating composite three-dimensional (3D) object models;

FIG. 14 is a flow diagram illustrating a method for creating composite three-dimensional (3D) object models;

FIG. 15 is a flow diagram illustrating a method for creating composite three-dimensional (3D) object models;

FIG. 16 is a flow diagram illustrating a method for creating composite three-dimensional (3D) object models;

FIG. 17 is a block diagram illustrating an exemplary three-dimensional (3D) object model for a portion of a building;

FIG. 18 is a flow diagram illustrating a method for creating composite higher dimensional object models from composite 3D object models;

FIGS. 19A and 19B are a flow diagram illustrating a method for collaborating on X-dimensional (XD) object models;

FIG. 20 is a block diagram illustrating exemplary collaboration information for 3D object models in a 3D modeling program;

FIG. 21 is a flow diagram illustrating a method for collaborating on X-dimensional (XD) object models;

FIG. 22 is a block diagram illustrating exemplary collaboration analytics created for three-dimensional (3D) object models;

FIG. 23 is a block diagram illustrating exemplary collaboration on an exemplary three-dimensional (3D) object model;

FIG. 24 is a block diagram illustrating another exemplary three-dimensional (3D) object model for a building;

FIGS. 25A and 25B are a flow diagram illustrating a method for automated building information modeling (BIM) collaboration;

FIG. 26 is a block diagram illustrating exemplary collaboration information for 3D object models in a 3D modeling program;

FIG. 27 is a block diagram illustrating a path of a mobile target device through an XD object model in a 3D modeling program;

FIG. 28 is a flow diagram illustrating a method for automated building information modeling (BIM) collaboration;

FIG. 29 is a flow diagram illustrating a method for automated building information modeling (BIM) collaboration; and

FIG. 30 is a flow diagram illustrating a method for automated building information modeling (BIM) collaboration.


DETAILED DESCRIPTION OF THE INVENTION



Exemplary Cloud Electronic Information Storage and Retrieval System

FIG. 1 is a block diagram illustrating an exemplary electronic information display system 10. The exemplary electronic system 10 includes, but is not limited to, one or more target network devices 12, 14, 16 (only three of which are illustrated) each with one or more processors and each with a non-transitory computer readable medium.

The one or more target network devices 12, 14, 16 include, but are not limited to, multimedia capable desktop and laptop computers, tablet computers, facsimile machines, mobile phones, non-mobile phones with and/or without displays, three-dimensional (3D) printers, robots, smart phones, Internet phones, Internet appliances, personal digital/data assistants (PDA), two-way pagers, digital cameras, portable game consoles (PlayStation Portable by Sony, Game Boy and DSi by Nintendo, etc.), non-portable game consoles (Xbox by Microsoft, PlayStation by Sony, Wii by Nintendo, etc.), cable television (CATV), satellite television (SATV) and Internet television set-top boxes, digital televisions including high definition television (HDTV), three-dimensional (3DTV) televisions, wearable network devices 106-112 (FIG. 7), unmanned aerial vehicles (UAVs) 27 (FIG. 1) (i.e., drones, etc.), unmanned ground vehicles (UGVs) 29 (FIG. 1) and/or other types of network devices.

In one embodiment, the unmanned ground vehicle (UGV) 29' includes a detachable/re-attachable UAV 27' (FIG. 7). In one embodiment, the UGV 29 automatically launches and lands the UAV 27' on and off a portion of the UGV 29'. The UGV 29, 29' illustrated in FIGS. 1 and 7 is a continuous track UGV 29, 29' where the continuous tracks are used to provide motion. However, the present invention is not limited to such an embodiment and other embodiments can be used to practice the invention (e.g., wheeled robots, robots with legs, etc.).

The one or more smart network devices 12, 14, 16 include smart phones such as the iPhone by Apple, Inc., the Blackberry Storm and other Blackberry models by Research In Motion, Inc. (RIM), the Droid by Motorola, Inc., smart phones by HTC, Inc. and other types of smart phones, etc. However, the present invention is not limited to such smart phone devices, and more, fewer or other devices can be used to practice the invention.

A "smart phone" is a mobile phone that offers more advanced computing ability and connectivity than a contemporary basic feature phone. Smart phones and feature phones may be thought of as handheld computers integrated with a mobile telephone, but while most feature phones are able to run applications based on platforms such as Java ME, a smart phone usually allows the user to install and run more advanced applications. Smart phones and/or tablet computers run complete operating system software providing a platform for application developers.

The operating systems include the iPhone OS, Android, Windows, etc. iPhone OS is a proprietary operating system for the Apple iPhone. Android is an open source operating system platform backed by Google, along with major hardware and software developers (such as Intel, HTC, ARM, Motorola and Samsung, etc.), that form the Open Handset Alliance.

The one or more smart network devices 12, 14, 16 include tablet computers such as the iPad, by Apple, Inc., the HP Tablet, by Hewlett Packard, Inc., the Playbook, by RIM, Inc., the Tablet, by Sony, Inc.

A 3D printer 39 (FIG. 1) provides 3D printing or "additive manufacturing." 3D printing is a process of making a three-dimensional solid object of virtually any shape from a digital model. 3D printing is achieved using an "additive process," where successive layers of material are laid down in different shapes. 3D printing is also considered distinct from traditional machining techniques, which mostly rely on the removal of material by methods such as cutting or drilling and are "subtractive" processes.

In one embodiment, a 3D printer 39 is a limited type of industrial robot that is capable of carrying out an additive process under computer control. The 3D printing technology is used for both prototyping and distributed manufacturing with applications in architecture, construction (AEC), industrial design, automotive, aerospace, military, engineering, civil engineering, dental and medical industries, biotech (human tissue replacement), fashion, footwear, jewelry, eyewear, education, geographic information systems, food, and/or many other fields.

The target network devices 12, 14, 16 are in communications with a cloud communications network 18 or a non-cloud computing network 18' via one or more wired and/or wireless communications interfaces. The cloud communications network 18 is also called a "cloud computing network" herein and the terms may be used interchangeably.

The plural target network devices 12, 14, 16 request desired electronic content 13, 15, etc., such as 3D models for specific 3D modeling programs, stored on the cloud communications network 18 or non-cloud communications network 18'.
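
As a minimal sketch of such a request, assuming a hypothetical HTTP endpoint on a content server, a target network device could retrieve a stored model file as shown below; the URL and file name are placeholders, not part of this disclosure.

```python
import urllib.request

# Hypothetical endpoint; a real deployment would use its own content servers.
MODEL_URL = "https://example.com/models/window-3050.rfa"

def fetch_model(url: str, destination: str) -> None:
    """Download a stored model file over HTTP(S) and save it locally."""
    with urllib.request.urlopen(url, timeout=30) as response, open(destination, "wb") as out:
        out.write(response.read())

# fetch_model(MODEL_URL, "window-3050.rfa")  # call commented out; endpoint is illustrative
```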

The cloud communications network 18 and non-cloud communications network 18' includes, but is not limited to, communications over a wire connected to the target network devices, wireless communications, and other types of communications using one or more communications and/or networking protocols.

Plural server network devices 20, 22, 24, 26 (only four of which are illustrated) each with one or more processors and a non-transitory computer readable medium include one or more associated databases 20', 22', 24', 26'. The plural network devices 20, 22, 24, 26 are in communications with the one or more target devices 12, 14, 16 via the cloud communications network 18 and non-cloud communications network 18'.

Plural server network devices 20, 22, 24, 26 (only four of which are illustrated) are physically located on one or more public networks 76 (See FIG. 4), private networks 72, community networks 74 and/or hybrid networks 78 comprising the cloud network 18.

One or more server network devices (e.g., 20, etc.) securely stores one or more cloud content location maps 17 and other plural server network devices (e.g., 22, 24, 26, etc.) store portions 13', 15' of desired electronic content 13, 15 as cloud storage objects 82 (FIG. 5) as is described herein.

The plural server network devices 20, 22, 24, 26 include, but are not limited to, manufacturing machines 35, 3D printers 39, robots 41, World Wide Web servers, Internet servers, search engine servers, vertical search engine servers, social networking site servers, file servers, other types of electronic information servers, and other types of server network devices (e.g., edge servers, firewalls, routers, gateways, etc.).

The plural server network devices 20, 22, 24, 26 also include, but are not limited to, network servers used for cloud computing providers, etc.

The cloud communications network 18 and non-cloud communications network 18' includes, but is not limited to, a wired and/or wireless communications network comprising one or more portions of: the Internet, an intranet, a Local Area Network (LAN), a wireless LAN (WiLAN), a Wide Area Network (WAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Wireless Personal Area Network (WPAN) and other types of wired and/or wireless communications networks 18.

The cloud communications network 18 and non-cloud communications network 18' includes one or more gateways, routers, bridges and/or switches. A gateway connects computer networks using different network protocols and/or operating at different transmission capacities. A router receives transmitted messages and forwards them to their correct destinations over the most efficient available route. A bridge is a device that connects networks using the same communications protocols so that information can be passed from one network device to another. A switch is a device that filters and forwards packets between network segments based on some pre-determined sequence (e.g., timing, sequence number, etc.).

An operating environment for the network devices of the exemplary electronic information display system 10 includes a processing system with one or more high speed Central Processing Unit(s) (CPU), processors, one or more memories and/or other types of non-transitory computer readable mediums. In accordance with the practices of persons skilled in the art of computer programming, the present invention is described below with reference to acts and symbolic representations of operations or instructions that are performed by the processing system, unless indicated otherwise. Such acts and operations or instructions are referred to as being "computer-executed," "CPU-executed," or "processor-executed."

It will be appreciated that acts and symbolically represented operations or instructions include the manipulation of electrical information by the CPU or processor. An electrical system represents data bits which cause a resulting transformation or reduction of the electrical information or biological information, and the maintenance of data bits at memory locations in a memory system to thereby reconfigure or otherwise alter the CPU's or processor's operation, as well as other processing of information. The memory locations where data bits are maintained are physical locations that have particular electrical, magnetic, optical, or organic properties corresponding to the data bits.

The data bits may also be maintained on a non-transitory computer readable medium including magnetic disks, optical disks, organic memory, and any other volatile (e.g., Random Access Memory (RAM)) or non-volatile (e.g., Read-Only Memory (ROM), flash memory, etc.) mass storage system readable by the CPU. The non-transitory computer readable medium includes cooperating or interconnected computer readable medium, which exist exclusively on the processing system or can be distributed among multiple interconnected processing systems that may be local or remote to the processing system.

Exemplary Electronic Content Display System

FIG. 2 is a block diagram illustrating an exemplary electronic content information display system 28. The exemplary electronic information display system 12' includes, but is not limited to, a target network device (e.g., 12, etc.) with an application 30 and a display component 32. The application 30 presents a graphical user interface (GUI) 34 on the display component 32. The GUI 34 presents a multi-window 36, 38, etc. (only two of which are illustrated) interface to a user.

In one embodiment of the invention, the application 30 is a software application. However, the present invention is not limited to this embodiment and the application 30 can be hardware, firmware and/or any combination thereof. In one embodiment, the application 30 is a mobile application for a smart phone, electronic tablet or other mobile network device. In another embodiment, the application 30, 30', 30'' is a cloud application used on a cloud communications network 18. However, the present invention is not limited to these embodiments and other embodiments can be used to practice the invention.

In another embodiment, a portion of the application 30 is executing on the target network devices 12, 14, 16 and another portion of the application 30', 30'' is executing on the server network devices 20, 22, 24, 26. The applications also include one or more library applications 31. However, the present invention is not limited to these embodiments and other embodiments can be used to practice the invention.

Exemplary Networking Protocol Stack

FIG. 3 is a block diagram illustrating a layered protocol stack 38 for network devices in the electronic information display system 10. The layered protocol stack 38 is described with respect to Internet Protocol (IP) suites comprising, in general from lowest-to-highest, a link 42, network 44, transport 48 and application 57 layers. However, more or fewer layers could also be used, and different layer designations could also be used for the layers in the protocol stack 38 (e.g., layering based on the Open Systems Interconnection (OSI) model including, from lowest-to-highest, a physical, data-link, network, transport, session, presentation and application layer).

The network devices 12, 14, 16, 20, 22, 24, 26 are connected to the communication network 18 with Network Interface Card (NIC) cards including device drivers 40 in a link layer 42 for the actual hardware connecting the network devices 12, 14, 16, 20, 22, 24, 26 to the cloud communications network 18. For example, the NIC device drivers 40 may include a serial port device driver, a digital subscriber line (DSL) device driver, an Ethernet device driver, a wireless device driver, a wired device driver, etc. The device drivers interface with the actual hardware being used to connect the network devices to the cloud communications network 18. The NIC cards have a medium access control (MAC) address that is unique to each NIC and unique across the whole cloud network 18. The Medium Access Control (MAC) protocol is used to provide a data link layer of an Ethernet, LAN system and for other network systems.

Above the link layer 42 is a network layer 44 (also called the Internet Layer for Internet Protocol (IP) suites). The network layer 44 includes, but is not limited to, an IP layer 46.

IP 46 is an addressing protocol designed to route traffic within a network or between networks. However, more, fewer or other protocols can also be used in the network layer 44, and the present invention is not limited to IP 46. For more information on IP 46 see IETF RFC-791, incorporated herein by reference.

Above the network layer 44 is a transport layer 48. The transport layer 48 includes, but is not limited to, an optional Internet Group Management Protocol (IGMP) layer 50, an Internet Control Message Protocol (ICMP) layer 52, a Transmission Control Protocol (TCP) layer 54 and a User Datagram Protocol (UDP) layer 56. However, more, fewer or other protocols could also be used in the transport layer 48.

Optional IGMP layer 50, hereinafter IGMP 50, is responsible for multicasting. For more information on IGMP 50 see RFC-1112, incorporated herein by reference. ICMP layer 52, hereinafter ICMP 52 is used for IP 46 control. The main functions of ICMP 52 include error reporting, reachability testing (e.g., pinging, etc.), route-change notification, performance, subnet addressing and other maintenance. For more information on ICMP 52 see RFC-792, incorporated herein by reference. Both IGMP 50 and ICMP 52 are not required in the protocol stack 38. ICMP 52 can be used alone without optional IGMP layer 50.

TCP layer 54, hereinafter TCP 54, provides a connection-oriented, end-to-end reliable protocol designed to fit into a layered hierarchy of protocols which support multi-network applications. TCP 54 provides for reliable inter-process communication between pairs of processes in network devices attached to distinct but interconnected networks. For more information on TCP 54 see RFC-793, incorporated herein by reference.

UDP layer 56, hereinafter UDP 56, provides a connectionless mode of communications with datagrams in an interconnected set of computer networks. UDP 56 provides a transaction oriented datagram protocol, where delivery and duplicate packet protection are not guaranteed. For more information on UDP 56 see RFC-768, incorporated herein by reference. Both TCP 54 and UDP 56 are not required in protocol stack 38. Either TCP 54 or UDP 56 can be used without the other.
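
To make the TCP 54 / UDP 56 distinction concrete, the following generic Python sketch (not taken from this disclosure) sends the same payload over a connection-oriented TCP socket and over a connectionless UDP datagram socket; the host and port values are placeholders.

```python
import socket

HOST, PORT = "127.0.0.1", 9000  # placeholder address and port

def send_over_tcp(payload: bytes) -> None:
    # TCP: connection-oriented, reliable, ordered delivery.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.connect((HOST, PORT))
        s.sendall(payload)

def send_over_udp(payload: bytes) -> None:
    # UDP: connectionless datagram; delivery and duplicate protection are not guaranteed.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(payload, (HOST, PORT))
```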

Above the transport layer 48 is an application layer 57 where application programs 58 (e.g., 30, 30', 30'', etc.) that carry out desired functionality for a network device reside. For example, the application programs 58 for the client network devices 12, 14, 16 may include web-browsers or other application programs such as application program 30, while application programs for the server network devices 20, 22, 24, 26 may include other application programs (e.g., 30', 30'', etc.).

However, the protocol stack 38 is not limited to the protocol layers illustrated and more, fewer or other layers and protocols can also be used in protocol stack 38. In addition, other protocols from the Internet Protocol suites (e.g., Simple Mail Transfer Protocol, (SMTP), Hyper Text Transfer Protocol (HTTP), File Transfer Protocol (FTP), Dynamic Host Configuration Protocol (DHCP), DNS, etc.) and/or other protocols from other protocol suites may also be used in protocol stack 38.

In addition, markup languages such as HyperText Markup Language (HTML), EXtensible Markup Language (XML) and others are used.

HyperText Markup Language (HTML) is a markup language for creating web pages and other information that can be displayed in a web browser.

HTML is written in the form of HTML elements consisting of tags enclosed in angle brackets within the web page content. HTML tags most commonly come in pairs although some tags represent empty elements and so are unpaired. The first tag in a pair is the start tag, and the second tag is the end tag (they are also called opening tags and closing tags). In between these tags web designers can add text, further tags, comments and other types of text-based content.

The purpose of a web browser is to read HTML documents and compose them into visible or audible web pages. The browser does not display the HTML tags, but uses the tags to interpret the content of the page.

HTML elements form the building blocks of all websites. HTML allows images and objects to be embedded and can be used to create interactive forms. It provides a means to create structured documents by denoting structural semantics for text such as headings, paragraphs, lists, links, quotes and other items. It can embed scripts written in languages such as JavaScript which affect the behavior of HTML web pages.

EXtensible Markup Language (XML) is another markup language that defines a set of rules for encoding documents in a format that is both human-readable and machine-readable. It is defined in the XML 1.0 Specification produced by the W3C and several other related specifications, all free open standards, the contents of which are incorporated by reference.

XML is a textual data format with strong support via Unicode for the languages of the world. Although the design of XML focuses on documents, it is widely used for the representation of arbitrary data structures, for example in web services. The oldest schema language for XML is the Document Type Definition (DTD). DTDs within XML documents define entities, which are arbitrary fragments of text and/or markup tags that the XML processor inserts in the DTD itself and in the XML document wherever they are referenced, like character escapes.
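
As a generic illustration of machine-readable XML (not an excerpt from this disclosure), the standard-library sketch below parses a small invented document describing one virtual object and its location attributes; the element and attribute names are hypothetical.

```python
import xml.etree.ElementTree as ET

# Invented XML describing one virtual object in a model.
DOCUMENT = """
<model project="Site-A">
  <object id="door-27A" type="door">
    <location lat="42.0354" lon="-88.2826"/>
  </object>
</model>
"""

root = ET.fromstring(DOCUMENT)
for obj in root.findall("object"):
    loc = obj.find("location")
    print(obj.get("id"), obj.get("type"), loc.get("lat"), loc.get("lon"))
# -> door-27A door 42.0354 -88.2826
```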

Preferred embodiments of the present invention include network devices and wired and wireless interfaces that are compliant with all or part of standards proposed by the Institute of Electrical and Electronic Engineers (IEEE), International Telecommunications Union-Telecommunication Standardization Sector (ITU), European Telecommunications Standards Institute (ETSI), Internet Engineering Task Force (IETF), U.S. National Institute of Standards and Technology (NIST), American National Standards Institute (ANSI), Wireless Application Protocol (WAP) Forum, Bluetooth Forum, or the ADSL Forum.

Wireless Interfaces

In one embodiment of the present invention, the wireless interfaces on network devices 12, 14, 16, 20, 22, 24, 26 include, but are not limited to, 3G and/or 4G, IEEE 802.11a, 802.11b, 802.11g, 802.11n, 802.15.4 (ZigBee), "Wireless Fidelity" (Wi-Fi), "Worldwide Interoperability for Microwave Access" (WiMAX), ETSI High Performance Radio Metropolitan Area Network (HIPERMAN) or "RF Home" wireless interfaces. In another embodiment of the present invention, the wireless sensor device may include an integral or separate Bluetooth and/or Infrared Data Association (IrDA) module for wireless Bluetooth or wireless infrared communications. However, the present invention is not limited to such an embodiment and other 802.11xx and other types of wireless interfaces can also be used.

802.11b is a short-range wireless network standard. The IEEE 802.11b standard defines wireless interfaces that provide up to 11 Mbps wireless data transmission to and from wireless devices over short ranges. 802.11a is an extension of 802.11b and can deliver speeds up to 54 Mbps. 802.11g delivers speeds on par with 802.11a. However, other 802.11XX interfaces can also be used and the present invention is not limited to the 802.11 protocols defined. The IEEE 802.11a, 802.11b and 802.11g standards are incorporated herein by reference.

Wi-Fi is a type of 802.11xx interface, whether 802.11b, 802.11a, dual-band, etc. Wi-Fi devices include RF interfaces such as 2.4 GHz for 802.11b or 802.11g and 5 GHz for 802.11a.

802.15.4 (ZigBee) is a low data rate network standard used for mesh network devices such as sensors, interactive toys, smart badges, remote controls, and home automation. The 802.15.4 standard provides data rates of 250 kbps, 40 kbps and 20 kbps; two addressing modes (16-bit short and 64-bit IEEE addressing); support for critical latency devices, such as joysticks; Carrier Sense Multiple Access/Collision Avoidance (CSMA-CA) channel access; automatic network establishment by a coordinator; a full handshake protocol for transfer reliability; power management to ensure low power consumption for multi-month to multi-year battery usage; and up to 16 channels in the 2.4 GHz Industrial, Scientific and Medical (ISM) band (worldwide), 10 channels in the 915 MHz band (US) and one channel in the 868 MHz band (Europe). The IEEE 802.15.4-2003 standard is incorporated herein by reference.

WiMAX is an industry trade organization formed by leading communications component and equipment companies to promote and certify compatibility and interoperability of broadband wireless access equipment that conforms to the IEEE 802.16XX and ETSI HIPERMAN. HIPERMAN is the European standard for metropolitan area networks (MAN).

The IEEE 802.16a and 802.16g standards are wireless MAN technology standards that provide a wireless alternative to cable, DSL and T1/E1 for last mile broadband access. They are also used as complementary technology to connect IEEE 802.11XX hot spots to the Internet.

The IEEE 802.16a standard for 2-11 GHz is a wireless MAN technology that provides broadband wireless connectivity to fixed, portable and nomadic devices. It provides up to 50 kilometers of service area range, allows users to get broadband connectivity without needing direct line of sight with the base station, and provides total data rates of up to 280 Mbps per base station, which is enough bandwidth to simultaneously support hundreds of businesses with T1/E1-type connectivity and thousands of homes with DSL-type connectivity with a single base station. The IEEE 802.16g standard provides up to 100 Mbps.

The IEEE 802.16e standard is an extension to the approved IEEE 802.16/16a/16g standard. The purpose of 802.16e is to add limited mobility to the current standard which is designed for fixed operation.

The ETSI HIPERMAN standard is an interoperable broadband fixed wireless access standard for systems operating at radio frequencies between 2 GHz and 11 GHz.

The IEEE 802.16a, 802.16e and 802.16g standards are incorporated herein by reference. WiMAX can be used to provide a WLP.

The ETSI HIPERMAN standards TR 101 031, TR 101 475, TR 101 493-1 through TR 101 493-3, TR 101 761-1 through TR 101 761-4, TR 101 762, TR 101 763-1 through TR 101 763-3 and TR 101 957 are incorporated herein by reference. ETSI HIPERMAN can be used to provide a WLP.

In one embodiment, the plural server network devices 20, 22, 24, 26 include a connection to plural network interface cards (NICs) in a backplane connected to a communications bus. The NIC cards provide gigabit/second (1×10^9 bits/second) communications speed of electronic information. This allows "scaling out" for fast electronic content retrieval. The NICs are connected to the plural server network devices 20, 22, 24, 26 and the cloud communications network 18. However, the present invention is not limited to the NICs described and other types of NICs in other configurations and connections with and/or without buses can also be used to practice the invention.

In one embodiment, network devices 12, 14, 16, 20, 22, 24, 26 and wired and wireless interfaces including the NICs include "4G" components. "4G" refers to the fourth generation of wireless communications standards with speeds of 100 megabits/second to gigabits/second or more. 4G includes peak speed requirements for 4G service of at least 100 Mbit/s for high mobility communication (e.g., trains, vehicles, etc.) and 1 Gbit/s for low mobility communication (e.g., pedestrians and stationary users, etc.).

4G technologies are a successor to 3G and 2G standards. The nomenclature of the generations generally refers to a change in the fundamental nature of the service. The first was the move from analogue (1G) to digital (2G) transmission. This was followed by multi-media support, spread spectrum transmission and at least 200 Kbits/second (3G). The 4G NICs include IP packet-switched NICs, wired and wireless ultra-broadband (i.e., gigabit speed) access NICs, Worldwide Interoperability for Microwave Access (WiMAX) NICs, WiMAX Long Term Evolution (LTE) NICs and/or multi-carrier transmission NICs. However, the present invention is not limited to this embodiment and 1G, 2G and 3G and/or any combination thereof, with or without 4G NICs, can be used to practice the invention.

In one embodiment of the invention, the WiMAX interface includes WiMAX 4G Long Term Evolution (LTE) interfaces. The ITU announced in December 2010 that WiMAX and LTE are 4G technologies. One of the benefits of 4G LTE is the ability to take advantage of advanced topology networks, including those on cloud communications networks 18, such as optimized heterogeneous networks with a mix of macrocells with low power nodes such as picocells, femtocells and new relay nodes. LTE further improves capacity and coverage, and helps ensure user fairness. 4G LTE also introduces multicarrier technologies for ultra-wide bandwidth use, up to 100 MHz of spectrum supporting very high data rates.

In one embodiment of the invention, the wireless interfaces also include wireless personal area network (WPAN) interfaces. As is known in the art, a WPAN is a personal area network for interconnecting devices centered around an individual person's devices in which the connections are wireless. A WPAN interconnects all the ordinary computing and communicating devices that a person has on their desk (e.g., computer, etc.) or carries with them (e.g., PDA, mobile phone, smart phone, tablet computer, two-way pager, etc.).

A key concept in WPAN technology is known as "plugging in." In the ideal scenario, when any two WPAN-equipped devices come into close proximity (within several meters and/or feet of each other) or within a few miles and/or kilometers of a central server (not illustrated), they can communicate via wireless communications as if connected by a cable. WPAN devices can also lock out other devices selectively, preventing needless interference or unauthorized access to secure information. Zigbee is one wireless protocol used on WPAN networks such as cloud communications network 18 or non-cloud communications network 18'.

The one or more target network devices 12, 14, 16 and one or more server network devices 20, 22, 24, 26 communicate with each other and other network devices with near field communications (NFC) and/or machine-to-machine (M2M) communications.

"Near field communication (NFC)" is a set of standards for smartphones and similar network devices to establish radio communication with each other by touching them together or bringing them into close proximity, usually no more than a few centimeters. Present applications include contactless transactions, data exchange, and simplified setup of more complex communications such as Wi-Fi. Communication is also possible between an NFC device and an unpowered NFC chip, called a "tag" including radio frequency identifier (RFID) tags 99 and/or sensor.

NFC standards cover communications protocols and data exchange formats, and are based on existing radio-frequency identification (RFID) standards including ISO/IEC 14443 and FeliCa. These standards include ISO/IEC 18092 and those defined by the NFC Forum, all of which are incorporated by reference.

An "RFID tag" 99 is an object that can be applied to or incorporated into a product, animal, or person for the purpose of identification and/or tracking using RF signals.

An "RFID sensor" is a device that measures a physical quantity and converts it into an RF signal which can be read by an observer or by an instrument (e.g., target network devices 12, 14, 16, server network devices 20, 22, 24, 26, etc.)

"Machine to machine (M2M)" refers to technologies that allow both wireless and wired systems to communicate with other devices of the same ability. M2M uses a device to capture an event (such as option purchase, etc.), which is relayed through a network (wireless, wired cloud, etc.) to an application (software program), that translates the captured event into meaningful information. Such communication was originally accomplished by having a remote network of machines relay information back to a central hub for analysis, which would then be rerouted into a system like a personal computer.

However, modern M2M communication has expanded beyond a one-to-one connection and changed into a system of networks that transmits data many-to-one and many-to-many to plural different types of devices and appliances. The expansion of IP networks across the world has made it far easier for M2M communication to take place and has lessened the amount of power and time necessary for information to be communicated between machines.

However, the present invention is not limited to such wireless interfaces and wireless networks and more, fewer and/or other wireless interfaces can be used to practice the invention.

Wired Interfaces

In one embodiment of the present invention, the wired interfaces include wired interfaces and corresponding networking protocols for wired connections to the Public Switched Telephone Network (PSTN) and/or a cable television network (CATV) and/or satellite television networks (SATV) and/or three-dimensional television (3DTV), including HDTV, that connect the network devices 12, 14, 16, 20, 22, 24, 26 via one or more twisted pairs of copper wires, digital subscriber lines (e.g., DSL, ADSL, VDSL, etc.), coaxial cable, fiber optic cable, other connection media or other connection interfaces. The PSTN is any public switched telephone network provided by AT&T, GTE, Sprint, MCI, SBC, Verizon and others. The CATV is any cable television network provided by Comcast, Time Warner, etc. However, the present invention is not limited to such wired interfaces and more, fewer and/or other wired interfaces can be used to practice the invention.

Television Services

In one embodiment, the cloud applications 30, 30', 30'' provide cloud electronic content storage and retrieval services from television services over the cloud communications network 18 or non-cloud communications network 18'. The television services include digital television services, including, but not limited to, cable television, satellite television, high-definition television, three-dimensional televisions and other types of network devices.

However, the present invention is not limited to such television services and more, fewer and/or other television services can be used to practice the invention.

Internet Television Services

In one embodiment, the cloud applications 30, 30', 30'' provide cloud electronic content storage and retrieval services from Internet television services over the cloud communications network 18 or non-cloud communications network 18'. The television services include Internet television, Web-TV, and/or Internet Protocol Television (IPtv) and/or other broadcast television services.

"Internet television" allows users to choose a program or the television show they want to watch from an archive of programs or from a channel directory. The two forms of viewing Internet television are streaming content directly to a media player or simply downloading a program to a viewer's set-top box, game console, computer, or other network device.

"Web-TV" delivers digital content via broadband and mobile networks. The digital content is streamed to a viewer's set-top box, game console, computer, or other network device.

"Internet Protocol television (IPtv)" is a system through which Internet television services are delivered using the architecture and networking methods of the Internet Protocol Suite over a packet-switched network infrastructure, e.g., the Internet and broadband Internet access networks, instead of being delivered through traditional radio frequency broadcast, satellite signal, and cable television formats.

However, the present invention is not limited to such Internet Television services and more, fewer and/or other Internet Television services can be used to practice the invention.

General Search Engine Services

In one embodiment, the cloud applications 30, 30', 30'' provide cloud electronic content storage and retrieval services from general search engine services. A search engine is designed to search for information on a cloud communications network 18 or non-cloud communications network 18' such as the Internet including World Wide Web servers, HTTP, FTP servers etc. The search results are generally presented in a list of electronic results. The information may consist of web pages, images, electronic information, multimedia information, and other types of files. Some search engines also mine data available in databases or open directories. Unlike web directories, which are maintained by human editors, search engines typically operate algorithmically and/or are a mixture of algorithmic and human input.

In another embodiment, the cloud applications 30, 30', 30'' provide general search engine services by interacting with one or more other public search engines (e.g., GOOGLE, BING, YAHOO, etc.) and/or private search engine services.

In another embodiment, the cloud applications 30, 30', 30'' provide electronic content storage and retrieval services from specialized search engine services, such as vertical search engine services by interacting with one or more other public vertical search engines (e.g., GALAXY.COM, etc.) and/or private search engine services.

However, the present invention is not limited to such general and/or vertical search engine services and more, fewer and/or other general search engine services can be used to practice the invention.

Social Networking Services

In one embodiment, the cloud applications 30, 30', 30'' provide cloud electronic content storage and retrieval services from one or more social networking services, including to/from one or more social networking web-sites (e.g., FACEBOOK, YOUTUBE, TWITTER, MY-SPACE, MATCH.COM, E-HARMONY, GROUP ON, SOCIAL LIVING, etc.). The social networking web-sites also include, but are not limited to, social couponing sites, dating web-sites, blogs, RSS feeds, and other types of information web-sites in which messages can be left or posted for a variety of social activities.

However, the present invention is not limited to the social networking services described and other public and private social networking services can also be used to practice the invention.

Security and Encryption

Network devices 12, 14, 16, 20, 22, 24, 26 with wired and/or wireless interfaces of the present invention include one or more of the security and encryptions techniques discussed herein for secure communications on the cloud communications network 18 or non-cloud communications network 18'.

Application programs 58 (FIG. 2) include security and/or encryption application programs integral to and/or separate from the applications 30, 30', 30''. Security and/or encryption programs may also exist in hardware components on the network devices (12, 14, 16, 20, 22, 24, 26) described herein and/or exist in a combination of hardware, software and/or firmware.

Wireless Encryption Protocol (WEP) (also called "Wired Equivalent Privacy") is a security protocol for WiLANs defined in the IEEE 802.11b standard. WEP is a cryptographic privacy algorithm, based on the Rivest Cipher 4 (RC4) encryption engine, used to provide confidentiality for 802.11b wireless data.

RC4 is a cipher designed by RSA Data Security, Inc. of Bedford, Mass., which can accept encryption keys of arbitrary length, and is essentially a pseudo random number generator with an output of the generator being XORed with a data stream to produce encrypted data.

One problem with WEP is that it is used at the two lowest layers of the OSI model, the physical layer and the data link layer, and therefore it does not offer end-to-end security. Another problem with WEP is that its encryption keys are static rather than dynamic. To update WEP encryption keys, an individual has to manually update a WEP key. WEP also typically uses 40-bit static keys for encryption and thus provides "weak encryption," making a WEP device a target of hackers.

The IEEE 802.11 Working Group is working on a security upgrade for the 802.11 standard called "802.11i." This supplemental draft standard is intended to improve WiLAN security. It describes the encrypted transmission of data between systems on 802.11X WiLANs. It also defines new encryption key protocols including the Temporal Key Integrity Protocol (TKIP). The IEEE 802.11i draft standard, version 4, completed Jun. 6, 2003, is incorporated herein by reference.

The 802.11i standard is based on 802.1x port-based authentication for user and device authentication. The 802.11i standard includes two main developments: Wi-Fi Protected Access (WPA) and Robust Security Network (RSN). WPA uses the same RC4 underlying encryption algorithm as WEP.

However, WPA uses TKIP to improve security of keys used with WEP. WPA keys are derived and rotated more often than WEP keys and thus provide additional security. WPA also adds a message-integrity-check function to prevent packet forgeries.

RSN uses dynamic negotiation of authentication and selectable encryption algorithms between wireless access points and wireless devices. The authentication schemes proposed in the draft standard include Extensible Authentication Protocol (EAP). One proposed encryption algorithm is an Advanced Encryption Standard (AES) encryption algorithm.

Dynamic negotiation of authentication and encryption algorithms lets RSN evolve with the state of the art in security, adding algorithms to address new threats and continuing to provide the security necessary to protect information that WiLANs carry.

The NIST developed a new encryption standard, the Advanced Encryption Standard (AES) to keep government information secure. AES is intended to be a stronger, more efficient successor to Triple Data Encryption Standard (3DES).

DES is a popular symmetric-key encryption method developed in 1975 and standardized by ANSI in 1981 as ANSI X.3.92, the contents of which are incorporated herein by reference. As is known in the art, 3DES is the encrypt-decrypt-encrypt (EDE) mode of the DES cipher algorithm. 3DES is defined in the ANSI standard, ANSI X9.52-1998, the contents of which are incorporated herein by reference. DES modes of operation are used in conjunction with the NIST Federal Information Processing Standard (FIPS) for data encryption (FIPS 46-3, October 1999), the contents of which are incorporated herein by reference.

The NIST approved a FIPS for the AES, FIPS-197. This standard specified "Rijndael" encryption as a FIPS-approved symmetric encryption algorithm that may be used by U.S. Government organizations (and others) to protect sensitive information. The NIST FIPS-197 standard (AES FIPS PUB 197, November 2001) is incorporated herein by reference.

The NIST approved a FIPS for U.S. Federal Government requirements for information technology products for sensitive but unclassified (SBU) communications. The NIST FIPS Security Requirements for Cryptographic Modules (FIPS PUB 140-2, May 2001) is incorporated herein by reference.

RSA is a public key encryption system which can be used both for encrypting messages and making digital signatures. The letters RSA stand for the names of the inventors: Rivest, Shamir and Adleman. For more information on RSA, see U.S. Pat. No. 4,405,829, now expired and incorporated herein by reference.

"Hashing" is the transformation of a string of characters into a usually shorter fixed-length value or key that represents the original string. Hashing is used to index and retrieve items in a database because it is faster to find the item using the shorter hashed key than to find it using the original value. It is also used in many encryption algorithms.

Secure Hash Algorithm (SHA) is used for computing a secure condensed representation of a data message or a data file. When a message of any length < 2^64 bits is input, SHA-1 produces a 160-bit output called a "message digest." The message digest can then be input to other security techniques such as encryption, a Digital Signature Algorithm (DSA) and others which generate or verify a security mechanism for the message. SHA-512 outputs a 512-bit message digest. The Secure Hash Standard, FIPS PUB 180-1, Apr. 17, 1995, is incorporated herein by reference.

Message Digest-5 (MD-5) takes as input a message of arbitrary length and produces as output a 128-bit "message digest" of the input. The MD5 algorithm is intended for digital signature applications, where a large file must be "compressed" in a secure manner before being encrypted with a private (secret) key under a public-key cryptosystem such as RSA. The IETF RFC-1321, entitled "The MD5 Message-Digest Algorithm," is incorporated herein by reference.
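
For illustration only, the SHA-1, SHA-512 and MD5 message digests described above can be computed with Python's standard hashlib module; the input message is arbitrary.

```python
import hashlib

message = b"door-27A installed 2015-11-06"

print(hashlib.sha1(message).hexdigest())    # 160-bit message digest
print(hashlib.sha512(message).hexdigest())  # 512-bit message digest
print(hashlib.md5(message).hexdigest())     # 128-bit message digest
```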

Providing a way to check the integrity of information transmitted over or stored in an unreliable medium such as a wireless network is a prime necessity in the world of open computing and communications. Mechanisms that provide such integrity check based on a secret key are called "message authentication codes" (MAC). Typically, message authentication codes are used between two parties that share a secret key in order to validate information transmitted between these parties.

Keyed Hashing for Message Authentication Codes (HMAC) is a mechanism for message authentication using cryptographic hash functions. HMAC is used with any iterative cryptographic hash function, e.g., MD5, SHA-1, SHA-512, etc., in combination with a secret shared key. The cryptographic strength of HMAC depends on the properties of the underlying hash function. The IETF RFC-2104, entitled "HMAC: Keyed-Hashing for Message Authentication," is incorporated herein by reference.
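
A minimal sketch of keyed hashing follows, using Python's standard hmac module with SHA-512 as the underlying hash function; the shared key and message are placeholders.

```python
import hashlib
import hmac

shared_key = b"placeholder-shared-secret"
message = b"collaboration update for door-27A"

# Sender computes the MAC and transmits it alongside the message.
mac = hmac.new(shared_key, message, hashlib.sha512).hexdigest()

# Receiver recomputes the MAC and compares in constant time.
expected = hmac.new(shared_key, message, hashlib.sha512).hexdigest()
print(hmac.compare_digest(mac, expected))  # -> True
```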

An Electronic Code Book (ECB) is a mode of operation for a "block cipher," with the characteristic that each possible block of plaintext has a defined corresponding cipher text value and vice versa. In other words, the same plaintext value will always result in the same cipher text value. Electronic Code Book is used when a volume of plaintext is separated into several blocks of data, each of which is then encrypted independently of other blocks. The Electronic Code Book has the ability to support a separate encryption key for each block type.

Diffie and Hellman (DH) describe several different group methods for two parties to agree upon a shared secret in such a way that the secret will be unavailable to eavesdroppers. This secret is then converted into various types of cryptographic keys. A large number of the variants of the DH method exist including ANSI X9.42. The IETF RFC-2631, entitled "Diffie-Hellman Key Agreement Method" is incorporated here by reference.

The HyperText Transport Protocol (HTTP) Secure (HTTPs), is a standard for encrypted communications on the World Wide Web. HTTPs is actually just HTTP over a Secure Sockets Layer (SSL). For more information on HTTP, see IETF RFC-2616 incorporated herein by reference.

The SSL protocol is a protocol layer which may be placed between a reliable connection-oriented network layer protocol (e.g. TCP/IP) and the application protocol layer (e.g. HTTP). SSL provides for secure communication between a source and destination by allowing mutual authentication, the use of digital signatures for integrity, and encryption for privacy.

The SSL protocol is designed to support a range of choices for specific security methods used for cryptography, message digests, and digital signatures. The security methods are negotiated between the source and destination at the start of establishing a protocol session. The SSL 2.0 protocol specification, by Kipp E. B. Hickman, 1995, is incorporated herein by reference. More information on SSL is available at the domain name "netscape.com/eng/security/SSL_2.html."

Transport Layer Security (TLS) provides communications privacy over the Internet. The protocol allows client/server applications to communicate over a transport layer (e.g., TCP) in a way that is designed to prevent eavesdropping, tampering, or message forgery. For more information on TLS see IETF RFC-2246, incorporated herein by reference.
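
For illustration only, the sketch below makes an HTTPS (HTTP over SSL/TLS) request using Python's standard library; the host name is a placeholder and is not part of the patent text.

    # Minimal sketch: HTTPS request over a negotiated TLS connection.
    import http.client
    import ssl

    context = ssl.create_default_context()   # negotiates TLS and verifies the server certificate
    connection = http.client.HTTPSConnection("example.com", context=context)
    connection.request("GET", "/")
    response = connection.getresponse()
    print(response.status)
    connection.close()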

In one embodiment, the security functionality includes Cisco Compatible EXtensions (CCX). CCX includes security specifications for makers of 802.11xx wireless LAN chips for ensuring compliance with Cisco's proprietary wireless LAN security protocols. As is known in the art, Cisco Systems, Inc. of San Jose, Calif. is a supplier of networking hardware and software, including router and security products.

However, the present invention is not limited to such security and encryption methods described herein and more, fewer and/or other types of security and encryption methods can be used to practice the invention. The security and encryption methods described herein can also be used in various combinations and/or in different layers of the protocol stack 38 with each other.

Cloud Computing Networks

FIG. 4 is a block diagram 60 illustrating an exemplary cloud computing network 18. The cloud computing network 18 is also referred to as a "cloud communications network" 18. However, the present invention is not limited to this cloud computing model and other cloud computing models can also be used to practice the invention. The exemplary cloud communications network includes wired and/or wireless components of public and private networks.

In one embodiment, the cloud computing network 18 includes a cloud communications network 18 comprising plural different cloud component networks 72, 74, 76, 78. "Cloud computing" is a model for enabling on-demand network access to a shared pool of configurable computing resources (e.g., public and private networks, servers, storage, applications, and services) that are shared, rapidly provisioned and released with minimal management effort or service provider interaction.

This exemplary cloud computing model for electronic information retrieval promotes availability for shared resources and comprises: (1) cloud computing essential characteristics; (2) cloud computing service models; and (3) cloud computing deployment models. However, the present invention is not limited to this cloud computing model and other cloud computing models can also be used to practice the invention.

Exemplary cloud computing essential characteristics appear in Table 1. However, the present invention is not limited to these essential characteristics and more, fewer or other characteristics can also be used to practice the invention.

TABLE 1

1. On-demand BIM multi-media collaboration services. BIM multi-media collaboration services can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with each network server on the cloud communications network 18.
2. Broadband network access. BIM multi-media collaboration services capabilities are available over plural broadband communications networks and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, smart phones 14, tablet computers 12, laptops, PDAs, etc.). The broadband network access includes high speed network access such as 3G and/or 4G wireless and/or wired and broadband and/or ultra-broadband (e.g., WiMAX, etc.) network access.
3. Resource pooling. BIM multi-media collaboration computing resources are pooled to serve multiple requesters using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to electronic content retrieval demand. There is location independence in that a requester of services has no control and/or knowledge over the exact location of the provided BIM multi-media collaboration resources, but may be able to specify location at a higher level of abstraction (e.g., country, state, or data center). Examples of pooled resources include storage, processing, memory, network bandwidth, virtual server network devices and virtual target network devices.
4. Rapid elasticity. Capabilities can be rapidly and elastically provisioned, in some cases automatically, to quickly scale out, and rapidly released to quickly scale in, for BIM multi-media collaboration. For BIM multi-media collaboration converters, the BIM collaboration and analytic conversion capabilities available for provisioning appear to be unlimited and can be used in any quantity at any time.
5. Measured services. Cloud computing systems automatically control and optimize resource use by leveraging a metering capability at some level of abstraction appropriate to the type of BIM multi-media collaboration services (e.g., storage, processing, bandwidth, custom electronic content retrieval applications, etc.). Electronic BIM multi-media collaboration conversion usage is monitored, controlled, and reported, providing transparency for both the BIM multi-media collaboration provider and the BIM multi-media collaboration requester of the utilized electronic content storage retrieval service.

Exemplary cloud computing service models illustrated in FIG. 4 appear in Table 2. However, the present invention is not limited to these service models and more, fewer or other service models can also be used to practice the invention.

TABLE 2

1. Cloud Computing Software Applications 62 for a BIM multi-media collaboration service (CCSA 64). The capability to use the provider's applications 30, 30', 30'' running on a cloud infrastructure 66. The cloud computing applications 62 are accessible from the server network device 20 from various client devices 12, 14, 16 through a thin client interface such as a web browser, etc. The user does not manage or control the underlying cloud infrastructure 66, including network, servers, operating systems, storage, or even individual application 30, 30', 30'' capabilities, with the possible exception of limited user-specific application configuration settings.
2. Cloud Computing Infrastructure 66 for a BIM multi-media collaboration Service (CCI 68). The capability provided to the user is to provision processing, storage and retrieval, networks 18, 72, 74, 76, 78 and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications 30, 30', 30''. The user does not manage or control the underlying cloud infrastructure 66 but has control over operating systems, storage, deployed applications, and possibly limited control of select networking components (e.g., host firewalls, etc.).
3. Cloud Computing Platform 70 for a BIM multi-media collaboration (CCP 71). The capability provided to the user to deploy onto the cloud infrastructure 66 created or acquired applications created using programming languages and tools supported by the servers 20, 22, 24, 26, etc. The user does not manage or control the underlying cloud infrastructure 66, including network, servers, operating systems, or storage, but has control over the deployed applications 30, 30', 30'' and possibly application hosting environment configurations.

Exemplary cloud computing deployment models appear in Table 3. However, the present invention is not limited to these deployment models and more, fewer or other deployment models can also be used to practice the invention.

TABLE 3

1. Private cloud network 72. The cloud network infrastructure is operated solely for BIM multi-media collaboration services. It may be managed by the organization providing the BIM multi-media collaboration services or by a third party and may exist on premise or off premise.
2. Community cloud network 74. The cloud network infrastructure is shared by several different organizations and supports a specific electronic content storage and retrieval community that has shared concerns (e.g., mission, security requirements, policy, compliance considerations, etc.). It may be managed by the different organizations or a third party and may exist on premise or off premise.
3. Public cloud network 76. The cloud network infrastructure such as the Internet, PSTN, SATV, CATV, Internet TV, etc. is made available to the general public or a large industry group and is owned by one or more organizations selling cloud services.
4. Hybrid cloud network 78. The cloud network infrastructure 66 is a composition of two or more cloud networks 18 (e.g., private 72, community 74, and/or public 76, etc.) and/or other types of public and/or private networks (e.g., intranets, etc.) that remain unique entities but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting for load-balancing between clouds, etc.).

Cloud software 64 for electronic content retrieval takes full advantage of the cloud paradigm by being service oriented with a focus on statelessness, low coupling, modularity, and semantic interoperability for electronic content retrieval. However, cloud software services 64 can include various states.

Cloud storage of desired electronic content on a cloud computing network includes agility, scalability, elasticity and multi-tenancy. Although a storage foundation may be comprised of block storage or file storage such as that which exists on conventional networks, cloud storage is typically exposed to requesters of desired electronic content as cloud objects.

In one exemplary embodiment, the cloud application 30, 30', 30'', offers cloud services for BIM collaboration and analytic conversion. The application 30, 30', 30'' offers the cloud computing Infrastructure 66, 68 as a Service 62 (IaaS), including a cloud software infrastructure service 62, the cloud Platform 70, 71 as a Service 62 (PaaS) including a cloud software platform service 62 and/or offers Specific cloud software services as a Service 62 (SaaS) including a specific cloud software service 62 for 2D to 3D data modeling conversion. The IaaS, PaaS and SaaS include one or more of cloud services 62 comprising networking, storage, server network device, virtualization, operating system, middleware, run-time, data and/or application services, or plural combinations thereof, on the cloud communications network 18.

FIG. 5 is a block diagram 80 illustrating an exemplary cloud storage object 82.

The cloud storage object 82 includes an envelope portion 84, with a header portion 86, and a body portion 88. However, the present invention is not limited to such a cloud storage object 82 and other cloud storage objects with more, fewer or other portions can also be used to practice the invention.

The envelope portion 84 uses unique namespace Uniform Resource Identifiers (URIs) and/or Uniform Resource Names (URNs) and/or Uniform Resource Locators (URLs), unique across the cloud communications network 18, to uniquely specify location and version information and encoding rules used by the cloud storage object 82 across the whole cloud communications network 18. For more information, see IETF RFC-3305, Uniform Resource Identifiers (URIs), URLs, and Uniform Resource Names (URNs), the contents of which are incorporated by reference.

The envelope portion 84 of the cloud storage object 82 is followed by a header portion 86. The header portion 86 includes extended information about the cloud storage objects such as authorization and/or transaction information, etc.

The body portion 88 includes methods 90 (i.e., a sequence of instructions, etc.) for using embedded application-specific data in data elements 92. The body portion 88 typically includes only one portion of plural portions of application-specific data 92 and independent data 94 so the cloud storage object 82 can provide distributed, redundant fault tolerant, security and privacy features described herein.
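
For illustration only, the sketch below models the envelope 84, header 86 and body 88 portions of a cloud storage object 82 as a simple Python data structure; the field names and sample values are hypothetical and are not the patent's implementation.

    # Minimal sketch: the three portions of a cloud storage object 82.
    from dataclasses import dataclass, field

    @dataclass
    class CloudStorageObject:
        envelope: dict = field(default_factory=dict)  # unique URI/URN/URL, version, encoding rules
        header: dict = field(default_factory=dict)    # authorization and/or transaction information
        body: dict = field(default_factory=dict)      # methods plus application-specific data elements

    obj = CloudStorageObject(
        envelope={"uri": "urn:example:cloud:object:82", "version": 1, "encoding": "utf-8"},
        header={"authorization": "bearer-token-placeholder"},
        body={"methods": ["store", "retrieve"], "data_elements": {"object_type": "door"}},
    )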

Cloud storage objects 82 have proven experimentally to be a highly scalable, available and reliable layer of abstraction that also minimizes the limitations of common file systems. Cloud storage objects 82 also provide low latency and low storage and transmission costs.

Cloud storage objects 82 are comprised of many distributed resources, but function as a single storage object, are highly fault tolerant through redundancy and provide distribution of desired electronic content across public communication networks 76, and one or more private networks 72, community networks 74 and hybrid networks 78 of the cloud communications network 18. Cloud storage objects 82 are also highly durable because of creation of copies of portions of desired electronic content across such networks 72, 74, 76, 78 of the cloud communications network 18. Cloud storage objects 82 include one or more portions of desired electronic content and can be stored on any of the networks 72, 74, 76, 78 of the cloud communications network 18. Cloud storage objects 82 are transparent to a requester of desired electronic content and are managed by cloud applications 30, 30', 30''.

In one embodiment, cloud storage objects 82 are configurable arbitrary objects with a size up to hundreds of terabytes, each accompanied by a few kilobytes of metadata. Cloud objects are organized into an object-space and identified by an identifier that is unique across the whole cloud communications network 18. However, the present invention is not limited to the cloud storage objects described, and more, fewer and other types of cloud storage objects can be used to practice the invention.

Cloud storage objects 82 present a single unified namespace or object-space and manage desired electronic content by user- or administrator-defined storage and retrieval policies. Cloud storage objects include Representational State Transfer (REST), Simple Object Access Protocol (SOAP), Lightweight Directory Access Protocol (LDAP) and/or Application Programming Interface (API) objects and/or other types of cloud storage objects. However, the present invention is not limited to the cloud storage objects described, and more, fewer and other types of cloud storage objects can be used to practice the invention.

REST is a protocol specification that characterizes and constrains macro-interactions of storage objects among the four components of a cloud communications network 18, namely origin servers, gateways, proxies and clients, without imposing limitations on the individual participants.
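
As an illustration of REST-style macro-interactions, the sketch below retrieves and then updates a cloud storage object with Python's standard urllib module; the endpoint URL is hypothetical and not part of the patent text.

    # Minimal sketch: GET and PUT of a cloud storage object over REST.
    import json
    import urllib.request

    base_url = "https://cloud.example.com/objects/82"   # hypothetical endpoint

    # GET: a client asks an origin server (possibly through gateways/proxies) for the object.
    with urllib.request.urlopen(base_url) as response:
        stored_object = json.loads(response.read())

    # PUT: the client sends an updated representation back to the server.
    payload = json.dumps(stored_object).encode("utf-8")
    request = urllib.request.Request(base_url, data=payload, method="PUT",
                                     headers={"Content-Type": "application/json"})
    urllib.request.urlopen(request)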

SOAP is a protocol specification for exchanging structured information in the implementation of cloud services with storage objects. SOAP has at least three major characteristics: (1) Extensibility (including security/encryption, routing, etc.); (2) Neutrality (SOAP can be used over any transport protocol such as HTTP, SMTP or even TCP, etc.), and (3) Independence (SOAP allows for almost any programming model to be used, etc.)

LDAP is a software protocol for enabling storage and retrieval of electronic content and other resources such as files and devices on the cloud communications network 18. LDAP is a "lightweight" version of Directory Access Protocol (DAP), which is part of X.500, a standard for directory services in a network. LDAP may be used with X.509 security and other security methods for secure storage and retrieval. X.509 is a public key digital certificate standard developed as part of the X.500 directory specification. X.509 is used for secure management and distribution of digitally signed certificates across networks.

An API is a particular set of rules and specifications that software programs can follow to communicate with each other. It serves as an interface between different software programs and facilitates their interaction.

Bar Codes

A "barcode" is an optical machine-readable representation of data, which shows data about the object to which it attaches. Originally, barcodes represented data by varying the widths and spacing of parallel lines, and may be referred to as linear or 1 dimensional (1D). Later they evolved into rectangles, dots, hexagons and other geometric patterns in 2 dimensions (2D). Although 2D systems use a variety of symbols, they are generally referred to as barcodes as well. Barcodes originally were scanned by special optical scanners called barcode readers; today, scanners and interpretive software are available on devices including desktop printers (not illustrated), smart phones 14 and tablet computers 12.

Table 4 illustrates exemplary linear barcodes, the standards of all of which are incorporated by reference. However, the present invention is not limited to the exemplary linear barcodes listed in Table 4, and more, fewer and other linear barcodes can also be used to practice the invention.

TABLE 4. Linear Bar Codes: U.P.C.; Codabar; Code 25 Non-interleaved 2 of 5; Code 25 Interleaved 2 of 5; Code 39; Code 93; Code 128; Code 128A; Code 128B; Code 128C; Code 11; CPC Binary; DUN 14; EAN 2; EAN 5; EAN 8; EAN 13; Facing Identification Mark; GS1-128 (formerly known as UCC/EAN-128, incorrectly referenced as EAN 128 and UCC 128); GS1 DataBar (formerly Reduced Space Symbology (RSS)); HIBC (HIBCC Health Industry Bar Code); ITF-14; Latent image barcode; Pharmacode; Plessey; PLANET; POSTNET; Intelligent Mail barcode; MSI; PostBar; RM4SCC/KIX; JAN; Telepen.

Table 5 illustrates exemplary matrix (2D) barcodes, the standards of all of which are incorporated by reference. However, the present invention is not limited to the exemplary matrix barcodes listed in Table 5, and additional and other matrix barcodes can also be used to practice the invention.

TABLE 5. Matrix Bar Codes: 3-DI; ArrayTag; Aztec Code; Small Aztec Code; Chromatic Alphabet; Codablock; Code 1; Code 16K; Code 49; ColorCode; Compact Matrix Code; CP Code; CyberCode; d-touch; DataGlyphs; Datamatrix; Datastrip Code; Dot Code A; EZcode; Grid Matrix Code; High Capacity Color Barcode; HueCode; INTACTA.CODE; InterCode; JAGTAG; MaxiCode; mCode; MiniCode; MicroPDF417; MMCC; Nintendo e-Reader Dot code; Optar; PaperDisk; PDF417; PDMark; QR Code; QuickMark Code; SmartCode; Snowflake Code; ShotCode; SPARQCode; SuperCode; Trillcode; UltraCode; UnisCode; VeriCode; VSCode; WaterCode.

In one specific embodiment, the application 30, 30', 30'' interacts with a bar code reader application for 2D to 3D data modeling conversion. However, the present invention is not limited to a bar code reader application and other applications can also be used to practice the invention.

In one specific exemplary embodiment, a QR bar code is used for 2D to 3D data modeling conversion. However, the present invention is not limited to QR codes and other types of bar codes can also be used to practice the invention.

FIG. 6 is a block diagram 96 illustrating display of an exemplary QR bar code 98. The QR bar code 98 in FIG. 6 is a valid QR bar code generated to include the text "This is a QR for a blue window 60 inches by 30 inches."

A "QR Code" is a specific matrix barcode (or two-dimensional code), readable by dedicated QR barcode readers and camera phones. The code consists of black modules arranged in a square pattern on a white background. The information encoded can be text, URL or other data. QR codes are defined in ISO/IEC 18004:2006 Information technology--Automatic identification and data capture techniques--QR Code 2005 bar code symbology specification, 1 Sep. 2006, the contents of which are incorporated by reference.
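
For illustration only, the sketch below generates a QR bar code containing the text of FIG. 6 with the third-party "qrcode" package (an assumption; the patent does not name a QR generation library).

    # Minimal sketch: generating a QR bar code image from text.
    import qrcode

    image = qrcode.make("This is a QR for a blue window 60 inches by 30 inches")
    image.save("qr_blue_window.png")

    # A bar code reader application on a smart phone 14 or tablet computer 12
    # would decode the saved image back into the original text.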

Users with a camera-equipped smart phone 14 (or tablet computer 12, etc.) including a camera component 100 can capture a digital image of the QR Code 98, and a bar code reader application appropriate for the bar code processes the digital image of the QR Code 98. The camera component 100 is used to capture existing QR codes from print and electronic documents 102 and other sources (e.g., from other network devices, etc.).

QR codes 98 are also used to display text, contact information, connect to a wireless network, open a web page in the phone's browser, download music, communicate a social event or coupon, or initiate a communications event over the cloud communications network 18 (e.g., voice call, data call, etc.). This act of linking from physical world objects is known as a "hardlink" or "physical world hyperlinks."

For example, Google's smart phone Android operating system supports the use of QR codes by natively including the barcode scanner (e.g., Zxing, etc.) on some models, and the browser supports Uniform Resource Identifier (URI) redirection, which allows QR Codes to send metadata to existing applications on the device. Nokia's Symbian operating system is also provided with a barcode scanner, which is able to read QR codes, while mbarcode is a QR code reader for the Maemo operating system. In the Apple iOS, a QR code reader is not natively included, but hundreds of free applications are available with reader and metadata browser URI redirection capability. However, the present invention is not limited to these network device operating systems and other bar code readers and device operating systems can also be used to practice the invention.

In one embodiment, a user may scan a number of QR bar codes 98 from non-electronic information such as magazines, business cards, billboards, other non-electronic advertising, etc. A user may also scan a number of QR bar codes 98 from electronic advertising such as from web-sites, other target network devices 12, 14, 16, e-mails, text messages, instant messages, etc.

Wearable Devices

"Wearable technology" and/or "wearable devices" are clothing and accessories incorporating computer and advanced electronic technologies. Wearable network devices provide several advantages including, but not limited to: (1) Quicker access to notifications. Important and/or summary notifications are sent to alert a user to view the whole message; (2) Heads-up information. Digital eye wear allows users to display relevant information like directions without having to constantly glance down; (3) Always-on searches. Wearable devices provide always-on, hands-free searches; and (4) Recorded data and feedback. Wearable devices take telemetric data recordings and provide useful feedback for users for exercise, health, fitness, etc. activities.

FIG. 7 is a block diagram 104 illustrating wearable devices. The wearable devices include one or more processors and include, but are not limited to, wearable digital glasses 106, clothing 108 (e.g., smart ties 108', etc.), jewelry 110 (e.g., smart rings, smart earrings, etc.) and/or watches 112. The wearable devices are also wearable by animals (e.g., service dogs, pets, show animals, circus animals, etc.). However, the present invention is not limited to such embodiments and more, fewer and other types of wearable devices can also be used to practice the invention.

In one specific embodiment, the application 30, 30', 30'' interacts with wearable devices 106-112 for 2D to 3D data modeling conversion with the methods described herein. However, the present invention is not limited to this embodiment and other embodiments can also be used to practice the invention.

Unmanned Aerial Vehicles (UAVS)--Drones

An unmanned aerial vehicle (UAV) 27 (FIG. 1), commonly known as a "drone" and also referred to as a "Remotely Piloted Aircraft" (RPA) by the International Civil Aviation Organization (ICAO), is an aircraft without a human pilot aboard. There are different kinds of drones including: (1) UAS (Unmanned Air System); (2) UAV (Unmanned Aerial Vehicle); (3) RPAS (Remote Piloted Aircraft Systems) and (4) Model Aircraft. Its flight is controlled either autonomously by onboard computers or by the remote control of a pilot on the ground or in another vehicle. The typical launch and recovery method of an unmanned aircraft is by the function of an automatic system or an external operator on the ground. Historically, UAVs 27 were simple remotely piloted aircraft, but autonomous control is increasingly being employed.

The use of UAVs 27 is characterized by altitude of flight. The following types of UAVs 27 fly at different altitudes, broadly characterized as: (1) Very high altitude (VHA): above 45,000 feet (more than 12 km); (2) High altitude (HA): from 20,000 to 45,000 feet (6 to 12 km); (3) Medium altitude (MA): from 10,000 to 20,000 feet (3 to 6 km); or (4) Low altitude (LA): between a few hundred and 10,000 feet (1 to 3 km).

The specific needs of UAVs 27 include required UAV capabilities that allow them to fly in "non-segregated," air-traffic controlled airspace. Because these vehicles are unmanned, the mobile links to and from a UAV 27 carry requirements in terms of aeronautical safety. An air-traffic control (ATC) link includes full automation of communications between on-board and ground systems. A remote pilot (RP) link places additional and more strenuous constraints on the radio communication bearer(s) and systems used, not necessarily significant as regards the volume of data to be exchanged, in as much as UAVs 27 generally possess or will possess their own computerized autonomous flight management system, limiting remote pilot (RP) interventions to supervising and/or re-establishing flight procedures or choosing the most appropriate one, should any contingency arise.

The UAV 27 communicates on Aeronautical Mobile Service (AMS) wireless frequencies including, but not limited to: (a) 4400-4940 MHz; (b) 5030 (or 5010)-5090 MHz (MLS "core" band); (c) 5090-5150 MHz (MLS extension band); (d) 5150-5250 MHz; (e) 5925-6700 MHz; (f) 22.5-23.6 GHz; (g) 24.75-25.5 GHz; or (h) 27-27.5 GHz.

Most UAVs 27 have cameras, microphones and other audiovisual equipment that are used to view and collect information about objects of interest from the air. The audiovisual signals are typically sent from the UAV 27 to a remote control center for viewing by an operator.

In one specific embodiment, the application 30, 30', 30'' interacts with UAVs 27 including an application 30 for 2D to 3D data modeling conversion with the methods described herein. However, the present invention is not limited to such an embodiment and other applications can also be used to practice the invention.

Unmanned Ground Vehicle (UGV)

An unmanned ground vehicle (UGV) 29, 29' is a vehicle that operates while in contact with the ground and without an onboard human presence. UGVs 29, 29' are used for many applications where it may be inconvenient, dangerous, or impossible to have a human operator present, such as construction sites, etc. Generally, the UGV 29, 29' will have a set of sensors to observe the environment, and will either autonomously make decisions about its behavior or pass the information to a human operator at a different location who will control the vehicle through teleoperation. In one embodiment, the UGV 29, 29' is autonomous.

An "autonomous" UGV 29, 29' is an autonomous robot that operates without the need for a human controller. The vehicle uses its sensors to develop some limited understanding of the environment, which is then used by control algorithms to determine the next action to take in the context of a human-provided mission goal. This fully eliminates the need for any human to watch over the menial tasks (e.g., checking a punch list, etc.) that the UGV 29, 29' is completing. In such an embodiment, the autonomous UGV 29, 29' may read RFID tags 99 placed on selected portions of building components (e.g., FIG. 20, etc.).

Creating Three Dimensional (3D) Objects from Two Dimensional (2D) Data for Building Information Modeling (BIM)

FIGS. 8A and 8B are a flow diagram illustrating a Method 114 for creating three dimensional (3D) objects from two dimensional (2D) data. In FIG. 8A at Step 116, a first server application on a first server network device with one or more processors receives two-dimensional (2D) electronic data for a specific type of three-dimensional (3D) modeling object for a selected type of 3D modeling program from a target application on a target network device with one or more processors via a communications network. At Step 118, the first server application selects a blank generic 3D object model template for the specific type of 3D modeling object included in the received 2D electronic data. At Step 120, the first server application creates a preliminary specific 3D object model in the selected blank generic 3D object model template in a selected mark-up language with the received 2D electronic data. At Step 122, the first server application converts the created preliminary 3D object model in the selected mark-up language to a first data format in a first data file. At Step 124, the first server application sends to a library application on the server network device the first data file and a final type of 3D modeling object format for the selected type of 3D modeling program. At Step 126, the library application converts the created preliminary 3D object model into a final type of 3D modeling object for the selected type of 3D modeling program. At Step 128, the library application on the server network device sends to the target application on the target network device via the communications network the final type of 3D model object for the selected type of 3D modeling program in a second data format in a second data file.

Method 114 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment in FIG. 8A at Step 116, a first server application 30, 30', 30'' on a first server network device 24 with one or more processors receives two-dimensional (2D) electronic data 13, 15 for a specific type of three-dimensional (3D) modeling object for a selected type of 3D modeling program on a target application 30 on target network device 12, 14, 16 with one or more processors via a communications network 18.

In one embodiment, the two-dimensional (2D) electronic data includes Hyper Text Markup Language (HTML) data and/or QR bar code 98 data and/or RFID tag 99 data (FIG. 6). In another embodiment, the 2D electronic data may include an object specification file such as a PDF file, text file or other file for any product from any manufacturer. In another embodiment, the 2D electronic data includes information for new and/or non-existing products that may be produced or manufactured. In another embodiment, the 2D electronic data includes virtual products that will only be used in a simulation model. In another embodiment, the 2D electronic data may include product information from a web-page on an e-commerce site, etc. However, the present invention is not limited to such embodiments and other types of 2D electronic data can be used to practice the invention.

In one embodiment the selected type of 3D modeling program includes a Building Information Modeling (BIM) modeling program. In one specific embodiment, the BIM modeling program includes an AUTODESK REVIT program and/or an AUTOCAD and/or a VECTORWORKS program. However, the present invention is not limited to such an embodiment and other 3D modeling programs can be used to practice the invention.

At Step 118, the first server application 30, 30', 30'' selects a blank generic 3D object model template for the specific type of 3D modeling object included in received 2D electronic data 13, 15.

In one embodiment, the blank generic 3D object model template is a template for a specific type of 3D modeling object used by architects, builders, engineers, interior designers, scientists, etc. For example, the generic 3D object template used by an architect, builder, etc. may be for a door, window, beam, truss, etc. The specific type of 3D object may be a pre-hung door eight feet high, three feet wide, solid core, made of maple wood, stained light brown in color, etc. The generic types of 3D object templates used by engineers may be pipeline components, components of automobiles, trucks, boats, electronic device components, components for multi-layered boards, etc. The generic types of 3D object templates used by interior designers may be furniture, wall hangings, works of art, statues, etc. However, the present invention is not limited to such embodiments and other blank generic 3D object templates can be used to practice the invention.

In one embodiment, the blank generic 3D object templates are used to dynamically create a final specific type of 3D object. This offers several advantages over the prior art in which hundreds to thousands of final specific types of 3D objects are created and stored in libraries. For example, if a user desired a 3D model of a pre-hung door eight feet high, three feet wide, solid core, made of maple wood, stained light brown in color, such a 3D model could be created. If the dimensions or the stain color, etc. of the door were changed, additional 3D models would be created and stored in the library.

In direct contrast, in the present invention, since blank generic 3D object templates are used, anytime received 2D electronic data is changed, a new final specific type of 3D object is dynamically generated and used. Such an embodiment only requires a small number (e.g., about 50-60, etc.) of blank generic 3D object templates (e.g., door, window, beam, desk, chair, etc.). Such an embodiment requires less data storage, requires less processing power and provides ultimate flexibility, as a new final specific type of 3D object can be generated for virtually any type of 2D electronic data.

In one embodiment, the new final specific types of 3D object models created from the blank generic 3D object templates can be stored in a database or other non-volatile storage for re-use. In one embodiment, such new final specific types of 3D object models require less storage space than other types of native 3D object models used for the specific type of modeling program. However, the present invention is not limited to such embodiments and other embodiments that do not store the new final specific type of 3D object models can be used to practice the invention.

In one embodiment, the 2D electronic information further includes physical location information including, but not limited to, Global Positioning System (GPS) information, street address information, two-dimensional (2D) geo-space (e.g., X,Y) (e.g., building, floor), three-dimensional (3D) (X, Y, Z) (e.g., building, floor, floor location (e.g., room, office, desk, etc.)) or other physical location information (e.g., longitude, latitude, street address, etc.). However, the present invention is not limited to such physical location information and other physical location information can be used to practice the invention.

The Global Positioning System (GPS) is a space-based global navigation satellite system (GNSS) that provides reliable location and time information in all weather and at all times and anywhere on or near the Earth. A GPS receiver calculates its position by precisely timing signals sent by GPS satellites. A GPS receiver uses the messages it receives to determine a transit time of each message and computes a distance to each GPS satellite. These distances along with the satellites' locations are used with the possible aid of triangulation, depending on which algorithm is used, to compute a current physical position of the GPS receiver. This position is then displayed, perhaps with a moving map display (e.g., at a street level, etc.) and/or latitude and longitude and/or elevation and/or speed and/or acceleration information may also be included. Many GPS units also show derived information such as travel direction and speed, calculated from position changes. The GPS coordinates include standard GPS, GPS map, Digital GPS (DGPS) and/or other types of GPS information.

Returning to FIG. 8A at Step 120, the first server application 30, 30', 30'' creates a preliminary specific 3D object model in the selected blank generic 3D object template in a selected mark-up language with the received 2D electronic data. In one exemplary embodiment, the XML mark-up language is used. However, the present invention is not limited to such an embodiment and other mark-up and/or other non-mark-up languages can be used to practice the invention. In such an embodiment, XML is used to create an XML schema that specifically lays out the received 2D electronic data into a specific data format that is used by the library application.

An "XML schema" is a description of a type of XML document, typically expressed in terms of constraints on the structure and content of documents of that type, above and beyond the basic syntactical constraints imposed by XML itself. These constraints are generally expressed using some combination of grammatical rules governing the order of elements, Boolean predicates that the content must satisfy, data types governing the content of elements and attributes, and more specialized rules such as uniqueness and referential integrity constraints.

There are languages developed specifically to express XML schemas. The Document Type Definition (DTD) language, which is native to the XML specification, is a schema language that is of relatively limited capability, but that also has other uses in XML aside from the expression of schemas.
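
For illustration only, the sketch below lays out received 2D electronic data inside a blank generic 3D object model template as XML, as described for Step 120, using Python's standard xml.etree module; the element names, attribute names and sample values are hypothetical and are not the patent's actual schema.

    # Minimal sketch: building a preliminary specific 3D object model in XML.
    import xml.etree.ElementTree as ET

    received_2d_data = {"object_type": "door", "height_ft": 8, "width_ft": 3,
                        "core": "solid", "material": "maple", "finish": "light brown stain"}

    template = ET.Element("GenericObjectTemplate", attrib={"type": received_2d_data["object_type"]})
    for name, value in received_2d_data.items():
        parameter = ET.SubElement(template, "Parameter", attrib={"name": name})
        parameter.text = str(value)

    # Step 122 would then convert this mark-up into a text data file (e.g., a .TXT file).
    preliminary_model_xml = ET.tostring(template, encoding="unicode")
    print(preliminary_model_xml)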

At Step 122, the first server application 30', 30'' converts the created preliminary 3D object in the XML schema to a text data format in a data file with a .TXT extension. A text file is a kind of data file that is structured as a sequence of lines of electronic text. A text file exists within a file system on a network device. "Text file" refers to a type of container, while "plain text" refers to a type of content. Text files can contain plain text, but they are not limited to such. At a generic level of description, there are two kinds of computer files: text files and binary files.

Text files are used to avoid some of the problems encountered with other file formats, such as endianness (i.e., byte ordering), padding bytes, or differences in the number of bytes in a machine word. Further, when data corruption occurs in a text file, it is often easier to recover and continue processing the remaining contents. When opened by a text editor, human-readable content is presented to the user with text.

In one embodiment, text files are used to allow the 3D models to be human-readable. However, the present invention is not limited to such embodiments and other types of human-readable and/or machine readable file formats and file types can be used to practice the invention. Such text files are also required by many 3D modeling programs (e.g., AUTODESK REVIT, AUTOCAD, VECTORWORKS, MICROSTATION, ARCHICAD, etc.).

In another embodiment, at Step 122, the first server application 30', 30'' converts the created preliminary 3D object in the XML schema to a text data format in a data file with a .DOC, .RTF and/or .PDF extension.

A Document (DOC) file is created in a version of the MICROSOFT WORD word processing application prior to MICROSOFT OFFICE. .DOC files use a .DOC extension and differ from text files (.TXT extension) because they contain proprietary codes that must be opened in WORD or software that reads the WORD format.

A Rich Text Format (RTF) file is a standard formalized by the MICROSOFT Corporation for specifying formatting of documents. RTF files are actually ASCII files with special commands to indicate formatting information, such as fonts and margins.

A Portable Document Format (PDF) is a file format standardized by ADOBE SYSTEMS that is used to represent documents in a manner independent of application software, hardware, and operating system. Each PDF file encapsulates a complete description of a fixed-layout flat document, including the text, fonts, graphics, and other information needed to display it.

However, the present invention is not limited to these embodiments and other files types, layouts and file extensions can be used to practice the invention.

In FIG. 8B at Step 124, the first server application 30', 30'' sends to a library application 31 on the server network device 24 the text file and a final type of 3D modeling object format for the selected type of 3D modeling program (e.g., AUTODESK REVIT, AUTOCAD, VECTORWORKS, MICROSTATION, ARCHICAD, etc.).

In one embodiment, the library application is a Dynamic Link Library (DLL) application. However, the present invention is not limited to such an embodiment and other types of library applications can be used to practice the invention.

A Dynamic-link library (DLL) is Microsoft's implementation of the shared library concept in the MICROSOFT WINDOWS and OS/2 operating systems. These libraries usually have the file extension .DLL, .OCX (for libraries containing ActiveX controls), or .DRV (for legacy system drivers). The file formats for DLLs are the same as for WINDOWS EXE files, that is, Portable Executable (PE) for 32-bit and 64-bit Windows, and New Executable (NE) for 16-bit WINDOWS. As with EXEs, DLLs can contain code, data, and resources, in any combination.

Data files with the same file format as a DLL, but with different file extensions and possibly containing only resource sections, can be called resource DLLs. Examples of such DLLs include icon libraries, sometimes having the extension .ICL, and font files, having the extensions .FON and .FOT.

In another embodiment, the library application 31 includes Dynamic Library Loading (DLL). This is a mechanism by which a computer program can, at run time, dynamically load a library (or other binary) into memory, retrieve the addresses of functions and variables contained in the library, execute those functions or access those variables, and unload the library from memory. Unlike static linking and load-time linking, this mechanism allows a computer program to start up in the absence of these libraries, to discover available libraries, and to potentially gain additional functionality. However, the present invention is not limited to such an embodiment and other types of library applications can be used to practice the invention.
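
For illustration only, the sketch below shows run-time dynamic library loading from Python via the standard ctypes module; the library name and exported function are hypothetical placeholders standing in for a converter library such as library application 31.

    # Minimal sketch: loading a shared library and calling an exported function at run time.
    import ctypes

    library = ctypes.CDLL("libmodelconvert.so")      # hypothetical shared library, loaded at run time
    library.convert_to_rfa.restype = ctypes.c_int    # declare the return type of a hypothetical export

    result = library.convert_to_rfa(b"preliminary_model.txt", b"final_model.rfa")
    print("conversion status:", result)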

At Step 126, library application 31 converts the created preliminary 3D object model into a final type of 3D modeling object 13', 15' for the selected type of 3D modeling program (e.g., AUTODESK REVIT, AUTODESK INVENTOR, AUTOCAD, SKETCHUP, VECTORWORKS, MICROSTATION, ARCHICAD, SOLIDWORKS, PROE, etc.). In one embodiment, the library application 31 creates a four dimensional (4D) or higher dimensional modeling object. However, the present invention is not limited to this embodiment and other embodiments may be used to practice the invention.

4D BIM is a term widely used in the CAD industry that refers to the intelligent linking of individual 3D CAD components or assemblies with time or schedule-related information. The use of the term 4D is intended to refer to the fourth dimension: time (i.e., 4D is 3D+schedule (time)).

The construction of 4D models enables the various participants (e.g., from architects, designers, contractors to owners) of a construction project to visualize the entire duration of a series of events and display the progress of construction activities through the lifetime of the project. This BIM-centric approach towards project management has a very high potential to improve the management and delivery of construction projects of any size or complexity.

5D BIM is also a term widely used in the CAD industry, which refers to the intelligent linking of individual 3D CAD components or assemblies with schedule (time, 4D) constraints and cost-related information (5D). The use of the term 5D is intended to refer to the addition of 4D: time and 5D: cost to the 3D model (i.e., 5D is 3D+schedule (4D time)+cost).

The construction of the 5D models enables the various participants (e.g., from architects, designers, contractors to owners) of any construction project, to visualize the progress of construction activities and its related costs over time.

AUTODESK REVIT is Building Information Modeling (BIM) software for architects, structural engineers, MEP engineers, designers and contractors. It allows users to design a building and structure and its components in 3D, annotate the model with 2D drafting elements and access building information from the building model's database. REVIT is 4D BIM capable with tools to plan and track various stages in the building's lifecycle, from concept to construction and later demolition.

AUTODESK INVENTOR is 3D mechanical CAD design software for creating 3D digital prototypes used in the design, visualization and simulation of products.

AUTOCAD is a software application for 2D and 3D computer-aided design (CAD) and drafting. It has been available since 1982 as a desktop application and since 2010 as a mobile web- and cloud-based application, currently marketed as AUTOCAD 360.

VECTORWORKS is a computer-aided design (CAD) and Building Information Modeling (BIM) software program developed by Nemetschek that is used for drafting, technical drawing and 3D modeling. VECTORWORKS offers 2D, 3D, production management, and presentation capabilities for all phases of the design process.

BENTLEY SYSTEMS, INC. is a software company that produces solutions for the design, construction and operation of infrastructure. The company's software serves the building, plant, civil, and geospatial markets in the areas of architecture, engineering, construction (AEC) and operations. Their software solutions are used to design, engineer, build, and operate large constructed assets such as roadways, railways, bridges, buildings, industrial and power plants and utility networks.

BENTLEY'S principal software solution is MICROSTATION. MICROSTATION is a desktop 2D/3D CAD platform upon which BENTLEY and third-party software companies build more specific solutions. For example, BENTLEY MAP is an extension from BENTLEY that runs on top of MICROSTATION, adding Geographic Information System (GIS) and spatial capabilities to the CAD program.

The latest versions of MICROSTATION are released solely for MICROSOFT WINDOWS operating systems, but historically MICROSTATION was available for APPLE MACINTOSH platforms and a number of UNIX-like operating systems. MICROSTATION is the platform architectural and engineering software package developed by BENTLEY SYSTEMS, Incorporated. Among a number of things, it generates 2D/3D vector graphic objects and elements.

BENTLEY is also a provider of Building information modeling (BIM) solutions for the Architecture, Structural, Mechanical and Electrical engineering disciplines. BENTLEY also provides GENERATIVE COMPONENTS, a parametric modeling solution used primarily by architects and engineers in building design.

ARCHICAD is an architectural BIM CAD software for APPLE MACINTOSH and MICROSOFT WINDOWS developed by the Hungarian company GRAPHISOFT. ARCHICAD offers specialized solutions for handling all common aspects of aesthetics and engineering during the whole design process of the built environment (buildings, interiors, urban areas, etc.).

Development of ARCHICAD started in 1982 for the original APPLE MACINTOSH. ARCHICAD is recognized as the first CAD product on a personal computer able to create both 2D drawings and parametric 3D geometry. In its debut in 1987, ARCHICAD also became the first implementation of BIM under GRAPHISOFT's "Virtual Building" concept.

SOLIDWORKS is a 3D mechanical CAD program that runs on MICROSOFT WINDOWS and is developed by DASSAULT SYSTEMES SOLIDWORKS CORP., a subsidiary of DASSAULT SYSTEMES, S. A. (Velizy, France).

PTC CREO, formerly known as PRO/ENGINEER or PRO/E, is a parametric, integrated 3D CAD/CAM/CAE solution created by PARAMETRIC TECHNOLOGY CORPORATION (PTC). It was the first to market with parametric, feature-based, associative solid modeling software. The application runs on the MICROSOFT WINDOWS platform, and provides solid modeling, assembly modeling and drafting, finite element analysis, direct and parametric modeling, sub-divisional and NURBS surfacing, and NC and tooling functionality for mechanical engineers.

SKETCHUP (Formerly: GOOGLE SKETCHUP) is a 3D modeling program for applications such as architectural, interior design, civil and mechanical engineering, film, and video game design. A freeware version, SKETCHUP MAKE, and a paid version with additional functionality, SKETCHUP PRO, are available.

There is an online Open source repository of free-of-charge model assemblies (e.g., windows, doors, automobiles, etc.), 3D WAREHOUSE, to which users may contribute models. The program includes drawing layout functionality, allows surface rendering in variable "styles", supports third-party "plug-in" programs hosted on a site called Extension Warehouse to provide other capabilities (e.g., near photo-realistic rendering), and enables placement of its models within GOOGLE EARTH.

SKETCHUP can export 3D to the Digital Asset Exchange, .DAE, and GOOGLE EARTH's Keyhole Markup Language, .KMZ, file formats. The Pro version extends exporting support to include the AUTOCAD 3D STUDIO DOS, .3DS, AUTOCAD DRAWING, .DWG, AUTOCAD DXF (Drawing Interchange Format, or Drawing Exchange Format), .DXF, KAYDARA Filmbox, .FBX, Object geometry definition, .OBJ, AUTODESK Softimage, .XSI, and Virtual Reality Modeling Language, .WRL, file formats. GOOGLE SKETCHUP can also save elevations or renderings of the model, called "screenshots", as Bitmap, .BMP, Portable Network Graphics, .PNG, JPEG, .JPG, or Tagged Image File Format, .TIF, with the Pro version also supporting Portable Document Format, .PDF, Encapsulated Postscript, .EPS and .EPX, Drawing, .DWG, and AUTOCAD Drawing Exchange Format, .DXF.

At Step 128, library application 31 on the server network device 24 sends to the target application 30 on the target network device 12, 14, 16 via the communications network 18 the final type of 3D model object 13', 15' for the selected type of 3D modeling program (e.g., AUTODESK REVIT, AUTOCAD, VECTORWORKS etc.) in a second data format in a second data file.

In such an embodiment, the second data format and the second data files include an AUTODESK REVIT 3D modeling object in an AUTODESK REVIT formatted file with a .RFA file extension and/or .RVT file extension and/or an AUTOCAD 3D modeling object in an AUTOCAD formatted file with a .DWG file extension.

In another embodiment, the second data format and the second data files include files used by VECTORWORKS, MICROSTATION, ARCHICAD, SOLIDWORKS and PROE programs.

In another embodiment, the second data format and the second data files includes text and document files with a .TXT, .DOC, .RTF and/or a .PDF extensions. Such second data files may also include product data sheets and other types of informational files.

However, the present invention is not limited to such embodiments and other 3D/4D/5D modeling objects and file formats can be used to practice the invention.

An AUTODESK REVIT Family File is stored in an RFA format and is affixed with a .RFA extension. These RFA files are generally classified as data files that include one or more 3D models that can be imported into a three dimensional scene and were created and saved using the Revit Family Editor. RFA files contain BIM (Building Information Modeling) data and require Autodesk Revit software. These files are also known as an AUTODESK REVIT file. The AUTODESK REVIT software is used by architects and engineers to design and model. The REVIT model is based on a compilation of items called "families." The compiled items refer to the parametric objects such as 3D building objects and two dimensional drafting objects.

RVT file formatted files with a .RVT extension are data files primarily associated with VERSAPRO Reference View Table. RVT files are also associated with AUTODESK REVIT Design Setup File, INCITE Media Assistant File, IEX Workforce Management Report, APACHE RIVET Tcl File and FILEVIEWPRO.

DWG file formatted files (DraWinG) with a .DWG extension are a binary file format used for storing two and three dimensional design data and metadata. It is the native format for several CAD packages including DRAFTSIGHT, AUTOCAD, INTELLICAD (and its variants) and CADDIE. In addition, DWG is supported non-natively by many other CAD applications. The .BAK (drawing backup), .DWS (drawing standards), .DWT (drawing template) and .SV$ (temporary automatic save) files are also DWG files.

MCD and/or VWX formatted files with a .MCD and/or a .VWX extension are file formats used for storing two and three dimensional design data and metadata for VECTORWORKS.

In addition, using the present invention, two or more companies can facilitate work process interoperability between their applications through supporting the reciprocal use of available Application Programming Interfaces (APIs) and the new 3D models created herein.

In one embodiment, the final type of 3D model object 13', 15' for the selected type of 3D modeling program is automatically installed in the target network device. In another embodiment, the final type of 3D model object 13', 15' for the selected type of 3D modeling program is manually installed on the target network device. However, the present invention is not limited to such embodiments, and other embodiments can be used to practice the invention.

In one embodiment at Step 128, the final type of 3D model object 13', 15' for the selected type of 3D modeling program is sent to another server network device (e.g., 22, etc.) via the communications network instead of back to the target network device 12, 14, 16. In such an embodiment a user may request a new 3D model of an object and the new 3D model may then be sent to and installed on a company system of the user to be used and re-used by other persons in the company. However, the present invention is not limited to such embodiments, and other embodiments can be used to practice the invention.

In one embodiment, Step 128 further includes sending the text file that was created at Step 124 as some existing 3D modeling programs can utilize the text file created. However, the present invention is not limited to such embodiments, and other embodiments can be used to practice the invention.

In another embodiment, Step 128 further includes sending additional text files, .DOC files, .RTF files and/or .PDF files specifically created to include additional 3D or higher dimensional modeling information and/or product and/or project information. However, the present invention is not limited to such embodiments, and other embodiments can be used to practice the invention.

In one embodiment, Method 114 is used to model every item used in building a structure or vehicle from the fasteners (e.g., nails, screws, bolts, etc.) to the paint, to the structural components etc.

In one embodiment, Method 114 is used to provide seamless extensions and additions to existing 3D modeling programs (e.g., AUTODESK REVIT, AUTODESK INVENTOR, AUTOCAD, SKETCHUP, VECTORWORKS, MICROSTATION, ARCHICAD, SOLIDWORKS, PROE, etc.) that are very useful to users of such programs.

In one embodiment, Method 114 is used to create new 2D and/or 3D and/or 4D and/or 5D model objects from received 2D data. In such an embodiment, the steps of Method 114 are practiced with blank generic 2D object model templates and/or blank generic 4D object model templates. In such embodiments, the final type of 2D and/or 4D and/or 5D model object for the selected type of 3D modeling program is returned to the target network device.

However, the present invention is not limited to 2D, 3D, 4D and/or 5D model objects and higher dimension model objects can also be used to practice the invention.

However, the present invention is not limited to such embodiments, and other embodiments can be used to practice the invention.

In one exemplary embodiment, Method 114 further includes the steps of Method 130 illustrated in FIG. 9. However, the present invention is not limited to these additional steps and the invention can be practiced with and/or without the additional steps in FIG. 9.

FIG. 9 is a flow diagram illustrating a Method 130 for creating three dimensional (3D) objects from two dimensional (2D) data.

In FIG. 9 at Step 132, the library application 31 on the server network device 24 sends the final type of 3D object model 13' to a manufacturing application 33 on a manufacturing server network device 26 with one or more processors via the communications network 18. The manufacturing server network device includes a robot 41, 3D printer 39, or manufacturing machine 35. At Step 134, the manufacturing server network device 26 automatically manufactures an actual physical 3D object 37 (e.g., door 37, FIG. 1, etc.) from the final type of 3D object model 13', 15'.

Creating Three Dimensional (3D) Objects from Two Dimensional (2D) Data for Building Information Modeling (BIM) with Cloud Computing

The present invention also provides the creation of 3D modeling objects 13', 15' on a cloud communications network 18.

FIGS. 10A and 10B are a flow diagram illustrating a Method 136 for creating three dimensional (3D) objects from two dimensional (2D) data with cloud computing. In FIG. 10A at Step 138, a cloud server application on a cloud network device with one or more processors receives two-dimensional (2D) electronic data for a specific type of three-dimensional (3D) object for a selected type of 3D modeling program from a target application on a target network device with one or more processors via a cloud communications network comprising one or more public communication networks, one or more private networks, one or more community networks and one or more hybrid networks. At Step 140, the cloud server application stores the received 2D electronic data into a first cloud storage object. At Step 142, the cloud server application selects a blank generic 3D object template for the specific type of 3D object included in the received 2D electronic data. At Step 144, the cloud server application creates a preliminary specific 3D object in the selected blank generic 3D object template in a selected mark-up language with the received 2D electronic data stored in the first cloud storage object. In FIG. 10B at Step 146, the cloud server application converts the created preliminary 3D object in the selected mark-up language to a first data format in a first data file. At Step 148, the cloud server application sends to a library application on the server network device the first data file and a final type of 3D object format for the selected type of 3D modeling program. At Step 150, the library application converts the created preliminary 3D object into a final type of 3D object for the selected type of 3D modeling program. At Step 152, the library application on the server network device sends to the target application on the target network device via the cloud communications network the final type of 3D object for the selected type of 3D modeling program in a second data format in a second data file stored in a second cloud storage object.
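
By way of a non-limiting illustration only, the cloud pipeline of Steps 138-152 can be sketched in a few lines of Python. All names below (e.g., CloudStorageObject, select_blank_template, the "RVT" format tag) are hypothetical stand-ins and are not part of any existing 3D modeling program or cloud service; the sketch simply mirrors the order of the steps.

    import json
    import xml.etree.ElementTree as ET
    from dataclasses import dataclass, field

    @dataclass
    class CloudStorageObject:                      # stands in for cloud storage object 82
        name: str
        payload: dict = field(default_factory=dict)

    def receive_2d_data(raw: str) -> dict:
        # Step 138: 2D electronic data arrives from the target application
        return json.loads(raw)

    def select_blank_template(object_type: str) -> ET.Element:
        # Step 142: blank generic 3D object template for the requested type
        root = ET.Element("object3d", attrib={"type": object_type})
        ET.SubElement(root, "geometry")
        ET.SubElement(root, "parameters")
        return root

    def create_preliminary_object(template: ET.Element, data_2d: dict) -> ET.Element:
        # Step 144: populate the blank template with the received 2D data
        params = template.find("parameters")
        for key, value in data_2d.items():
            ET.SubElement(params, "param", attrib={"name": key, "value": str(value)})
        return template

    def convert_to_first_format(preliminary: ET.Element) -> str:
        # Step 146: serialize the mark-up object to a first (text) data format
        return ET.tostring(preliminary, encoding="unicode")

    def convert_to_final_format(text_form: str, target_format: str) -> bytes:
        # Step 150: a real library application would convert the text form to the
        # native format of the selected 3D modeling program; here we only tag it
        return f"{target_format}\n{text_form}".encode("utf-8")

    # Steps 138-152, end to end
    first_storage = CloudStorageObject("first", receive_2d_data('{"width": 36, "height": 80}'))
    preliminary = create_preliminary_object(select_blank_template("door"), first_storage.payload)
    second_storage = CloudStorageObject(
        "second", {"final": convert_to_final_format(convert_to_first_format(preliminary), "RVT")})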

Method 136 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment in FIG. 10A at Step 138, a cloud server application 30', 30'' on cloud network device 24 with one or more processors receives two-dimensional (2D) electronic data 13, 15 for a specific type of three-dimensional (3D) object for a selected type of 3D modeling program from a target application 30 on target network device 12, 14, 16 with one or more processors via a cloud communications network 18 comprising one or more public communication networks 76, one or more private networks 72, one or more community networks 74 and one or more hybrid networks 78.

At Step 140, the cloud server application 30', 30'' stores the received 2D electronic data into a first cloud storage object 82.

At Step 142, the cloud server application 30', 30'' selects a blank generic 3D object template for the specific type of 3D object included in received 2D electronic data.

At Step 144, the cloud server application 30', 30'' creates a preliminary specific 3D object in the selected blank generic 3D object template in a selected mark-up language (e.g., XML, etc.) with the received 2D electronic data stored in the first cloud storage object 82.

In FIG. 10B at Step 146, the cloud server application 30', 30'' converts the created preliminary 3D object in the selected mark-up language to a first data format in a first data file (e.g., a text data file, etc.).

At Step 148, the cloud server application 30', 30'' sends to a library application 31 on the server network device 24 the first data file and a final type of 3D object format for the selected type of 3D modeling program.

At Step 150, the library application 31 converts the created preliminary 3D object into a final type of 3D modeling object 13', 15' for the selected type of 3D modeling program.

At Step 152, the library application 31 on the server network device 24 sends to the target application 30 on the target network device 12, 14, 16 via the cloud communications network 18 the final type of 3D object for the selected type of 3D modeling program in a second data format in a second data file stored in a second cloud storage object 82.

FIG. 11 is a block diagram illustrating a data flow 154 for Method 114 of FIG. 8.

The methods and system described herein create three dimensional (3D) models from two dimensional (2D) data for building information modeling (BIM) and for other types of modeling. The method and system allow new 2D, 3D and higher dimensional models to be created for existing 3D modeling programs (e.g., AUTODESK REVIT, AUTODESK INVENTOR, AUTOCAD, SKETCHUP, VECTORWORKS, MICROSTATION, ARCHICAD, SOLIDWORKS, PROE, etc.).

The new models are used to enhance and extend existing 3D modeling programs. The new models can also be used to directly create physical objects (e.g., windows, doors, etc.) represented by the new models with robots, 3D printers and manufacturing machines.

Creating New Composite 3D Object Models

FIGS. 12A and 12B are a flow diagram illustrating a Method 156 for creating composite three-dimensional (3D) object models. In FIG. 12A at Step 158, a library application on a first server network device with one or more processors receives via a communications network from plural other network devices each with one or more processors a set of plural 3D object models for plural different manufacturers of 3D objects. At Step 160, the library application saves the received set of plural 3D object models in a library associated with the library application. At Step 162, the library application on the first server network device receives from the plural other server network devices a set of rules and parameters that must be used to create a new composite 3D object model from the received set of plural 3D object models. Selected ones of the rules and parameters include physical limitations and constraints for combining the plural 3D object models into a new composite 3D object model for a new physical 3D object that has not previously existed. At Step 164, the library application saves the received set of rules and parameters in the library associated with the library application. At Step 166, the library application receives, via a first server application on the first server network device, a request message from a target application on a target network device with one or more processors via the communications network for creating a first new composite 3D object model from first plural different 3D object models from first plural different manufacturers for a first selected type of 3D modeling program. In FIG. 12B at Step 168, a test is conducted to determine from the library application with the received set of rules and parameters in the library associated with the library application whether the requested first new composite 3D object model can be created. If at Step 168 the requested first new composite 3D object model can be created, then at Step 170 the library application selects a blank composite 3D object template for creating the first new composite 3D object model. At Step 172, the library application extracts from the library associated with the library application the first plural different types of 3D object models from the first plural different manufacturers. At Step 174, the library application creates the first new composite 3D object model in the selected blank composite 3D object model template, including a description of a 3D object that did not previously exist and 3D components from the first plural different types of 3D object models from the first plural different manufacturers. At Step 176, the library application sends via the first server application on the first server network device the created first new composite 3D object model for the selected type of 3D modeling program to the target application on the target network device via the communications network.

If at Step 168 the library application determines with the received set of rules and parameters in the library associated with the library application that the requested first new composite 3D object model cannot be created, then at Step 178 the library application sends a response message via the first server application on the first server network device to the target application on the target network device via the communications network that the first new composite 3D object model cannot be created.
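
By way of a non-limiting illustration only, the following Python sketch mirrors Steps 158-178 under assumed data structures; the names (ObjectModel, LIBRARY, RULES, create_composite) are hypothetical and the rule shown is a placeholder, not an actual manufacturer rule.

    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class ObjectModel:                         # one received 3D object model 19
        manufacturer: str
        kind: str
        attributes: Dict[str, float]

    LIBRARY: Dict[str, ObjectModel] = {}       # Step 160: saved object models
    RULES: List[Callable[[List[ObjectModel]], bool]] = []   # Step 164: saved rules

    def save_model(model_id: str, model: ObjectModel) -> None:
        LIBRARY[model_id] = model

    def save_rule(rule: Callable[[List[ObjectModel]], bool]) -> None:
        RULES.append(rule)

    def create_composite(model_ids: List[str]) -> dict:
        # Steps 166-178: apply every rule, then either build the new
        # composite model 23 or report that it cannot be created.
        components = [LIBRARY[m] for m in model_ids]
        if not all(rule(components) for rule in RULES):
            return {"status": "cannot be created"}
        return {"status": "created",
                "components": [(c.manufacturer, c.kind) for c in components]}

    save_model("basin-A", ObjectModel("Manufacturer-A", "sink basin", {"hole_in": 2.0}))
    save_model("faucet-B", ObjectModel("Manufacturer-B", "faucet", {"pipe_in": 2.0}))
    save_rule(lambda parts: all(p.attributes for p in parts))   # placeholder rule
    print(create_composite(["basin-A", "faucet-B"]))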

Method 156 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment in FIG. 12A at Step 158, a library application 31 on a first server network device 24 with one or more processors receives via a communications network 18 from plural other network devices 12, 14, 16, 20, 22, 26 each with one or more processors a set of plural 3D object models 19 for plural different manufacturers of 3D objects.

Step 158 also includes receiving on the library application 31 plural messages from plural different vendors of 3D objects. A "vendor" is a party in a supply chain that makes goods and services available to companies or consumers. A vendor may or may not actually manufacture any goods. A vendor may also create new actual physical 3D objects that have never existed before using existing 3D components from plural manufacturers.

In one embodiment, the plural 3D object models 19 are for actual physical 3D objects. In another embodiment, the 3D object models 19 are for virtual 3D objects. However, the present invention is not limited to such an embodiment and other embodiments can be used to practice the invention.

At Step 160, the library application 31 saves the received set of plural 3D object models 19 in a library 24' associated with the library application. In one embodiment, the library 24' is a database 24' associated with the first server network device 24. In another embodiment, the library 24' is non-transitory and non-volatile storage (e.g., a hard drive, flash storage, etc.) on the first server network device 24 and/or is a cloud storage object 82.

At Step 162, the library application 31 on the first server network device 24 receives from the plural other server network devices 12, 14, 16, 20, 22, 24 a set of rules and parameters 21 that must be used to create a new composite 3D object model 23 from the received set of plural 3D object models 19. Selected ones of the rules and parameters 21 include physical limitations and constraints for combining the plural 3D object models 19 into a new composite 3D object model 23 for a new physical 3D object that has not previously existed.

At Step 164, the library application 31 saves the received set of rules and parameters 21 in the library 24' associated with the library application 31.

At Step 166, the library application 31 receives, via a first server application 30', 30'' on the first server network device 24, a request message from a target application 30 on a target network device 12, 14, 16 with one or more processors via the communications network 18 for creating a first new composite 3D object model 23 from first plural different 3D object models 19 from first plural different manufacturers for a first selected type of 3D modeling program.

In FIG. 12B at Step 168, a test is conducted by the library application 31 to determine, with the received set of rules and parameters 21 in the library 24' associated with the library application 31, whether the requested first new composite 3D object model 23 can be created.

In one specific exemplary embodiment, the library application 31 only applies the received set of rules and parameters 21 if an actual 3D physical object is going to be created from the first new composite object model 23. In such an exemplary embodiment, the actual 3D physical object can only be created if its 3D components are of a proper size, shape, color, etc. to be combined into the actual 3D physical object. For example, if a new sink is going to be created and the sink basin includes holes pre-drilled with two-inch holes for a first faucet with two-inch pipes, then such a sink could not be created directly using a faucet with three-inch pipes. However, in this exemplary embodiment, if the first new composite object model 23 is only going to be used in a virtual environment (i.e., only on a computer, etc.), the library application 31 will allow any type of new composite 3D object model 23 to be created and used only in the virtual environment.

In this example, the sink basin with two-inch holes could be created with the faucet with three-inch pipes and used in the virtual environment only. Such virtual-only composite 3D object models are allowed because a manufacturer and/or vendor may desire to physically modify the actual 3D objects to allow creation of a new 3D object. For the sink discussed, a manufacturer of the sink basin with two-inch holes may like the faucet with three-inch pipes so much that this manufacturer is willing to drill three-inch holes in its own sink basin to accept the faucet. Other virtual-only new 3D object models may not be bound by any sets of rules and parameters for other reasons. However, the present invention is not limited to such an embodiment and the invention can be practiced with and/or without this exemplary specific embodiment.
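
By way of a non-limiting illustration only, the sink example above can be expressed as a single physical-compatibility rule plus a virtual-only bypass; the function names and the two-inch/three-inch figures below come from the example and are otherwise hypothetical.

    def pipes_fit_basin(basin_hole_in: float, faucet_pipe_in: float) -> bool:
        # A faucet pipe cannot be wider than the pre-drilled basin hole.
        return faucet_pipe_in <= basin_hole_in

    def allow_composite(basin_hole_in: float, faucet_pipe_in: float,
                        virtual_only: bool) -> bool:
        # Virtual-only composites bypass the physical rules and parameters,
        # since a manufacturer may later modify the physical basin.
        if virtual_only:
            return True
        return pipes_fit_basin(basin_hole_in, faucet_pipe_in)

    assert allow_composite(2.0, 3.0, virtual_only=True)        # allowed virtually
    assert not allow_composite(2.0, 3.0, virtual_only=False)   # rejected physically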

If at Step 168 the requested first new composite 3D object model 23 can be created, then at Step 170 the library application 31 selects a blank composite 3D object template for creating the first new composite 3D object model 23.

At Step 172, the library application 31 extracts from the library 24' associated with the library application the first plural different types of 3D object models 19 from the first plural different manufacturers.

At Step 174, the library application 31 creates the first new composite 3D object model 23 in the selected blank composite 3D object model template, including a description of a 3D object that did not previously exist and 3D components from the first plural different types of 3D object models 19 from the first plural different manufacturers.

For example, the created first new composite 3D object model 23 may be for a sink with individual components from manufacturer-A, manufacturer-B, manufacturer-C and manufacturer-D. Such a sink did not previously exist because manufacturer-A used only its own components to produce its own sinks.

The created new composite 3D object model 23 can be used as a 3D object model 23 to create a virtual object used in a 3D modeling program and/or as a 3D object model 23 used to create an actual physical object (e.g., door, window, sink, etc.).

At Step 176, the library application 31 sends via the first server application 30', 30'' on the first server network device 24 the created first new composite 3D object model 23 for the selected type of 3D modeling program (e.g., AUTODESK REVIT, AUTODESK INVENTOR, AUTOCAD, SKETCHUP, VECTORWORKS, MICROSTATION, ARCHICAD, SOLIDWORKS, PROE, etc.) to the target application 30 on the target network device 12, 14, 16, via the communications network 18.

If at Step 168 the library application 31 determines with the received set of rules and parameters 21 in the library 24' associated with the library application 31 that the requested first new composite 3D object model 23 cannot be created, then at Step 178 the library application 31 sends a response message via the first server application 30', 30'' on the first server network device 24 to the target application 30 on the target network device 12, 14, 16, 20, 22, 26 via the communications network 18 that the first new composite 3D object model 23 cannot be created.

FIG. 13 is a flow diagram illustrating a Method 180 for creating composite three-dimensional (3D) object models. At Step 182, a first server application on a first server network device with one or more processors sends a created first new composite 3D object model for a selected type of 3D modeling program to a manufacturing application on a manufacturing server network device with one or more processors via a communications network. At Step 184, the manufacturing server network device automatically manufactures an actual physical 3D object from the created first new composite 3D object model.

Method 180 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment at Step 182, a first server application 30', 30'' on a first server network device 24 with one or more processors sends a created first new composite 3D object model 23 for a selected type of 3D modeling program (e.g., AUTODESK REVIT, AUTOCAD, SKETCHUP, VECTORWORKS, MICROSTATION, ARCHICAD, SOLIDWORKS, PROE, etc.) to a manufacturing application 33 on manufacturing server network device 26 with one or more processors via a communications network 18, 18'.

At Step 184, the manufacturing server network device 26 automatically manufactures an actual physical 3D object 37 from the created first new composite 3D object model 23.

The manufacturing server network device 26 includes a robot 41, 3D printer 39, or manufacturing machine 35 that automatically manufactures the actual physical 3D object 37 (e.g., door 37, FIG. 1, etc.) from the created first new composite 3D object model 23. However, the present invention is not limited to such an embodiment and other embodiments can be used to practice the invention.
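
By way of a non-limiting illustration only, the hand-off of Step 184 to whichever fabrication device the manufacturing server network device 26 controls can be sketched as follows; the device classes and the fabricate interface are hypothetical and do not correspond to any actual robot, 3D printer or machine controller API.

    from typing import Protocol

    class FabricationDevice(Protocol):
        def fabricate(self, model: dict) -> str: ...

    class Printer3D:                       # stands in for 3D printer 39
        def fabricate(self, model: dict) -> str:
            return f"printed a {model['kind']}"

    class Robot:                           # stands in for robot 41
        def fabricate(self, model: dict) -> str:
            return f"assembled a {model['kind']}"

    def manufacture(model: dict, device: FabricationDevice) -> str:
        # Step 184: the manufacturing server hands the composite model to
        # whichever fabrication device it controls.
        return device.fabricate(model)

    print(manufacture({"kind": "door"}, Printer3D()))
    print(manufacture({"kind": "window frame"}, Robot()))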

FIG. 14 is a flow diagram illustrating a Method 186 for creating composite three-dimensional (3D) object models. At Step 188, a library application automatically creates from a created first new composite 3D object model a set of architectural drawings, a set of shop drawings, or a set of manufacturing drawings. At Step 190, the library application on the first server network device sends via the communications network to the target application on the target network device the created set of architectural drawings, set of shop drawings, or set of manufacturing drawings.

Method 186 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment at Step 188, a library application 31 automatically creates from a created first new composite 3D object model 23 a set of architectural drawings 43, a set of shop drawings 43', or a set of manufacturing drawings 43''.

At Step 190, the library application 31 on the first server network device 24 sends via the communications network 18 to the target application 30, 30', 30'' on the target network device 12, 14, 16, 20, 22, 26 the created set of architectural drawings 43, set of shop drawings 43', or set of manufacturing drawings 43''.

In another embodiment the library application 31 on the first server network device 24 sends via the communications network 18 the created set of architectural drawings 43, set of shop drawings 43', or set of manufacturing drawings 43'' to a printer to automatically print a copy. However, the present invention is not limited to such an embodiment and other embodiments can be used to practice the invention.

FIG. 15 is a flow diagram illustrating a Method 192 for creating composite three-dimensional (3D) object models. At Step 194, a library application on the first server network device provides via the communications network to plural other network devices each with one or more processors access to the received set of plural 3D object models for plural different manufacturers saved in the library associated with the library application. At Step 196, the library application on the first server network device provides via the communications network to the plural other network devices the received set of rules and parameters saved in the library associated with the library application. At Step 198, the library application accepts requests from the plural other network devices via the communications network to create new composite 3D object models for new physical 3D objects or virtual 3D objects that had not previously existed.

Method 192 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment at Step 194, a library application 31 on the first server network device 24 provides via the communications network 18 to plural other network devices 12, 14, 16, 20, 22, 28 each with one or more processors access to the received set of plural 3D object models 19 for the plural different manufacturers saved in the library 24' associated with the library application 31.

At Step 196, the library application 31 on the first server network device 24 provides via the communications network 18 to the other network devices 12, 14, 16, 20, 22, 28 the received set of rules and parameters 21 saved in the library associated with the library application 31.

At Step 198, the library application 31 accepts requests from the plural other network devices 12, 14, 16, 20, 22, 26 via the communications network 18 to create new composite 3D object models 23 for new physical 3D objects or virtual 3D objects that had not previously existed.

FIG. 16 is a flow diagram illustrating a Method 200 for creating composite three-dimensional (3D) object models. At Step 202, a first server application sends a created first new composite 3D object model for a selected type of 3D modeling program to a vendor application on a vendor server network device with one or more processors via the communications network. At Step 204, an actual physical 3D object that had not previously existed is created from the created first new composite 3D object model using plural different actual 3D objects from plural different manufacturers or plural different vendors.

Method 200 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment at Step 202, a first server application sends a created first new composite 3D object model 23 for a selected type of 3D modeling program to a vendor application 33' on a vendor server network device (e.g., 24, etc.) with one or more processors via the communications network 18. At Step 204, an actual physical 3D object 37 that had not previously existed is created from the created first new composite 3D object model 23 using plural different actual 3D objects from plural different manufacturers or plural different vendors.

FIG. 17 is a block diagram 206 illustrating an exemplary three-dimensional (3D) object model 208 for a portion of a building.

FIG. 18 is a flow diagram illustrating a Method 218 for creating composite higher dimensional object models from composited 3D object models. At Step 220, a library application receives time information and cost information for a created new composite 3D object model from another server network device with one or more processors via a communications network. At Step 222, the library application creates a new composite fourth dimension (4D) object model or fifth dimension (5D) object model, wherein the new composite 4D object model includes the created first new composite 3D object model plus the received time information and the new composite 5D model includes the created new composite 4D model plus the received cost information.

Method 218 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment at Step 220, the library application 31 receives time information and cost information for a created new composite 3D object model 23 from another network device 12, 14, 16, 20, 22, 26 with one or more processors via the communications network 18.

At Step 222, the library application 31 creates a new composite fourth dimension (4D) object model or fifth dimension (5D) object model. The new composite 4D object model includes the created first new composite 3D object model 23 plus the received time information and the new composite 5D model includes the created new composite 4D model plus the received cost information.
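
By way of a non-limiting illustration only, the layering of Step 222 (4D = 3D plus time information, 5D = 4D plus cost information) can be sketched with three small data structures; the names and example values are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Composite3D:
        components: list

    @dataclass
    class Composite4D:
        model_3d: Composite3D
        schedule: dict            # received time information

    @dataclass
    class Composite5D:
        model_4d: Composite4D
        costs: dict               # received cost information

    # Step 222: 4D = 3D plus time, 5D = 4D plus cost
    model_3d = Composite3D(components=["basin", "faucet"])
    model_4d = Composite4D(model_3d, schedule={"install": "week 12"})
    model_5d = Composite5D(model_4d, costs={"install": 450.00})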

The methods and system described herein provide the creation of new composite 3D and higher dimensional models from plural different 3D models from plural different manufacturers for existing 3D modeling programs (e.g., AUTODESK REVIT, AUTOCAD, SKETCHUP, VECTORWORKS, MICROSTATION, ARCHICAD, etc.). The new composite 3D models are used to enhance and extend existing 3D modeling programs. The new models can also be used to directly create new physical objects (e.g., windows, doors, etc.) that never existed before with robots, 3D printers and manufacturing machines.

Collaboration on New Composite and Native 3D Object Models

When new and/or existing composite 3D object models or native 3D object models (i.e., a 3D object automatically included and available in a 3D modeling program, etc.) are being used in an existing 3D modeling program (e.g., AUTODESK REVIT, AUTOCAD, SKETCHUP, VECTORWORKS, MICROSTATION, ARCHICAD, etc.), it is desirable to allow and track collaborations on the 3D (or higher dimensional (e.g., 4D, 5D, etc.) and/or lower dimensional (e.g., 1D, 2D, etc.)) and/or X-dimensional model objects and provide analytics associated with the collaboration.

For example, if several architects were working on the walls of a new building that is going to be constructed and a selected wall needs to be removed, a head architect and/or project manager may instruct a desired architect to remove the wall via an e-mail message, via a telephone call, via oral instructions, etc. However, such communication occurs outside of the existing 3D modeling program and needs to be received, acted upon and tracked outside the existing 3D modeling program.

In the present invention a project manager application 30'' is added to an existing 3D modeling program. The project manager application 30'' (e.g., cloud SaaS or non-cloud, etc.) provides collaboration between users. For example, if the head architect and/or project manager wanted to remove a selected wall, he/she would select the wall, which is a component of an existing native and/or extended 3D composite model, and provide appropriate instructions. Upon exiting, the selected wall would appear in the existing 3D modeling program in a different color and include an electronic communications link to the desired instructions (e.g., Jane, please remove the wall and re-configure the section drawing, etc.).

When the desired architect opens the existing 3D modeling program and the desired 3D object, the selected wall with the collaboration instructions would be visible. The desired architect could then click on the link and follow the instructions. The desired architect can add his/her own comments as well. The head architect and/or project manager is also able to make Jane aware of the instructions with an instant message, a text message, an e-mail, or a social media message (e.g., tweet, post, etc.) that includes an electronic link directly into the 3D object and the existing 3D modeling program. Upon completion of the task, the creator of the task is informed electronically that the task has been completed (and/or not yet completed) and analytics are collected and displayed.

FIGS. 19A and 19B are a flow diagram illustrating a Method 224 for collaborating on three-dimensional (3D) object models.

FIG. 20 is a block diagram 244 illustrating exemplary collaboration information for 3D object models in a 3D modeling program. FIG. 20 includes an exemplary 3D object model 246 for a wall with an electronic communications path portion 248 with a blinking red (not visible in static black and white drawings) dashed line 250 to give a visual indication that collaboration information 252 is available. FIG. 20 also indicates a social media message 257 (e.g., a TWITTER tweet, etc.) including an electronic communications path 250 as an electronic link 250' from the social media message 257 directly to collaborative electronic information 256 for a selected portion 214, 248 of a desired 3D object model 208, 246 in an existing 3D modeling program. When the electronic link 250, 250' is activated, the network device which activated the electronic link 250, 250' is connected directly into the existing 3D modeling program and directly to collaborative electronic information 256 for a selected portion 214, 248 of a desired 3D object model 208, 246.

In FIG. 19A at Step 226, a first collaboration message is received on a project management application on a three-dimensional (3D) modeling program executing on a server network device with one or more processors via a communications network from a target network device with one or more processors. The first collaboration message includes a list of sets of one or more X-dimensional (XD) object models for a selected project and a request to select a portion of a desired XD object model from the set of one or more XD object models. The project management application is being accessed by the target network device via the 3D modeling program. At Step 228, the project management application creates a collaboration object on the server network device. At Step 230, the project management application sends to the target network device via the communications network a second collaboration message including electronic instructions for the 3D modeling program to display a dialog box on the 3D modeling program being accessed from the target network device to accept collaboration instructions. At Step 232, a request message is received on the project management application from the target network device via the communication network with collaborative electronic information for the selected portion of the desired XD object model. In FIG. 19B, at Step 234, the project management application saves the collaborative electronic information from the request message in the created collaboration object.

At Step 236, the project management application creates an electronic communications path link to the created collaboration object. Executing the electronic communications path link from any target network device provides access to the collaborative electronic information on the target network device via the project management application executing in the 3D modeling program. At Step 237, the project management application associates the electronic communications path link with the created collaboration object for the selected portion of the desired XD object model for the 3D modeling program. At Step 238, the project management application changes one or more visual display characteristics of the electronic communications path link to visually indicate collaboration information is available for the selected portion of the desired XD object model for the 3D modeling program, thereby providing visual indications that collaboration information is available for 3D object models for the 3D modeling program. At Step 240, the project management application sends to one or more other target network devices each with one or more processors via the communications network a third collaboration message indicating collaboration information is available for one or more XD object models for the 3D modeling program, thereby providing collaboration for XD object models for the 3D modeling program and improving a utility and an efficiency of the 3D modeling program.
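
By way of a non-limiting illustration only, Steps 228-240 can be sketched as a small collaboration store in Python; the names (CollaborationObject, create_collaboration, display_style) and the "bim://" link scheme are hypothetical and are not part of any existing 3D modeling program.

    import uuid
    from dataclasses import dataclass, field

    @dataclass
    class CollaborationObject:                 # Step 228
        portion_id: str
        instructions: list = field(default_factory=list)

    COLLABORATIONS: dict = {}

    def create_collaboration(portion_id: str) -> str:
        # Steps 228 and 236: create the object and a path link to it
        link = f"bim://project/{portion_id}/{uuid.uuid4()}"
        COLLABORATIONS[link] = CollaborationObject(portion_id)
        return link

    def add_instruction(link: str, text: str) -> None:
        COLLABORATIONS[link].instructions.append(text)      # Steps 232-234

    def display_style(link: str) -> dict:
        # Step 238: change visual characteristics when collaboration exists
        has_info = bool(COLLABORATIONS[link].instructions)
        return {"color": "red" if has_info else "black",
                "line": "blinking dashed" if has_info else "solid"}

    link = create_collaboration("wall-248")
    add_instruction(link, "JANE, PLEASE REMOVE THE WALL AND RECONFIGURE "
                          "THE SECTION DRAWING--SALLY")
    print(display_style(link))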

Method 224 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment in FIG. 19A at Step 226, a first collaboration message is received on a project management application 30'' on a three-dimensional (3D) modeling program executing on a server network device 20 with one or more processors via a communications network 18 from a target network device 12 with one or more processors. The first collaboration message includes a list of sets of one or more X-dimensional (XD) object models (e.g., 208, 276, 286, etc.) for a selected project and a request to select a portion 210, 248, 288, 290, etc. of a desired XD object model from the set of one or more XD object models. The project management application 30'' is being accessed by the target network device 12 via the 3D modeling program.

In a preferred embodiment, the first collaboration message includes an instant message (IM), a short message service (SMS) (i.e., text message, etc.) message, an e-mail message, a social media message (e.g., post, tweet, etc.), wearable network device 106-112 message, UAV 27 message, UGV 29 message and/or other type of message. However, the present invention is not limited to these message types and other message types can be used to practice the invention.

In a preferred embodiment, the 3D modeling program includes a Building Information Modeling (BIM) modeling program. However, the present invention is not limited to this embodiment and other modeling programs can be used to practice the invention.

In a preferred embodiment, the BIM modeling program includes an AUTODESK REVIT 3D modeling program, an AUTOCAD 3D program, a VECTORWORKS 3D modeling program, a MICROSTATION 3D modeling program, an ARCHICAD 3D modeling program, and/or a SKETCHUP 3D modeling program.

In a preferred embodiment, the XD modeling object includes an AUTODESK REVIT 3D modeling object, an AUTODESK INVENTOR 3D modeling object, an AUTOCAD 3D modeling object, a VECTORWORKS 3D modeling object, a MICROSTATION 3D modeling object, an ARCHICAD 3D modeling object, a SOLIDWORKS 3D modeling object, a PROE 3D modeling object, and/or a SKETCHUP 3D modeling object.

In a preferred embodiment, the XD modeling object includes an existing native composite 3D modeling object, a new composite 3D modeling object (e.g., 208, etc.) and/or a new composite lower (e.g., 2D, etc.) and/or a higher dimension (e.g., 4D, etc.) modeling object.

However, the present invention is not limited to these preferred embodiments and other embodiments may be used to practice the invention.

For example, a supervisor/manager Sally may use a UAV 27 (FIG. 1) to fly around an actual job site for a new building and/or to complete a post-build punch list, and notice that a portion of a wall is incorrect. Sally may generate the first request message directly from the UAV 27 with application 30, 30'. The first request message may include digital photos, audio, video, etc. captured directly from the UAV 27.

As another example, a supervisor/manager Sally uses an unmanned ground vehicle (UGV) 29 (FIG. 1) to drive around an actual job site for a new building and/or to complete a post-build punch list, and notices that a portion of a wall is incorrect. Sally may generate the first request message directly from the UGV 29 with application 30, 30'. The first request message may include digital photos, audio, video, etc. captured directly from the UGV 29.

As another example, a supervisor/manager Sally uses a mobile network device 12, 14 (e.g., smart phone, tablet, etc.) and/or a wearable network device 106-112 (FIG. 7) as she walks around an actual job site for a new building and/or completes a post-build punch list, and notices that a portion of a wall is incorrect. Sally may generate the first request message directly from the mobile network device 12, 14 and/or wearable network device 106-112 with application 30, 30'. The first request message may include digital photos, audio, video, etc. captured directly from the wearable network device 106-112.

As another example, a supervisor/manager Sally uses a UGV 29' (FIG. 7) to drive around, with a detachable/attachable UAV 27' (FIG. 7) to fly around, an actual job site for a new building and/or to complete a post-build punch list, and notices that a portion of a wall is incorrect. Sally may generate the first request message directly from the UGV 29' and/or UAV 27' with application 30, 30'. The first request message may include digital photos, audio, video, etc. captured directly from the UGV 29' and/or UAV 27'.

As another example, a supervisor/manager Sally uses a mobile network device 12, 14 (e.g., smart phone, tablet, etc.) and/or a wearable network device 106-112 (FIG. 7), or controls a UAV 27, 27' and/or UGV 29, 29', and reads RFID tags 99 attached to actual building components as she walks around an actual job site for a new building and/or completes a post-build punch list and notices that a portion of a wall is incorrect.

However, the present invention is not limited to these preferred embodiments and other embodiments may be used to practice the invention.

For example, FIG. 20 illustrates selection of a portion 248 of 3D object model 246 for a wall in a new building being built. However, the present invention is not limited to this exemplary embodiment and other embodiments can be used to practice the invention.

At Step 228, the project management application 30'' creates a collaboration object on the server network device 20.

In a preferred embodiment the collaboration object includes a cloud storage object 82. In another embodiment, the collaboration object is a non-cloud storage object (e.g., for database 20' associated with server network device 20, etc.). However, the present invention is not limited to these embodiments and other embodiments can be used to practice the invention.

At Step 230, the project management application 30'' sends to the target network device 12 via the communications network 18 a second collaboration message including electronic instructions for the 3D modeling program to display a dialog box 254 on the 3D modeling program being accessed from the target network device 12 to accept collaboration instructions 256.

In a preferred embodiment, the second collaboration message includes an instant message (IM), a short message service (SMS) (i.e., text message, etc.) message, an e-mail message, a social media message (e.g., post, tweet, etc.), wearable network device 106-112 message, UAV 27 message, UGV 29 message and/or other type of message. However, the present invention is not limited to these message types and other message types can be used to practice the invention.

In one embodiment, the dialog box 254 (FIG. 20) is created and used in real-time (i.e., direct immediate interaction with 1-2 seconds or less response time, etc.) and allows text, audio, video, etc. information to be inserted and/or attached by the target device 12. The dialog box 254 includes an instant message (IM) dialog box, a short message service (SMS) (i.e., text message, etc.) dialog box, a social media dialog box (e.g., FACEBOOK, TWITTER, etc.) and/or a graphical dialog box.

For example, a manager or supervisor named Sally determines that an architect Jane is working in a 3D modeling program on the 3D object model 246 for the wall. Sally selects portion 248 and dialog box 254 pops up. Sally enters audio information 255, text information 256, etc. into the dialog box 254 "JANE, PLEASE REMOVE THE WALL AND RECONFIGURE THE SECTION DRAWING--SALLY." Jane is aware in real-time that Sally made this request for her to alter the 3D object model 246 for the wall.

As another example, Sally may select a window 210 from the 3D object model 208 of FIG. 17 and ask Jane to cut a wall section through this window 210, etc.

As another example, Sally may select a window 288 (FIG. 24) and ask Jane to remove this window and select a lobby floor 290 (FIG. 24) and ask Bob to add carpeting to the lobby floor.

However, the present invention is not limited to these embodiments and other embodiments can be used to practice the invention.

In another preferred embodiment, the dialog box 254 is not used in real-time. In such an embodiment, the dialog box 254 is used in a more static manner.

For example, Sally enters audio information 255, text information 256, etc. into the dialog box 254 "JANE, PLEASE REMOVE THE WALL AND RECONFIGURE THE SECTION DRAWING--SALLY" late in the evening. The next morning when Jane arrives at work and opens the 3D modeling program she observes (e.g., Steps 240, 242, etc.) that there is a message associated with portion 248 of the 3D object model 246 for the wall with the 3D modeling program and/or has received an instant message, text message, e-mail message, social media message, etc. from Sally with an electronic link directly to the portion 248 of the 3D object model 246 for the wall with the 3D modeling program. Jane selects the portion 248 directly from and inside of the 3D modeling program and/or from an electronic link in the instant message, text message, e-mail message, social media message, etc., and receives the instructions Sally entered into the dialog box 254 the night before.

However, the present invention is not limited to these embodiments and other embodiments can be used to practice the invention.

In FIG. 19A at Step 232, a request message is received on the project management application 30'' from the target network device 12 via the communication network 18 with collaborative electronic information 256 for the selected portion 214, 248 of the desired 3D object model 208, 246.

In a preferred embodiment, the request message includes an instant message (IM), a short message service (SMS) (i.e., text message, etc.) message, an e-mail message, a social media message (e.g., post, tweet, etc.), wearable network device 106-112 message, UAV 27 message, UGV 29 message and/or other type of message. However, the present invention is not limited to these message types and other message types can be used to practice the invention.

In a preferred embodiment, the received collaborative electronic information 256 includes electronic text 256, graphical information, digital photographs, audio information 255 or video information. However, the present invention is not limited to these information types and more, fewer and/or other types of collaborative information can be used to practice the invention.

In a preferred embodiment, the received collaborative electronic information 256 received at Step 234 includes an individual assignment, an open assignment and/or a group assignment.

FIG. 23 is a block diagram 272 illustrating exemplary collaboration 274 on an exemplary three-dimensional (3D) object model 276 for a window. The exemplary collaboration 274 includes an individual assignment 278, an open assignment 280 and/or a group assignment 282.

FIG. 24 is a block diagram illustrating another exemplary three-dimensional (3D) object model 286 for a building 288. The 3D building object model 286 includes old collaboration instructions 210 (FIG. 17) and two new sets of collaboration instructions for Jane 288 and Bob 290 for the 3D object model 286 including the exemplary building 288.

However, the present invention is not limited to these preferred embodiments and other embodiments may be used to practice the invention.

Returning to FIG. 19B at Step 234, the project management application 30'' saves the collaborative electronic information 256 from the request message in the created collaboration object (e.g., cloud storage object 82, etc.).

At Step 237, the project management application 30'' associates the electronic communications path link 250 with the created collaboration object for the selected portion 248 of the desired 3D object model 246 for the 3D modeling program. Executing the electronic communications path link 250 from any target network device 12, 14, 16 provides immediate access to the collaborative electronic information 256 on the target network device 12, 14, 16 via the project management application 30'' executing in the 3D modeling program.

In a preferred embodiment, the electronic communications path link includes an instant message (IM), short message service (SMS), social media (e.g., post, tweet, etc.), cellular telephone, data (e.g., HTTP, HTTPs, TCP/IP, UDP/IP, M2M, NFC, Bluetooth, etc.), audio and/or video electronic communications path link.

In another preferred embodiment, the project management application 30'' inserts the electronic communications path link 250' to the created collaboration object for the selected portion 248 of the desired 3D object model 246 for the 3D modeling program into an instant message (IM), short message service (SMS), social media message 257 (e.g., post, tweet, etc.), e-mail message, and/or data message, etc. that is sent to a desired architect.

FIG. 20 illustrates an exemplary social media message 257 (e.g., a TWITTER tweet, etc.) including an exemplary electronic communications path 250 as an electronic link 250' (e.g., with the text "JANE, DO THIS TODAY" defining the electronic link 250') from the social media message 257 directly to collaborative electronic information 256 for the selected portion 214, 248 of the desired 3D object model 208, 246 in the existing 3D modeling program. When the electronic link 250' is activated from the social media message 257, the network device 12, 14, 16 which activated the electronic link 250' is connected directly into the existing 3D modeling program and directly to the collaborative electronic information 256 (e.g., JANE, PLEASE REMOVE THIS WALL AND RECONFIGURE THE SECTION DRAWING, etc.) for the selected portion 214, 248 of the desired 3D object model 208, 246.
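
By way of a non-limiting illustration only, the electronic link 250' carried inside a social media message can be sketched as a deep link that is later resolved back to a project, model and portion; the "bim://" scheme and all names below are hypothetical and are not part of any existing social media service or 3D modeling program.

    from urllib.parse import parse_qs, urlencode, urlparse

    def make_deep_link(project: str, model_id: str, portion_id: str) -> str:
        # Activating the link would open the 3D modeling program directly at
        # the selected portion of the desired object model.
        return f"bim://{project}/open?" + urlencode(
            {"model": model_id, "portion": portion_id})

    def resolve_deep_link(link: str) -> dict:
        parsed = urlparse(link)
        params = parse_qs(parsed.query)
        return {"project": parsed.netloc,
                "model": params["model"][0],
                "portion": params["portion"][0]}

    tweet = "JANE, DO THIS TODAY " + make_deep_link("new-building", "246", "248")
    print(resolve_deep_link(tweet.split()[-1]))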

However, the present invention is not limited to these preferred embodiments and other embodiments may be used to practice the invention.

Returning to FIG. 19B at Step 238, the project management application 30'' changes one or more visual display characteristics 252 (e.g., color, font, type of line from solid to dashed, from non-blinking to blinking, a new shape view, etc.) of the electronic communications path link 250 to visually indicate collaboration information 252 is available for the selected portion 248 of the desired 3D object model 246 for the 3D modeling program, thereby providing visual indications directly within the 3D modeling program that collaboration information is available for 3D object models for the 3D modeling program.

For example, in FIG. 20 the color of the portion 248 of the 3D object model 246 was changed to red and the line was changed from a solid line to a blinking dashed line, etc. (the blinking and red color are not directly visible in black and white drawings). However, the present invention is not limited to this embodiment and other embodiments may be used to practice the invention.

At Step 240, the project management application 30'' sends to one or more other target network devices 14, 16, 20, 22, 24, 26 each with one or more processors via the communications network 18 a third collaboration message (e.g., instant message, text message, e-mail message, social media message, etc.) indicating collaboration information 252 is available for one or more 3D object models 246 for the 3D modeling program, thereby providing collaboration for XD object models for the 3D modeling program and improving a utility and an efficiency of the 3D modeling program to complete actual tasks.

In a preferred embodiment, the third collaboration message includes an instant message (IM), a short message service (SMS) (i.e., text message, etc.) message, an e-mail message, a social media message (e.g., post, tweet, etc.), wearable network device 106-112 message, UAV 27 message, UGV 29 message and/or other type of message. However, the present invention is not limited to these message types and other message types can be used to practice the invention.

In a preferred embodiment, after Method 224 is complete, the project management application 30'' automatically creates a new set of architectural drawings 43, shop drawings 43', or manufacturing drawings 43'' including the collaboration information.

FIG. 21 is a flow diagram illustrating a Method 258 for collaborating on X-dimensional (XD) object models. At Step 260, a fourth collaboration message is received on the project management application from a second target network device with one or more processors via the communication network with new collaborative electronic information for the selected portion of the desired 3D object model. At Step 262, the project management application stores the new collaborative electronic information. At Step 264, the project management application creates plural collaboration analytics with the new collaborative electronic information. At Step 266, the project management application sends a fifth collaboration message to the target network device via the communications network including the plural collaboration analytics information for displaying on the target network device, thereby allowing collaboration for the selected portion of the desired 3D object model to be tracked and analyzed.

Method 258 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment, at Step 260 a fourth collaboration message is received on the project management application 30'' from a second target network device 14, 16, etc. with one or more processors via the communication network 18 with new collaborative electronic information for the selected portion 214, 248 of the desired 3D object model 208, 212.

At Step 262, the project management application 30'' stores the new collaborative electronic information.

In a preferred embodiment, the new collaborative electronic information is stored in a new cloud storage object 82'. In another embodiment, the new collaborative electronic information is stored in the same cloud storage object as the original collaborative electronic information. In another embodiment, the new collaborative electronic information is stored in a non-cloud storage object (e.g., for database 20' associated with server network device 20, etc.).

At Step 264, the project management application 30'' creates plural collaboration analytics with the new collaborative electronic information.

In a preferred embodiment, the plural collaboration analytics 270, 282 are created in various types of hierarchies (e.g., priority tree, building component, project, team, etc.) and can be selected at different levels in the hierarchies, and also can be selected to isolate a desired 3D object model 208, 246, an architect, a floor, a room, a wall 246, a department, an individual 276, open 278 or group 280 project, etc. without revealing any of the other created plural collaboration analytics. These features allow multiple teams, companies, organizations, architectural firms, construction companies, etc. to use the claimed invention without interference and without providing access to proprietary project information and/or proprietary financial and/or cost and/or productivity information.
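
By way of a non-limiting illustration only, hierarchy-scoped collaboration analytics can be sketched as tagged event records filtered at one level of the hierarchy; the field names and sample events below are hypothetical.

    from collections import defaultdict

    # Each analytic record is tagged with its place in a hierarchy so one firm,
    # project or component can be isolated without exposing any other slice.
    EVENTS = [
        {"firm": "A", "project": "tower", "component": "wall", "done": True},
        {"firm": "A", "project": "tower", "component": "window", "done": False},
        {"firm": "B", "project": "plaza", "component": "floor", "done": True},
    ]

    def analytics(level: str, value: str) -> dict:
        summary = defaultdict(int)
        for event in EVENTS:
            if event[level] == value:          # isolate one node of the hierarchy
                summary["tasks"] += 1
                summary["completed"] += int(event["done"])
        return dict(summary)

    print(analytics("firm", "A"))              # only firm A's collaboration analytics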

In another preferred embodiment, all other created plural collaboration analytics can be revealed and accessed.

However, the present invention is not limited to these exemplary embodiments, and other embodiments can also be used to practice the invention.

At Step 266, the project management application 30'' sends a fifth collaboration message to the original target network device 12 and/or other target network devices 14, 16 and/or server network devices 22, 24, 26 via the communications network 18 including the plural collaboration analytics information 270, 282 for displaying on the original target network device 12, thereby allowing collaboration for the selected portion 214, 248 of the desired 3D object model 208, 246 to be tracked and analyzed with analytics collected.

Table 6 illustrates other exemplary analytics information collected and displayed with Method 258. However, the present invention is not limited to these exemplary analytics and other types of analytics can be used to practice the invention.

TABLE 6

Group Statistics
10 walls were created today
2 windows were placed today
3 floors were created today
4 drawings were approved today

Individual Statistics
4 drawings were accessed
3 drawings were completed
2 drawings were approved
5 walls were altered today
18 elements were modified

However, the present invention is not limited to this exemplary embodiment and other embodiments may be used to practice the invention.

In one preferred embodiment, after Method 258 is complete, the project management application 30'' automatically creates a new set of architectural drawings 43, shop drawings 43', or manufacturing drawings 43'' including the new collaboration information.

In another preferred embodiment, after Method 258 is complete, the project management application 30'' automatically sends a sixth collaboration message including the new set of architectural drawings 43, shop drawings 43', or manufacturing drawings 43'' including the new collaboration information via the communications network 18 to a manufacturing server network device (e.g., 26, etc.) with one or more processors to automatically manufacture with a machine 35 and/or a 3D printer 39 and/or a robot 41 an actual physical 3D object 208, 246 from the 3D object model 208, 246 on which collaboration has occurred.

In a preferred embodiment, the third, fourth, fifth and sixth collaboration messages include an instant message (IM), a short message service (SMS) (i.e., text message, etc.) message, an e-mail message, a social media message (e.g., post, tweet, etc.), wearable network device 106-112 message, UAV 27 message, UGV 29 message and/or other type of message. However, the present invention is not limited to these message types and other message types can be used to practice the invention.

However, the present invention is not limited to this exemplary embodiment and other embodiments may be used to practice the invention.

FIG. 22 is a block diagram 268 illustrating exemplary collaboration analytics 270 created for three-dimensional (3D) object models 208, 246. The exemplary collaboration analytics include, but are not limited to, a list of the collaborative tasks, their completion status (e.g., completed, not completed, etc.), a cumulative time spent completing the task, a cost, etc. However, the present invention is not limited to this exemplary analytic information and more, fewer and other types of analytics may be used to practice the invention.

The method and system described herein provide creating composite three dimensional (3D) models for building information modeling (BIM) with collaboration and analytics. The method and system allow real-time and static collaboration on native and new composite XD object models from existing 3D BIM programs (e.g., AUTODESK REVIT, AUTOCAD, etc.). Collaboration analytics are collected and displayed.

The methods and system described herein are primarily described for 3D object models. However, the present invention is not limited to 3D object models and lower dimensional (e.g., 1D or 2D, etc.) or higher dimensional (e.g., 4D, 5D, etc.) XD object models can be used to practice the invention.

Automated Building Information Modeling (BIM) Collaboration

FIGS. 25A and 25B are a flow diagram illustrating a Method 294 for automated building information modeling (BIM) collaboration.

FIG. 26 is a block diagram 312 illustrating exemplary collaboration information for 3D object models in a 3D modeling program.

FIG. 27 is a block diagram 326 illustrating a path 330 of a mobile target device 12 through an XD object model 328 in a 3D modeling program. In FIG. 27 the XD object model 328 includes an object model for a floor in a building.

In a preferred embodiment, an actual mobile target device 12, 14, 16, 106-110, 27, 29 is used to follow the path 330 through an actual physical location obtaining and sending actual physical collaborative information back to the project management application 30'' on the server network device 20. The actual physical collaborative information includes GPS and/or location coordinates.

In one preferred embodiment, the actual mobile target device 12, 14, 16, 106-110, 27, 29 includes an actual camera 331 or is a network device with a camera component (e.g., 12, etc.) and collects actual electronic collaborative information 254' with a camera point-of-view viewing cone 333, including GPS 322' and/or other location coordinates, at the actual physical locations. This helps ensure a person does not have to guess or estimate the precise physical location 322' and/or the precise point-of-view captured in the electronic collaborative information 254'.

In such an embodiment, another person can return to the physical location (e.g., 322', etc.) and precisely find the exact physical location at which the camera 331 was placed to re-view the exact same camera point-of-view cone 333 using the GPS 322' and/or other location coordinates.
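By way of illustration only, the viewing-cone information described above could be stored as an ordinary geotagged record; the Python sketch below uses invented field names (latitude, heading, field of view, etc.) and an invented media link, and is not a disclosed data format.

```python
from dataclasses import dataclass

@dataclass
class ViewingConeRecord:
    """Hypothetical capture metadata for re-viewing a camera point-of-view cone such as 333."""
    latitude: float       # GPS latitude of the camera (decimal degrees)
    longitude: float      # GPS longitude of the camera
    elevation_m: float    # elevation above datum, in meters
    heading_deg: float    # compass bearing the camera was pointed (0-360)
    pitch_deg: float      # tilt above/below horizontal
    fov_deg: float        # horizontal field of view defining the cone
    media_uri: str        # link to the captured picture or video

capture = ViewingConeRecord(41.0354, -88.2826, 231.0, 95.0, -5.0, 68.0,
                            "https://example.invalid/collab/site-photo.jpg")
print(capture)
```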

In another preferred embodiment, as is illustrated in FIG. 27, a virtual mobile target device 12' 335 is used to follow the path 330 through a virtual place, obtaining and sending virtual collaborative information that can be sent from the project management application 30'' on the server network device 20 back to an actual mobile target device 12, 14, 16, 106-110, 27, 29 to be used at a physical project site. In such an embodiment, a person using an actual mobile target device 12, 14, 16, 106-110, 27, 29 can compare the virtual collaborative information to what he or she actually sees at the project site.

In yet another preferred embodiment, as is illustrated in FIG. 27, a virtual camera 335 is used and collects virtual collaborative information with a virtual camera point-of-view viewing cone 337 including GPS 322'' and/or other location coordinates in the 3D modeling program.

In such an embodiment, a person can visit the physical location (e.g., 322'', etc.) and precisely find the exact physical location at which the virtual camera 335 was placed to re-view the exact same camera point-of-view cone 337 for the virtual collaborative information used in the 3D modeling program using the GPS 322'' and/or other location coordinates.

This helps ensure the person does not have to guess or estimate the precise physical location 322'' and/or precise point-of-view captured in virtual collaborative information from within the 3D modeling program. The person can then compare the virtual collaborative information to what he or she actually sees at the project site.

However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

Returning to FIG. 25A at Step 296, a first message is received on a project management application on a three-dimensional (3D) modeling program executing on a server network device with one or more processors via a communications network from a mobile target network device with one or more processors. The first message includes electronic collaborative information for one or more actual physical components at a physical location associated with a set of one or more virtual components in one or more virtual X-dimensional (XD) object models for a selected project in the 3D modeling program collected from a camera, bar code reader, radio frequency identifier (RFID) reader, location identifier component or other sensor component on the mobile target network device. At Step 298, the project management application creates a collaboration object on the server network device. At Step 300, the project management application stores the received electronic collaboration information from the first message in the collaboration object. At Step 302, the project management application creates an electronic communications path link to the created collaboration object. At Step 304, an electronic communications path link is inserted into the created collaboration object from the project management application for the set of one or more virtual components in one or more virtual X-dimensional (XD) object models for the selected project in the 3D modeling program. In FIG. 25B at Step 306, one or more visual display characteristics of the electronic communications path link are changed from the project management application to visually indicate collaboration information is available for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more virtual X-dimensional (XD) object models for a selected project in the 3D modeling program. At Step 308, the project management application sends to the mobile target network device via the communications network a second message indicating collaboration information is available for the one or more XD object models on the 3D modeling program, thereby providing collaboration information for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more virtual X-dimensional (XD) object models for a selected project in the 3D modeling program. At Step 310, the project management application sends to one or more other network devices each with one or more processors via the communications network a third message indicating collaboration information is available for the one or more actual physical components at the physical location associated with the set of one or more virtual components in the one or more virtual X-dimensional (XD) object models for a selected project in the 3D modeling program.
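By way of illustration only, Steps 296-310 can be pictured as a single server-side handler; the Python sketch below uses invented names, an in-memory store and a placeholder notify function in place of the actual project management application 30'' and communications network 18.

```python
import json
import uuid

def handle_first_message(first_message: dict, storage: dict, notify) -> str:
    """Minimal sketch of Steps 296-310 of Method 294 (all names invented).

    first_message: decoded collaboration message from the mobile target device
    storage: stand-in for the collaboration store (Steps 298/300)
    notify: callable standing in for sending the second and third messages (Steps 308/310)
    """
    # Step 296: receive electronic collaborative information
    collab_info = first_message["collaborative_information"]
    virtual_components = first_message["virtual_component_ids"]

    # Step 298: create a collaboration object
    collab_id = str(uuid.uuid4())
    storage[collab_id] = {"info": collab_info, "components": virtual_components}

    # Step 302: create an electronic communications path link to the object
    path_link = f"https://example.invalid/collaboration/{collab_id}"

    # Step 304: insert the path link into the collaboration object for the
    # set of virtual components of the selected project
    storage[collab_id]["path_link"] = path_link

    # Step 306: change visual display characteristics so the 3D modeling
    # program can flag that collaboration information is available
    storage[collab_id]["display"] = {"color": "red", "line_style": "dashed", "blink": True}

    # Steps 308/310: notify the mobile target device and other network devices
    notify(first_message["sender"], f"Collaboration available: {path_link}")
    for device in first_message.get("other_devices", []):
        notify(device, f"Collaboration available: {path_link}")
    return collab_id

# Usage with in-memory stand-ins
store = {}
msg = {"sender": "device-12", "other_devices": ["device-14", "device-16"],
       "virtual_component_ids": ["314", "318", "320"],
       "collaborative_information": {"gps": [41.0354, -88.2826], "note": "stud cracked"}}
handle_first_message(msg, store, notify=lambda dev, text: print(dev, "<-", text))
print(json.dumps(store, indent=2))
```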

Method 294 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment, in FIG. 25A at Step 296, a first message is received on a project management application 30'' on a three-dimensional (3D) modeling program executing on a server network device (e.g., 20, etc.) with one or more processors via a communications network from a mobile target network device 12, 14, 16, 106-110, 27, 29 with one or more processors.

The first message includes electronic collaborative information 254' for one or more actual physical components (e.g., 314', 318', 320', etc.) at a physical location (e.g., 322, etc.) associated with a set of one or more virtual components 314, 316, 318, 320 in one or more virtual X-dimensional (XD) object models (e.g., 246, etc.) for a selected project in the 3D modeling program collected from a camera, bar code (e.g., QR 98, etc.) reader, radio frequency identifier (RFID) 99 reader, location identifier component (e.g., GPS 322, etc.) or other sensor component on the mobile target network device 12, 14, 16, 106-110, 27, 29.

In a preferred embodiment, the electronic collaborative information 254' for one or more actual physical components (e.g., 314', 318', 320', etc.) at a physical location (e.g., 322, etc.) associated with a set of one or more virtual components 314, 316, 318, 320 in one or more virtual X-dimensional (XD) object models (e.g., 246, etc.) for a selected project in the 3D modeling program is collected from a camera (e.g., 331, FIG. 27, etc.) and/or other network device with a camera component and includes electronic collaborative information with an actual camera point-of-view viewing cone 333 (FIG. 27) including GPS 322' and/or other location coordinates.

In such an embodiment, another person can return to the physical location (e.g., 322, etc.) and precisely find the exact physical location at which the camera was placed to re-view the exact same camera point-of-view cone 333 using the GPS 322 and/or other location coordinates.

In a preferred embodiment, the 3D modeling program includes a Building Information Modeling (BIM) modeling program.

In a preferred embodiment, the BIM modeling program includes an AUTODESK REVIT 3D modeling program, an AUTOCAD 3D modeling program, a VECTORWORKS 3D modeling program, a MICROSTATION 3D modeling program, an ARCHICAD 3D modeling program, and/or a SKETCHUP 3D modeling program.

In a preferred embodiment, the XD modeling object includes an AUTODESK REVIT 3D modeling object, an AUTODESK INVENTOR 3D modeling object, an AUTOCAD 3D modeling object, a VECTORWORKS 3D modeling object, a MICROSTATION 3D modeling object, an ARCHICAD 3D modeling object, a SOLIDWORKS 3D modeling object, a PROE 3D modeling object, and/or a SKETCHUP 3D modeling object.

In a preferred embodiment, the XD modeling object includes an existing native composite 3D modeling object, a new composite 3D modeling object (e.g., 208, etc.) and/or a new composite lower (e.g., 2D, etc.) and/or a higher dimension (e.g., 4D, etc.) modeling object.

In one preferred embodiment, one or more XD object models from the set of one or more virtual components 252 for the selected project in the 3D modeling program have not yet been created. In such an embodiment, the one or more actual physical components at the physical location have been added to the physical location but do not yet exist in the 3D modeling program and must now be added as virtual components to keep the selected project synchronized in the 3D modeling program.

For example, an architect may have gone to a construction site and asked a carpenter to add a selected type of window to a selected wall while she is at the project site. An XD object model then needs to be created for the selected type of window, and the newly created XD window object model must be added at the proper location in the XD object model for the virtual wall in which the new virtual window is located in the 3D modeling program. The architect may use a target network device 12, 14, 16, 106-110, 27, 29 to take a picture and/or video of the selected type of window at the project site and/or add an audio message and send the collaborative information back to the project management application 30'' in the 3D modeling program executing on the server network device. Another architect may then click on the collaborative information after execution of the methods herein, view the picture of the window, create an XD object model for the window, and add the newly created XD object model to an appropriate existing XD object model for a wall within the 3D modeling program.

Thus, the present invention is used to associate existing XD object models for virtual objects that have already been created in the 3D modeling program with actual physical objects as they are added to a project site for the selected project, to keep the selected project synchronized. It is also used to associate new physical objects that are added to a project site with new XD object models created for new virtual objects that did not yet exist in the 3D modeling program, likewise to keep the selected project synchronized. Therefore, the present invention provides two-way, multimedia collaboration within 3D modeling programs.

However, the present invention is not limited to these preferred embodiments and other embodiments may be used to practice the invention.

Returning to FIG. 25A at Step 296, for example, as is illustrated in FIG. 26, plural actual 2×4 wall studs (314', 318', 320') are one or more actual physical components for an actual wall 248'. These actual physical components are associated with a set of one or more virtual components 314, 316, 318, 320 in one or more virtual X-dimensional (XD) object models (e.g., 3D object model 246 for wall 248, etc.) for a selected project in the 3D modeling program (e.g., AUTODESK REVIT, etc.).

In a preferred embodiment, mobile target network device 12, 14, 16, 106-110, 27, 29 provides the electronic collaborative information 254', for example, by reading RFID 99 tags and/or QR bar code tags 98 on the plural actual 2×4 wall studs (314', 318', 320') (only one each of the RFID 99 and QR bar code 98 tags is illustrated for simplicity).

In another preferred embodiment, mobile target network device 12, 14, 16, 106-110, 27, 29 provides the electronic collaborative information 254' with a camera component collecting digital pictures or video information.

A GPS component on the mobile target network device 12, 14, 16, 106-110, 27, 29 can also provide precise location information 322 for the 2×4 wall studs (e.g., 318', etc.) (only one GPS coordinate with elevation is illustrated for simplicity).

The precise location information 322 for the 2×4 wall studs can also be used to indicate that a 2×4 wall stud may be in a wrong location (e.g., occupying a space where a door or window should be, etc.), at a wrong distance from another 2×4 stud, etc.

In a preferred embodiment, the mobile target network device 12, 14, 16, 106-110, 27, 29 provides electronic text, audio, digital pictures, video, etc. as electronic collaborative information 254', including, for example, a notation that a 2×4 wall stud 306' is missing and has not yet been added to the actual wall associated with 3D object model 246 for the wall 248, that the 2×4 wall studs at the actual location should instead have been 2×6 wall studs since the wall studs in the 3D object model 246 for the wall were 2×6 wall studs instead of 2×4 wall studs, and/or that actual 2×4 wall stud 314' has a big crack 324 in it, is severely damaged and needs to be replaced, etc. The electronic collaborative information is also used to verify there is a one-to-one correlation between all actual physical objects and all virtual objects in the 3D modeling program.
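By way of illustration only, the one-to-one correlation check and the wrong-location check described above can be pictured as a simple reconciliation pass; the Python sketch below uses invented stud identifiers, offsets and a tolerance, not measurements from the disclosure.

```python
# Hypothetical reconciliation of scanned physical studs against the virtual wall model.
virtual_studs = {           # virtual component id -> designed offset along the wall, in inches
    "314": 0.0, "316": 16.0, "318": 32.0, "320": 48.0,
}
scanned_studs = {           # RFID/QR tag id -> offset measured from GPS/location data
    "314": 0.2, "318": 31.6, "320": 47.8,   # one stud was never scanned on site
}

TOLERANCE_IN = 0.5          # invented placement tolerance

missing = sorted(set(virtual_studs) - set(scanned_studs))
unexpected = sorted(set(scanned_studs) - set(virtual_studs))
misplaced = [sid for sid in scanned_studs
             if sid in virtual_studs
             and abs(scanned_studs[sid] - virtual_studs[sid]) > TOLERANCE_IN]

print("Missing from site:", missing)        # studs not yet installed
print("Not in model:", unexpected)          # physical objects needing new XD object models
print("Out of position:", misplaced)        # candidates for a collaboration note
```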

In a preferred embodiment, the UAV 27, GAV 29 and/or combination UAV 27'/GAV 29' fly and/or drive around the actual location and generate the electronic collaborative information 252'. The electronic collaborative information 252' is collected from a camera, bar code (e.g., QR Code 98, etc.) reader, radio frequency identifier (RFID) 99 reader, location identifier component (e.g., GPS 322, etc.) or other sensor component on the UAV 27, GAV 29 and/or combination UAV 27'/GAV 29'. In such an embodiment, an operator of the UAV 27, GAV 29 and/or combination UAV 27'/GAV 29' can be remote from the project site and safely operate the vehicle through the project site without being at the actual project site.

However, the present invention is not limited to these preferred embodiments and other embodiments may be used to practice the invention.

At Step 298, the project management application 30'' creates a collaboration object 82 on the server network device 20.

In a preferred embodiment, the collaboration object includes a cloud storage object 82. In another embodiment, the collaboration object is a non-cloud storage object (e.g., for database 20' associated with server network device 20, etc.). However, the present invention is not limited to these embodiments and other embodiments can be used to practice the invention.

At Step 300, the project management application 30'' stores the received electronic collaboration information 254, 254' from the first message in the collaboration object 82.

At Step 302, the project management application 30'' creates an electronic communications path link 250 to the created collaboration object 82.

In a preferred embodiment, the electronic communications path link includes an instant message (IM), short message service (SMS), social media (e.g., post, tweet, etc.), cellular telephone, data (e.g., HTTP, HTTPs, TCP/IP, UDP/IP, M2M, NFC, Bluetooth, etc.), audio and/or video electronic communications path link. However, the present invention is not limited to these embodiments and other embodiments can be used to practice the invention.

At Step 304, an electronic communications path link 250' is inserted into the created collaboration object 82 from the project management application 30'' for the set of one or more virtual components 252 in one or more virtual X-dimensional (XD) object models 246 for the selected project in the 3D modeling program.

In a preferred embodiment, the project management application 30'' inserts the electronic communications path link 250' to the created collaboration object for the selected portion 248 (e.g., wall portion 248, etc.) of the desired 3D object model 246 for the 3D modeling program into an instant message (IM), short message service (SMS), social media message 257 (e.g., post, tweet, etc.), e-mail message, and/or data message, etc. that is sent to a desired architect. However, the present invention is not limited to these embodiments and other embodiments can be used to practice the invention.
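By way of illustration only, inserting the path link 250' into an outgoing message can be as simple as embedding it in the message body; the trivial Python sketch below uses an invented formatting helper and invented addresses.

```python
def format_collaboration_message(architect: str, path_link: str, portion: str) -> str:
    """Hypothetical helper that embeds a collaboration path link into a message body."""
    return (f"To: {architect}\n"
            f"Collaboration information is available for {portion}.\n"
            f"Open the linked collaboration object here: {path_link}")

print(format_collaboration_message(
    "architect@example.invalid",
    "https://example.invalid/collaboration/248",
    "wall portion 248 of 3D object model 246"))
```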

In FIG. 25B at Step 306, one or more visual display characteristics 252, 252' of the electronic communications path link are changed from the project management application 30'' to visually indicate collaboration information 254, 254' is available for the one or more actual physical components 314', 318', 320' at the physical location associated with the set of one or more virtual components 314, 316, 318, 320 in the one or more virtual X-dimensional (XD) object models 246 for a selected project in the 3D modeling program.

For example, in FIGS. 20 and 26 the color of the portion 248 of the 3D object model 246 was changed to red and the line was changed from a solid line to a blinking dashed line, etc. (the blinking and red color are not directly visible in black and white drawings). However, the present invention is not limited to this embodiment and other embodiments may be used to practice the invention.
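By way of illustration only, Step 306 amounts to overriding display attributes on the linked element; the generic Python sketch below is not tied to any particular BIM program's API and uses invented attribute names.

```python
def mark_collaboration_available(element_display: dict) -> dict:
    """Flag a model element's link as having collaboration information (Step 306).

    Generic sketch only; real BIM programs expose their own display-override APIs.
    """
    element_display.update({
        "color": "red",            # solid color change, as in FIGS. 20 and 26
        "line_pattern": "dashed",
        "blinking": True,
    })
    return element_display

wall_portion_248 = {"color": "black", "line_pattern": "solid", "blinking": False}
print(mark_collaboration_available(wall_portion_248))
```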

At Step 308, the project management application 30'' sends to the mobile target network device 12 via the communications network 18 a second message indicating collaboration information 254, 254' is available for the one or more XD object models 246 on the 3D modeling program, thereby providing collaboration information 254, 254' for the one or more actual physical components 314', 318', 320' at the physical location 322 associated with the set of one or more virtual components 314, 316, 318, 320 in the one or more virtual X-dimensional (XD) object models 246 for a selected project in the 3D modeling program.

In a preferred embodiment, the second message includes an instant message (IM), short message service (SMS) message, social media message, e-mail message, data message, audio message and/or video message. However, the present invention is not limited to these embodiments and other embodiments can be used to practice the invention.

At Step 310, the project management application 30'' sends to one or more other network devices 14, 16, 22, 24, 26 each with one or more processors via the communications network 18 a third message indicating collaboration information 254, 254' is available for the one or more actual physical components 314', 318', 320' at the physical location 322 associated with the set of one or more virtual components 314, 316, 318, 320 in the one or more virtual X-dimensional (XD) object models 246 for a selected project in the 3D modeling program.

In a preferred embodiment, the third message includes an instant message (IM), short message service (SMS) message, social media message, e-mail message, data message, audio message and/or video message. However, the present invention is not limited to these embodiments and other embodiments can be used to practice the invention.

In one preferred embodiment, after Method 294 is complete, the project management application 30'' automatically creates a new set of architectural drawings 43, shop drawings 43', or manufacturing drawings 43'' including the new collaboration information 254.

In one preferred embodiment, Method 294 is also used to collect exemplary collaboration analytics 270 created for three-dimensional (3D) object models 208, 246 (FIG. 22). The exemplary collaboration analytics include, but are not limited to, a list of collaborative tasks, each task's completion status (e.g., completed, not completed, etc.), a cumulative time spent completing the task, a cost, etc. However, the present invention is not limited to this exemplary analytic information and more, fewer and other types of analytics may be used to practice the invention.

FIG. 28 is a flow diagram illustrating a Method 332 for automated building information modeling (BIM) collaboration. At Step 334, a new X-Dimensional (XD) Building Information Modeling (BIM) object model is created for the 3D modeling program that did not previously exist in the 3D modeling program for one or more actual physical components at the physical location obtained from the first message received at Step 296 (FIG. 25A). At Step 336, the created new X-Dimensional BIM object model for the 3D modeling program is associated with one or more actual physical components and the electronic collaboration information from the received first message.

Method 332 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment, at Step 334, a new X-Dimensional (XD) Building Information Modeling (BIM) object model is created for the 3D modeling program that did not previously exist (e.g., 210, 290, etc.) in the 3D modeling program for one or more actual physical components (e.g., 210, FIG. 24) at the physical location (e.g., 322, FIG. 26) obtained from the first message received at Step 296 (FIG. 25A).

At Step 336, the created new X-Dimensional BIM object model for the 3D modeling program is associated with one or more actual physical components and the electronic collaboration information from the received first message.
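By way of illustration only, Steps 334 and 336 can be pictured as creating a placeholder model object and attaching the collaboration information to it; the Python sketch below uses invented class and field names, not an actual BIM object format.

```python
from dataclasses import dataclass, field

@dataclass
class XDObjectModel:
    """Hypothetical stand-in for a new XD BIM object model created at Step 334."""
    name: str
    dimensions: int = 3
    collaboration: dict = field(default_factory=dict)

def create_and_associate(first_message: dict) -> XDObjectModel:
    # Step 334: create a new XD object model for a physical component that did
    # not previously exist in the 3D modeling program (e.g., a newly added window).
    model = XDObjectModel(name=first_message["component_description"])
    # Step 336: associate it with the actual physical component and the
    # electronic collaboration information from the received first message.
    model.collaboration = {
        "physical_location": first_message["gps"],
        "media": first_message["media_uri"],
        "note": first_message.get("note", ""),
    }
    return model

new_window = create_and_associate({
    "component_description": "double-hung window added to wall 248",
    "gps": [41.0354, -88.2826],
    "media_uri": "https://example.invalid/site-photos/window.jpg",
    "note": "added per architect's on-site request",
})
print(new_window)
```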

FIG. 29 is a flow diagram illustrating a Method 338 for automated building information modeling (BIM) collaboration. At Step 340, plural information is collected from the camera, bar code reader, radio frequency identifier (RFID) reader, location identifier component or other sensor component on the mobile target network device about plural actual physical components at a physical location for a desired project. The actual physical components correspond to plural virtual components for one or more X-dimensional (XD) object models that already exist in the three-dimensional (3D) modeling program or correspond to new virtual components that do not yet exist in the 3D modeling program for which new XD object models require creation in the 3D modeling program. At Step 342, the collected plural information is sent in real-time from the mobile network device via the communications network to the project management application executing from the 3D modeling program on the server network device.

Method 338 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment, at Step 340, plural information is collected from the camera, bar code 98 reader, radio frequency identifier (RFID) 99 reader, location identifier component or other sensor component on the mobile target network device 12, 14, 16, 106-110, 27, 29 about plural actual physical components at a physical location 322 for a desired project. The actual physical components (e.g., 314', 318', 320', etc.) correspond to plural virtual components (e.g., 314, 318, 320, etc.) for one or more X-dimensional (XD) object models that already exist in the three-dimensional (3D) modeling program or correspond to new virtual components (e.g., 210, 290, etc.) that do not yet exist in the 3D modeling program for which new XD object models require creation in the 3D modeling program.

At Step 342, the collected plural information is sent in real-time from the mobile network device 12, 14, 16, 106-110, 27, 29 via the communications network 18 to the project management application 30'' executing from the 3D modeling program on the server network device 20.
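By way of illustration only, Steps 340 and 342 can be pictured as a collection loop on the mobile device; the Python sketch below stubs out the sensor reads and the network send with invented functions and sample values.

```python
import json
import time

def read_sensors() -> dict:
    """Stub for the camera/bar code/RFID/GPS components on the mobile device.

    A real device would return decoded tag identifiers, image references and
    GPS fixes; the values here are invented for illustration.
    """
    return {"rfid_tags": ["314", "318", "320"],
            "gps": [41.0354, -88.2826, 231.0],
            "timestamp": time.time()}

def send_to_project_management(payload: dict) -> None:
    """Stand-in for the real-time send over the communications network (Step 342)."""
    print("POST /collaboration", json.dumps(payload))

# Step 340: collect plural information about actual physical components,
# then Step 342: send it in real time to the project management application.
for _ in range(2):                 # two sample collection cycles
    send_to_project_management(read_sensors())
    time.sleep(0.1)
```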

The methods and system described herein are primarily described for 3D object models. However, the present invention is not limited to 3D object models and lower dimensional (e.g., 1D or 2D, etc.) or higher dimensional (e.g., 4D, 5D, etc.) XD object models can be used to practice the invention.

FIG. 30 is a flow diagram illustrating a Method 344 for automated building information modeling (BIM) collaboration. At Step 346, actual collaborative information is collected from an actual network device with a camera component, including an actual camera point-of-view viewing cone and actual physical location information, at an actual project site, or virtual collaborative information is collected from a virtual network device, including a virtual camera point-of-view viewing cone and actual physical location information for the actual project site, in a 3D modeling program. At Step 348, the collected actual collaborative information or collected virtual collaborative information is sent in real-time to the project management application executing from the 3D modeling program on the server network device, from the actual network device via the communications network or from the virtual network device.

Method 344 is illustrated with an exemplary embodiment. However, the present invention is not limited to the exemplary embodiment, and other embodiments can also be used to practice the invention.

In such an exemplary embodiment, at Step 346, actual collaborative information is collected from an actual network device 12, 14, 16, 106-110, 27, 29, 331 with a camera component including an actual camera point-of-view viewing cone 333 and actual physical location information 322' at an actual project site or virtual collaborative information is collected from a virtual network device 335 including a virtual camera point-of-view viewing cone 337 and actual physical location information 322'' for the actual project site in a 3D modeling program.

At Step 348, the collected actual collaborative information or collected virtual collaborative information is sent in real-time to the project management application 30'' executing from the 3D modeling program on the server network device 20, from the actual network device 12, 14, 16, 106-110, 27, 29, 331 via the communications network 18 or from the virtual network device 12', 335.

Method 344 is used so a person can return to a physical location (e.g., 322', 322'', etc.) and precisely find the exact physical location at which the actual network device 12, 14, 16, 106-110, 27, 29, 331 with a camera component and/or virtual camera 12', 335 was placed to re-view the exact same camera point-of-view cones 333, 337 using the GPS 322', 322'' and/or other location coordinates.

In one specific embodiment, Method 344 is practiced with UAV 27 and/or GAV 29 on project sites that are physically dangerous and/or cannot be physically accessed by a human person. However, the present invention is not limited to this embodiment and other embodiments can be used to practice the invention.

Method 344 allows a person to precisely view the actual physical components at a project site with the actual camera point-of-view viewing cone 333 and GPS 322' and/or location coordinates and compare them to virtual components in the 3D modeling program. It also allows a person to view virtual components in the 3D modeling program with the virtual camera point-of-view viewing cone 337 and compare them to actual components at the project site using GPS 322'' and/or location coordinates for the actual physical components, and all other combinations thereof of virtual and actual physical components and project sites, from within and from outside the 3D modeling program.
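By way of illustration only, guiding a person back to a recorded camera position can be done with a standard great-circle calculation from the person's current GPS fix; the Python sketch below assumes WGS-84 decimal-degree coordinates and uses invented sample values.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance (meters) and initial bearing (degrees)
    from a person's current GPS fix to a recorded camera location."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist = 2 * r * math.asin(math.sqrt(a))
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

# Example: guide a person from an invented current position back to a recorded cone
d, b = distance_and_bearing(41.03542, -88.28265, 41.03540, -88.28260)
print(f"Walk {d:.1f} m at bearing {b:.0f} deg to re-occupy the recorded camera position")
```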

The method and system described herein provide automated building information modeling (BIM) two-way, multi-media collaboration. Collaboration information for actual physical objects at physical locations is automatically collected and associated with virtual objects in virtual object models in a three-dimensional (3D) object modeling program for a selected project, or new virtual objects that did not previously exist are created in the 3D modeling program and associated with the actual physical objects that have been physically added to a project. The method and system allow two-way real-time and static collaboration between native and new composite XD (e.g., 3D, or lower or higher dimensional) object models from within existing 3D modeling BIM programs (e.g., AUTODESK REVIT, AUTOCAD, VECTORWORKS, etc.) and the actual physical objects at the actual physical locations.

The present invention has been described for building information modeling (BIM) models and modeling programs. However, the present invention is not limited to BIM models and modeling programs and can be used for other types of modeling and design programs that are used for other types of engineering projects (e.g., airplanes, motors, engines, automobiles, ships, trains, etc.).

It should be understood that the architecture, programs, processes, methods and systems described herein are not related or limited to any particular type of computer or network system (hardware or software), unless indicated otherwise. Various types of general purpose or specialized computer systems may be used with or perform operations in accordance with the teachings described herein.

In view of the wide variety of embodiments to which the principles of the present invention can be applied, it should be understood that the illustrated embodiments are exemplary only, and should not be taken as limiting the scope of the present invention. For example, the steps of the flow diagrams may be taken in sequences other than those described, and more or fewer elements may be used in the block diagrams.

While various elements of the preferred embodiments have been described as being implemented in software, in other embodiments hardware or firmware implementations may alternatively be used, and vice-versa.

The claims should not be read as limited to the described order or elements unless stated to that effect. In addition, use of the term "means" in any claim is intended to invoke 35 U.S.C. § 112, paragraph 6, and any claim without the word "means" is not so intended.

Therefore, all embodiments that come within the scope and spirit of the following claims and equivalents thereto are claimed as the invention.

* * * * *