
WO2013048427A1 - Management system with versatile display - Google Patents

Management system with versatile display

Info

Publication number
WO2013048427A1
WO2013048427A1 (PCT/US2011/054141)
Authority
WO
WIPO (PCT)
Prior art keywords
data
display
building automation
automation system
processing circuit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2011/054141
Other languages
French (fr)
Inventor
Hanspeter Grossele
Renato Colcerasa
Claus Knapheide
John A. FARAGOI JR.
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Siemens Corp
Original Assignee
Siemens AG
Siemens Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG, Siemens Corp filed Critical Siemens AG
Priority to CN201180073781.1A priority Critical patent/CN103827758B/en
Priority to PCT/US2011/054141 priority patent/WO2013048427A1/en
Priority to US13/538,275 priority patent/US8933930B2/en
Priority to US13/537,975 priority patent/US9542059B2/en
Priority to US13/538,073 priority patent/US8854202B2/en
Priority to US13/537,911 priority patent/US20130086066A1/en
Priority to US13/538,242 priority patent/US9519393B2/en
Priority to US13/538,182 priority patent/US9170702B2/en
Priority to PCT/US2012/057063 priority patent/WO2013049031A1/en
Priority to PCT/US2012/057240 priority patent/WO2013049138A1/en
Publication of WO2013048427A1 publication Critical patent/WO2013048427A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/26 Pc applications
    • G05B2219/2642 Domotique, domestic, home control, automation, smart house
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/31 From computer integrated manufacturing till monitoring
    • G05B2219/31472 Graphical display of process

Definitions

  • the present invention relates to building automation systems, and more particularly, to user interfaces for building automation systems that allow for monitoring and control of building automation system devices.
  • Building control systems encompass a wide variety of systems that aid in the monitoring and control of various aspects of building operation. Building control systems include security systems, fire or life safety systems, lighting systems, and comfort systems, sometimes referred to as heating, ventilation, and air conditioning (“HVAC”) systems. In large commercial and industrial facilities, such systems have an extensive number of elements and are highly automated.
  • a comfort or HVAC system typically includes large numbers of temperature sensors and ventilation damper controls, as well as other elements, which are located in virtually every area of a facility.
  • a security system may have intrusion detection, motion sensors and alarm actuators dispersed throughout an entire building or campus.
  • Fire safety systems also include widely dispersed devices in the form of smoke alarms, pull stations and controllers.
  • building control systems typically have one or more centralized control stations in which data from the system may be monitored, and in which various aspects of system operation may be controlled and/or monitored.
  • the control station typically includes a computer having processing equipment, data storage equipment, and a user interface.
  • building control systems often employ multi-level communication networks to communicate operational and/or alarm information between operating elements, such as sensors and actuators, and the centralized control station.
  • In older systems, control stations provided building control data in a cumbersome, text-oriented format. The control station presented data in a way that typically required intimate system knowledge to interpret and understand. As building control systems become more complex, it has become increasingly advantageous to present building system data in a more intuitive manner. To address this issue, control stations of building control systems now generally employ graphical user interfaces that combine text information with representative graphics to illustrate the context of the system data being displayed. Graphics can include graphically displayed maps, floor plans, diagrams of complex equipment, and even graphic display of controlled or sensed values.
  • For example, a thermometer-shaped graphic may be used to represent a temperature reading, as opposed to a simple text value.
  • Similarly, alarm status for a floor of a building may be represented on a graphical display of the building floor plan, as opposed to a simple text list of alarm locations.
  • At least some embodiments of the present invention address the above need, as well as others, by providing an interactive interface in which, when an object is selected by a user, links to related information are automatically generated based on the object and made available for selection by the user, preferably presented in a multi-area display.
  • a first embodiment is an arrangement for use in a building automation system that includes a memory, a display, a user input device and a processing circuit.
  • the arrangement also includes at least a first building automation system device.
  • the memory stores programming instructions, and a plurality of data records corresponding to building automation system objects.
  • the processing circuit is operably coupled to the memory, the input device, and the display.
  • the processing circuit is configured, when executing the programming instructions, to obtain a data record corresponding to the first building automation system object, and display information regarding the first building automation system object using the data record in a first portion of the display.
  • the processing circuit is further configured to add information corresponding to at least one property in the data record to a set of related objects.
  • the processing circuit is also configured to identify system data associated with the first building automation system object wherein the system data is other than the data record, and to add information regarding the identified system data to the set of related objects.
  • the processing circuit is further configured to display information representative of the set of related objects in a second area of the display.
  • the first building automation system device is operably coupled to the processing circuit.
  • the processing circuit is further configured to provide signals altering the operation of the first building automation system device.
  • Fig. 1 is a functional block diagram of the global management system in use with BAS (HVAC) devices, fire safety devices, and security system devices;
  • Fig. 1A is a functional block diagram of a computing device that carries out the functions of the global management system of Fig. 1, as well as a graphical user interface according to embodiments of the invention;
  • Fig. 2 is a representative block diagram of a screen display generated by the graphical user interface function carried out by the computing device of Fig. 1A;
  • Fig. 2A shows a screen capture of the exemplary screen display of Fig. 2 populated by data for a specific building system;
  • Fig. 3 illustrates a flow diagram of an exemplary set of operations executed by a processing unit as part of the graphical user interface function;
  • Figs. 4A and 4B illustrate in further detail a first embodiment of the operations of Fig. 3 executed by a processing unit as part of the graphic user interface function;
  • Fig. 5 illustrates a flow diagram of an exemplary set of operations that may be executed by a processing unit to generate a display element within the operations of Figs. 4A, 4B;
  • Fig. 6 illustrates in further detail a set of operations performed by a processing unit to carry out one of the operations of Fig. 5 in a first embodiment.
  • Fig. 7 illustrates a flow diagram of an exemplary set of operations that may be executed by a processing unit to generate related item information for display in accordance with the operations of Figs. 4A and 4B.
  • Fig. 8 illustrates a flow diagram of an exemplary set of operations that may be executed by a processing unit to determine relationships between objects within a building automation system;
  • Fig. 9 illustrates a representative diagram of an exemplary output file generated by the process of Fig. 8.
  • Fig. 10 illustrates a representative diagram of an exemplary data image stored in memory of the system of Figs. 1 and 1A;
  • Fig. 11 illustrates a representative diagram of an exemplary configuration database of the system of Figs. 1 and 1A.
  • Fig. 12 illustrates a representative diagram of elements of an application framework for a user interface function according to an embodiment of the invention.
  • Fig. 1 illustrates functional blocks of a first embodiment of global management system 100 implemented in connection with comfort system (HVAC) devices 102, life safety devices 104, and security system devices 106.
  • the management system 100, comfort system devices 102, life safety devices 104, and security system devices 106 form a comprehensive building system 50.
  • the comfort system devices 102 preferably cooperate to form a building comfort system
  • the life safety system devices 104 cooperate to form a building life safety system
  • the security system devices 106 cooperate to form a building security system.
  • the management system 100 allows for such diverse systems and devices to be managed, monitored and controlled from a single point, and in a uniform manner.
  • In Fig. 1, the management system 100 is shown in functional blocks that are representative of executed software programs and stored data.
  • a block diagram of a computer device 150 implementing the management system 100 of Fig. 1 is shown in Fig. 1A, and discussed further below.
  • the management system 100 includes an application unit or application framework 110, a core engine 112, and a data repository 114.
  • Fig. 1A shows the application framework 110, the core engine 112 and data repository 114 disposed on a single computer workstation. It will be appreciated, however, that any or all of the application framework 110, the core engine 112 and data repository 114 may suitably be distributed on separate computing devices.
  • the application framework 110 is a collection of software and associated data files that enable a client application.
  • the application framework 110 enables a system manager application that provides a user interface for monitoring, reviewing and controlling various points and devices in the system 50.
  • the application framework 110 includes, among other things, a main executable 117, a user layout definition file 118, a set of rules 119, a common graphics control module 120, and an infrastructure interface 121.
  • the core engine 112 includes a model/repository 124, a number of software extensions 126_1...126_p, a control manager 128, and device interface stacks 129.
  • the data repository 114 includes, among other things, a historical database 130.
  • the model/repository (MR) 124 includes a data server 124_1 and a system database 124_2.
  • the system database 124_2 includes, among other things, a data model of all data points and all (or most) devices and other objects in the system 50.
  • each value of an active building system can be referred to as a point or data point.
  • objects of the system 50 include anything that creates, processes or stores information regarding data points, such as physical devices (BAS controllers, field panels, sensors, actuators, cameras, etc.), and maintained data files such as control schedules, trend reports, defined system hierarchies, and the like.
  • the system database 124_2 includes, among other things, current values of the various points in the system 50, and configuration information for the various objects in the system 50.
  • the MR 124 is the mechanism by which application framework 110, as well as other applications, can access data generated by the various system devices 102, 104, and 106, and provide data (i.e. commands) to such devices.
  • one type of object maintained in the system database 124_2 consists of hierarchy definitions that identify relationships between objects in the system. These relationships are preferably hierarchical, as will be discussed further below.
  • a system may define an object "floor" with multiple child objects in the form of "rooms". Each "room" object, in turn, may have several child objects such as "ventilation damper", "smoke detector", and "temperature sensor".
  • Such hierarchy definitions among objects are conventional in nature, and may take many forms.
  • the use of hierarchical files in the system 100 allows for technicians to define nearly any desirable hierarchy, the result of which is stored as one of the defined hierarchical files, discussed further below.
  • the MR 124 maintains files (i.e. objects) identifying different versions of hierarchies between the objects of the system, including those representative of the devices 102, 104, 106.
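The hierarchy definitions described above can be pictured as a simple tree of objects, following the floor/room/device example. The sketch below is illustrative only; the class and field names are hypothetical and not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class SystemObject:
    """A node in one version of a hierarchy definition file."""
    name: str
    obj_type: str
    children: list["SystemObject"] = field(default_factory=list)

    def add_child(self, child: "SystemObject") -> "SystemObject":
        self.children.append(child)
        return child

    def path(self, name: str, _prefix=()) -> tuple:
        """Return the hierarchical path to a named descendant, if any."""
        here = _prefix + (self.name,)
        if self.name == name:
            return here
        for c in self.children:
            found = c.path(name, here)
            if found:
                return found
        return ()

# A geographic (space-based) hierarchy like the floor/room example above.
floor = SystemObject("Floor 4", "floor")
room = floor.add_child(SystemObject("Room 002", "room"))
room.add_child(SystemObject("Ventilation Damper", "damper"))
room.add_child(SystemObject("Temperature Sensor", "sensor"))

print(floor.path("Temperature Sensor"))
# -> ('Floor 4', 'Room 002', 'Temperature Sensor')
```

Because each hierarchy version is just another stored file, a technician could maintain several such trees (geographic, electrical, mechanical) over the same underlying objects.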
  • the software extensions 126_1...126_p are sets of software services that provide core operations of the management system 100 via the model repository 124.
  • the software extensions 126_1...126_p are preferably composed in source code, compiled and linked in a manner known in the art.
  • the software extensions 126_1...126_p may suitably include a print manager, a reporting subsystem, and a status propagation manager.
  • a reporting subsystem is a system that manages the acquisition of data values from the MR 124 for the generation of various reports. Such reports may include, for example, trends for a temperature of a room or the like.
  • the generation of reports and methods of managing the same using a data model such as the MR 124 are conventional and the details thereof are omitted for clarity of exposition.
  • the status propagation manager is a device that propagates alarm status information, among other things, to various other data objects in the system.
  • An example of a suitable alarm propagation system is provided in U.S. Patent Application Serial No. 12/566,891, filed September 25, 2009, which is assigned to the assignee of the present invention and is incorporated by reference herein.
  • the control manager 128 is another software service that enables use of system data via the MR 124. Specifically, the control manager 128 facilitates use of high level scripts to provide services within the management system 100. In other words, in contrast to the software extensions 126_1...126_p, the control manager provides an execution environment for high level scripts. In particular, the control manager 128 is configured to execute software scripts that perform various services. In this embodiment, the control manager 128 executes a script to carry out the scheduling function of the management system 100.
  • the scheduling function is used to control points in the various systems based on a time-based schedule. For example, the scheduling function may be used to command temperature set points based on the time of day, and the day of the week, within the comfort system devices 102.
  • control manager 128 allows for functionality to be added to the management system 100 via scripts, as opposed to low level source code that must be compiled and linked.
  • the interface stacks 129 are a set of functional modules that act as an interface between the core 112 and the various comfort system devices 102, the various life safety devices 104, and the various security system devices 106.
  • the comfort system devices 102 can suitably include field controllers, actuators, sensors, and other devices normally associated with HVAC systems, all of which may communicate on a suitable network, as is known in the art.
  • the life safety system devices 104 can suitably include notification appliance circuits (NACs), NAC control panels, other controllers, and other devices normally associated with fire safety and/or life safety systems, all of which may communicate on a suitable network, as is known in the art.
  • the security system devices 106 can suitably include field controllers, cameras, sensors, and other devices normally associated with security systems, all of which may communicate on a suitable network, as is known in the art.
  • One or more of the devices 102, 104, and 106 may operate on a specific network protocol, such as BACnet or LonTalk.
  • the interface stacks 129 provide access to data within such network protocols by services within the management system 100.
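One way to picture the interface stacks 129 is as protocol adapters behind a common read/write interface, so that services in the management system never touch BACnet or LonTalk details directly. This is a hypothetical sketch; the class names, the point-address format, and the in-memory "network" are assumptions for illustration:

```python
from abc import ABC, abstractmethod

class ProtocolStack(ABC):
    """Hypothetical interface stack: translates between the core engine
    and one field-network protocol (e.g. BACnet or LonTalk)."""

    @abstractmethod
    def read_point(self, address: str):
        ...

    @abstractmethod
    def write_point(self, address: str, value) -> None:
        ...

class BACnetStack(ProtocolStack):
    def __init__(self):
        self._points = {}          # stand-in for the actual field network

    def read_point(self, address):
        return self._points.get(address)

    def write_point(self, address, value):
        self._points[address] = value

# The core engine sees only the common interface, keyed by protocol name.
stacks: dict[str, ProtocolStack] = {"bacnet": BACnetStack()}
stacks["bacnet"].write_point("device-17/AV-1", 72.0)   # e.g. command a set point
print(stacks["bacnet"].read_point("device-17/AV-1"))   # -> 72.0
```

Adding support for another field protocol would then mean adding one more `ProtocolStack` implementation, without changing the services that consume point data.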
  • the application framework 110 is an application and other software components that cooperate to provide a multi-area or multi-pane display window on a computer display, such as the display 156 of Fig. 1A discussed further below.
  • the multi-pane display (see, e.g., Figs. 2 and 2A) includes an object selection area 215, primary and context display areas 220, 225 for displaying information about a selected object and elements within the object, and an automatically generated "related items" area 230 that displays other objects that bear a predefined relationship to the selected object. Many or most areas have selectable links to additional information.
  • Fig. 12 shows the application framework 110 in further detail.
  • the infrastructure 121 is a software module that acts as an interface, similar to an application programming interface (API), to the various elements of the core engine 112, including the MR 124.
  • the main executable 117 includes programming instructions that coordinate the actions of the other modules of the application framework 110.
  • the common graphic controls 120 include graphic libraries for various objects and points.
  • the graphics controls 120 may include, for example, a thermometer graphic definition for temperature sensors.
  • the graphics controls 120 are common files (or copies of common files) that are used by multiple applications, and not just the application framework 110.
  • the layout 118 is a file that defines the display format, such as whether graphical displays, text displays, camera images or the like are to be displayed. In the embodiment described herein, the layout 118 is defined in connection with each user's profile, or in the alternative, with an authorization level or other user selection. Thus, the appearance of the display can vary from user to user, even though such users employ the same application framework 110.
  • the rules 119 define how the application framework 110 populates the various areas of the display based on the defined layout 118. It will be appreciated that while the layout 118 can vary by user, the rules 119 do not. However, in contrast to the common graphic controls 120, the rules 119 are specific to the application framework 110.
  • the display screen 200 includes a first window 202, a second window 204, and a third window 205.
  • the first window 202 includes a selection area 215, also referred to as a system browser, a primary work area 220, a contextual work area 225, a related items area 230, and a secondary work area 235.
  • the second window 204 includes alarm notification area 210.
  • Other windows may be incorporated.
  • the alarm notification area 210 may suitably appear as, and be generated as, shown in European Patent Specification EP 1515289 B1, which is assigned to the Assignee of the present invention and is incorporated herein by reference.
  • the third window 205 may include details of certain events, and is also outside the scope of the present disclosure.
  • the alarm notification area 210 includes a plurality of icons 210_1, 210_2, 210_3, 210_4, 210_5, 210_6 and 210_7, each indicating a type of fault or alarm, and the quantity of the given fault in the current state of the system.
  • the icon 210_1 illustrates that five severe conditions exist
  • the icon 210_2 shows that five alarm conditions exist
  • the icon 210_7 shows that six advisory notifications exist.
  • the user may drill down to each type of notification by selecting one of the icons 210_1, 210_2, 210_3, 210_4, 210_5, 210_6 and 210_7.
  • the details of the operation of the alarm notification area 210 are beyond the scope of the present disclosure.
  • the selection area 215 includes a hierarchical list 218 of objects (e.g. objects 218_1, 218_2, 218_3, and 218_4) for a building campus.
  • the hierarchical list 218 is based on a hierarchical definition file stored in memory, as will be discussed further below.
  • the hierarchy logic employed in the list 218 may take a plurality of different forms. In the example of Fig. 2A, the hierarchy logic is geographical, or space-based. Accordingly, the first or highest level of the hierarchical list 218 includes buildings such as "Headquarters" and "Main Building". The next highest or second level of the hierarchical list 218 includes floors and/or large areas of the buildings, such as "Floor 1" and "Floor 2".
  • the third or next highest level of the hierarchy includes rooms and/or smaller divisions of the floors/large areas of the second level.
  • the icon 218_4 is a room, "Room 002", which is part of the sub-list for (sometimes referred to as a "child" of) icon 218_3, which represents "Floor 4".
  • the icon 218_3 is, in turn, part of the sub-list for (or a child of) icon 218_2, which represents the object "Main Building".
  • This particular hierarchical string illustrates that "Room 002" is a child of "Floor 4", which in turn is a child of "Main Building”.
  • the user may select any object from the selection area 215.
  • the system 100 thereafter causes the various areas 220, 225, 230 to be populated with data corresponding to the selected object.
  • the rules 119 of the application framework 110 in conjunction with the layout 118 cooperate to define the appearance of the display elements in each of the areas 220, 225, and 230, as well as the selection area 215 and the other windows 204, 205.
  • the primary work area 220 includes information relating directly to the selected object from the selection area 215. As shown in Fig. 2A, the icon 218_4 for the Room 002 has been selected, and primary work area 220 shows a perspective floor plan 222 for the selected object "Room 002".
  • the primary work area 220 can alternatively display textual data, drop down lists, and even a document (e.g. a pdf formatted file). As will be discussed further below in detail, the format of the data that is presented in the primary work area 220 will depend upon, among other things, the layout file 118 of the application framework.
  • the contextual work area 225 is an area of the display 200 that contains information regarding a specific element within the primary work area 220. For example, if the primary work area 220 includes several selectable icons or menu entries, then the contextual work area 225 is used to provide further information relating to the user selection from the primary work area 220. In Fig. 2, for example, the primary work area 220 includes selectable icons 211_1 and 211_2. If the user were to select the icon 211_2, then the contextual work area 225 would provide further information regarding the object associated with the icon 211_2. By contrast, in the example of Fig. 2A, the primary work area 220 does not include selectable icons.
  • the contextual work area 225 is used to simply provide more information about the Room 002.
  • the contextual work area 225 in Fig. 2A shows the properties of the object 222 shown in the primary work area 220.
  • the contextual work area 225 provides an ability to "drill down" on a selected element (e.g. elements 211_1 or 211_2 of Fig. 2) displayed in the primary work area 220, or, as shown in Fig. 2A, provide further information about the object's properties that cannot be shown in the graphic format of the primary work area 220.
  • the related items work area 230 is a portion of the display screen 200 that includes selectable icons corresponding to other "objects" of the system 100 that relate in a predetermined manner to the selected icon within the primary work area 220.
  • the related items area 230 of Fig. 2 includes selectable icons 232_1, 232_2, and 232_3.
  • the icons 232_1, 232_2, and 232_3 link to objects related in some manner to the object represented by whichever of the icons 211_1 or 211_2 is selected.
  • the related objects can include schedules that affect or involve the selected object, reports relating to the object, and other objects identified in properties for the selected object. Further details regarding the generation of the related items area are provided below in connection with Figs. 4-7.
  • the related items icons 231_1, 231_2 and 231_3 comprise links to information about objects related to the object of the selected icon 211_2.
  • the primary work area 220 does not have selectable icons
  • the related items area 230 includes selectable icons for objects related to the Room_002 of the primary work area 220.
  • the related items can include floor plan graphics for areas close to or involving the object Room 002, as well as a set of reports relating to Room 002.
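The two related-items sources described above (properties in the selected object's own data record, plus other system data such as schedules and reports that reference the object) can be sketched as one combining function. The data shapes and names here are hypothetical, chosen only to mirror the Room 002 example:

```python
def related_items(obj_id, records, schedules, reports):
    """Combine (1) object references found in the selected object's own
    data record with (2) other system data (schedules, reports) that
    mention the object, yielding the set of related items to display."""
    related = set()
    # 1. Properties of the data record that point at other known objects.
    for prop, value in records[obj_id].items():
        if isinstance(value, str) and value in records and value != obj_id:
            related.add(value)
    # 2. System data other than the data record itself.
    related.update(s["name"] for s in schedules if obj_id in s["targets"])
    related.update(r["name"] for r in reports if obj_id in r["subjects"])
    return sorted(related)

records = {
    "Room 002": {"served_by": "AHU 1", "floor": "Floor 4"},
    "AHU 1": {}, "Floor 4": {},
}
schedules = [{"name": "Office Hours Schedule", "targets": {"Room 002"}}]
reports = [{"name": "Room 002 Temperature Trend", "subjects": {"Room 002"}}]

print(related_items("Room 002", records, schedules, reports))
# -> ['AHU 1', 'Floor 4', 'Office Hours Schedule', 'Room 002 Temperature Trend']
```

Each returned name would then be rendered as a selectable icon in the related items area, linking to that object's own display.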
  • the secondary area 235 is an area in which information pertaining to a second object selected may be displayed.
  • the system 100 allows the user to select an object from the related items area 230. Information related to the selected related item is then displayed in some cases in the secondary area 235.
  • the areas 215, 220, 225, 230 and 235 can be re-sized by the user.
  • the user may adjust the relative sizes of the various areas. Suitable graphic tools that facilitate such scalability are known in the art. Accordingly, for example, the secondary area 235 may be collapsed completely to maximize the primary work area, as shown in Fig. 2A.
  • the areas 215, 220, 225, 230 and 235 also employ standard scrolling tools.
  • standard scroll bars are employed to allow the user to maneuver to different information within the corresponding area.
  • Fig. 2A shows a standard vertical scroll bar 255 for the contextual work area 225 and a standard vertical scroll bar 260 for the related items area 230.
  • the vertical scroll bar 255 allows the user to access further information that is currently hidden in the area 225.
  • the vertical scroll bar 260 allows the user to access additional related items information in the area 230.
  • horizontal scroll bars may similarly be used when warranted.
  • the application framework 110, when executed by a suitable computer processing circuit, facilitates user interaction through the display screen 200 of Figs. 2 and 2A.
  • Other applications may be used to facilitate other actions by users of the system 100.
  • the application framework 110 provides for different appearances of the display elements that appear in the windows 202, 204, and 205 based on a user profile, or an authorization level of a user.
  • the layout file 118 contains the display format specific to the user.
  • Fig. 12 shows the application framework 110, and specifically, the layout file 118, in further detail.
  • the layout file 118 defines a display format including a plurality of window definitions 1205_1, 1205_2, a plurality of pane or area definitions 1210_1, 1210_2 within one or more of the windows 1205_1, 1205_2, and a plurality of snap-in tools 1220_1...1220_m employed within the window and/or area definitions.
  • the window definitions 1205_1, 1205_2 define the appearance of the windows.
  • within each window definition 1205_x, there may be identified one or more areas 1210_x and/or one or more snap-in tools 1220_y.
  • the area definitions 1210_1, 1210_2 define the appearance of areas within the windows defined by the definitions 1205_1, 1205_2, for example, the areas 220, 225 and 230 of Fig. 2.
  • the snap-in tools 1220_1...1220_m provide the actual format of the display elements within each of the areas 1210_1, 1210_2, and/or windows 1205_1, 1205_2.
  • a snap-in tool is a software script or program that generates a predetermined layout of data, menus, graphic controls, and the like.
  • the snap-in tools 1220_1...1220_m are configured such that, when the layout file 118 identifies a snap-in tool 1220_y for display of a particular set of object data, the snap-in tool 1220_y (when executed by a processor) displays the object data in a predetermined layout.
  • a first snap-in tool 1220_1 may be used to generate graphic floor plan views, such as that shown in the primary work area 220 of Fig. 2A.
  • Another snap-in tool 1220_2 may be used to generate ordered sets of text data.
  • Another snap-in tool 1220_3 may be used to generate an arrangement of dialog boxes and other interactive widgets or elements, such as those shown in the contextual work area 225 of Fig. 2A.
  • Still another snap-in tool 1220_m may be used to display video data from a camera object.
  • the same snap-in tools 1220_y may be used in multiple panes or areas 1210_1, 1210_2 and multiple windows 1205_1, 1205_2.
  • selectable tabs such as tabs 224_1, 224_2 and 224_3 of Fig. 2.
  • the snap-in tools 1220_1...1220_m in this embodiment are configured to generate displays by accessing various properties of objects.
  • the snap-in tools 1220_1...1220_m are modular library tools that may be implemented by any running instance of the application framework 110.
  • the layout file 118 may identify that a particular data "object" is to be displayed using a particular snap-in tool 1220_x.
  • the snap-in tool 1220_x generates the display by accessing predetermined sets of properties of the data object, and then using that data to construct details of the display. Further detail regarding the operation of the snap-in tools 1220_1...1220_m is provided below in connection with Figs. 4A, 4B, 5 and 6.
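The layout-file-to-snap-in dispatch described above amounts to a lookup table mapping a display area and object type to a rendering routine. The following is a minimal sketch under assumed names; the two toy snap-in functions and the object fields are hypothetical, not the patent's actual tools:

```python
# Hypothetical snap-in tools: each renders one predetermined layout
# from a predetermined set of object properties.
def floor_plan_snap_in(obj):
    return f"[floor plan] {obj['name']} ({obj['geometry']})"

def text_list_snap_in(obj):
    return "\n".join(f"{k}: {v}" for k, v in sorted(obj.items()))

# A layout file maps (area, object type) -> snap-in tool, so the same
# object can be presented differently per user profile simply by
# swapping layout entries, without changing the tools themselves.
layout = {
    ("primary", "room"): floor_plan_snap_in,
    ("contextual", "room"): text_list_snap_in,
}

def render(area, obj):
    return layout[(area, obj["type"])](obj)

room = {"type": "room", "name": "Room 002", "geometry": "12x8m"}
print(render("primary", room))      # -> [floor plan] Room 002 (12x8m)
print(render("contextual", room))   # sorted property/value lines
```

A per-user layout file then reduces to a different `layout` mapping for each profile or authorization level.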
  • Fig. 1A shows an exemplary embodiment of the management system 100 implemented in a commercially available general purpose computer 150.
  • the management system 100 includes a central processing unit and associated support circuitry (CPU) 152, a plurality of network input/output (I/O) units 154_1...154_r, a display 156, a user input device 158, a primary memory 160, and a secondary storage device 162.
  • the CPU 152 is configured to execute programming instructions stored in the memory 160 to carry out various operations as described herein.
  • the CPU 152 is configured to receive data inputs from the user input 158 and to generate display screens to be displayed on the display 156.
  • the CPU 152 is also configured to communicate with external devices, such as the system devices 102, 104, 106, via one or more of the network I/O units 154 1 ...154 r .
  • the CPU 152 is operably connected in a conventional manner to each of the network I/O units 154 1 ...154 r , the display 156, the user input 158, the primary memory 160, and the secondary storage 162 via a system bus.
  • the primary memory 160 stores programming instructions for the application framework 110, the extensions 126 1 ...126 p , the control manager 128, and software elements of the stack interface 129.
  • the primary memory 160 also stores the elements of the model/repository, which includes the data server 124 1 and the database 124 2 .
  • the primary memory 160 may include volatile memory such as random access memory, as well as other types of readable and writeable memory.
  • the database 124 2 is a database containing active system values and data, as well as configuration data for the elements of the system.
  • Fig. 10 shows a functional diagram of contents of the database 124 2 .
  • the database 124 2 includes present (or latest) values 1005 for the various points of the system 50, including values (e.g. temperatures, set points, fan speed, etc.) of the devices 102, 104 and 106.
  • the database 124 2 also includes alarms or notifications 1010 and their corresponding statuses.
  • the database 124 2 further includes schedule files 1015 identifying control schedules. As discussed above, the schedules define a set of timed command values to be communicated to various elements in the systems 102, 104 and 106.
  • a schedule may command the comfort system to employ one set of temperature set points during work hours, and another set of temperature set points on evenings and weekends.
  • the schedules 1015 are in the form of scripts that are implemented by the control manager software 128. However, it will be appreciated that in other embodiments, the schedules 1015 may be implemented as a software component and set of corresponding schedule data files.
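The timed-command-value idea above (one set of temperature set points during work hours, another on evenings and weekends) can be sketched as a small function. This is an illustrative sketch only; the set-point values and the work-hour window are assumptions, not values from the patent.

```python
# Illustrative sketch of a control schedule as a set of timed command values.
from datetime import datetime

WORK_HOURS_SETPOINT = 21.0   # degrees C; assumed example value
OFF_HOURS_SETPOINT = 16.0    # assumed evening/weekend setback value

def scheduled_setpoint(now: datetime) -> float:
    """Return the temperature set point the schedule commands at this moment."""
    is_weekend = now.weekday() >= 5          # Monday=0 ... Saturday=5, Sunday=6
    is_work_hours = 8 <= now.hour < 18       # assumed 08:00-18:00 work window
    if is_weekend or not is_work_hours:
        return OFF_HOURS_SETPOINT
    return WORK_HOURS_SETPOINT

print(scheduled_setpoint(datetime(2011, 9, 28, 10, 0)))  # Wednesday morning -> 21.0
print(scheduled_setpoint(datetime(2011, 10, 1, 10, 0)))  # Saturday -> 16.0
```

In the system described, such a schedule could equally be expressed as a script run by the control manager or as data files interpreted by a schedule component, as the following bullet notes.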
  • the database 124 2 further stores user profile information 1020.
  • the user profile information 1020 includes, for each authorized user, a specific layout file that is to be used as the layout file 118 when that user runs the application framework 110.
  • the database 124 2 also includes hierarchical files 1025 defining one or more sets of hierarchical relationships between data objects within the system.
  • the "objects" of the system 50 may be defined within a hierarchy. These "objects" can include various devices 102, 104, 106.
  • hierarchical files 1025 can identify hierarchical relationships between buildings, devices, and even schedules and reports.
  • the database 124 2 also includes object configuration data 1030.
  • Object configuration data 1030 includes a data record for each object of the system.
  • each room, floor, building, sensor, camera and field controller has its own object configuration data record.
  • Fig. 11 shows in further detail a representative diagram of the object configuration data 1030 maintained in the database 124 2 .
  • the object configuration data 1030 includes a set of data object records 1105 associated with each of the devices 102, 104 and 106 in the system 50, and a set of data object records 1110 associated with each room, space and building in which the system 50 is located, among other things.
  • the object configuration data 1030 may further include object records associated with other logical entities, such as reports (not shown).
  • Each object record 1105, 1110 includes a set of predetermined properties, including unique identifying information <ID> and an object type <OBJECT_TYPE>.
  • the object type may be "sensor", "controller", "floor", "room", "hierarchy", etc.
  • the number and type of properties of each object record 1105, 1110 depends on the object type.
  • Each object record 1105, 1110 may also contain one or more point properties <POINT> identifying a point value corresponding to the object.
  • points are used to describe operating values of the system, such as temperatures at particular sensors, set points for various actuators or air handling units, and the like.
  • Each object may be associated with one or more points.
  • the same point may be associated with a plurality of objects.
  • the point T 32, which may represent a temperature sensed by a sensor TEMP S 02 that is located within ROOM 002, may be a point property for both the object record 1105 for TEMP S 02 and the object record 1110 for ROOM 002 (see Fig. 11).
  • the object records 1105, 1110 may suitably have many other properties, including references to a graphic element, a pdf document, manufacturing information, maintenance information, and the like.
  • the object records 1105, 1110 further include a related items property <RI> that identifies related items for the objects represented by records 1105, 1110.
  • the related items for an object can include a reference or link to a video image associated with the object (i.e. from a video camera in the room ROOM 002), a trend report associated with the object, and the like.
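The record structure above, including the shared-point relationship (the point T 32 appearing in both the sensor's and the room's record) and the related items property, can be sketched in code. The field names follow the patent's <ID>, <OBJECT_TYPE>, <POINT> and <RI> properties; the dict layout and helper function are illustrative assumptions.

```python
# Sketch of object configuration records with a point shared by two objects.
records = {
    "TEMP_S_02": {"ID": "TEMP_S_02", "OBJECT_TYPE": "sensor",
                  "POINT": ["T_32"], "RI": []},
    "ROOM_002":  {"ID": "ROOM_002", "OBJECT_TYPE": "room",
                  "POINT": ["T_32"], "RI": ["video_cam_002"]},
}

def objects_for_point(point_id):
    """Return every object whose record lists the given point property."""
    return sorted(oid for oid, rec in records.items()
                  if point_id in rec["POINT"])

print(objects_for_point("T_32"))  # both the sensor and the room share T_32
```

Because points are stored as properties of records rather than inside a single owning object, a value such as a sensed temperature can surface in any display that references its point.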
  • the system database 124 2 is operably accessed and maintained by the data server 124 1 .
  • the data server 124 1 is a software program that, when executed by the CPU 152, manages the data in the system database 124 2 , including the management of the service that obtains system data from the devices 102, 104 and 106, and communicates changes or commands to the devices 102, 104 and 106.
  • the secondary storage 162, which may suitably be non-volatile storage, stores the system historical data 130 and other reference information, such as a pdf document library 168.
  • the document library 168 may suitably be a set of pdf files that are associated with various of the devices 102, 104 and 106. It will be appreciated that the secondary storage 162 may also store other files typical of a building control system, such as, for example, the historical database 130.
  • the CPU 152 executes the operations of the software elements 110, 124 1 , 126 1 ... 126 p , 128 and 129 to perform the operations of the management system 100 as described herein. Specifically, the CPU 152 performs the operations of Figs. 3, 4A, 4B, 5, 6 and 7, as discussed further below, to carry out the system manager application of the application framework 110. The CPU 152 may also suitably perform the operations of Fig. 8, discussed further below. Before discussion of the specific operations of the system 100 of Figs. 1 and 1A, the general operation of the system 50 will be described. In the general operation of the system 50, the comfort system devices 102 operate to provide heating, ventilation and air conditioning to the building in accordance with normal practices using any suitable conventional techniques.
  • life safety devices 104 operate to provide monitoring for, and if necessary, notification of, a hazardous condition such as fire, smoke or toxic gas release.
  • security system devices 106 operate to provide motion sensing, video surveillance information, and door position monitoring, and the like in accordance with normal security system practices.
  • the CPU 152 employs the data server 124 1 to exchange data with at least some of the devices 102, 104 and 106 (directly or indirectly) via the interface stack software 129 and network I/O units 154 1 ...154 r .
  • the CPU 152 maintains the system database 124 2 based on, among other things, the received data from the devices 102, 104 and 106.
  • the CPU 152 conveys command values from various elements in the management system 100 to the various devices 102, 104, 106, also via the interface software 129 and network I/O units 154 1 ...154 r .
  • the CPU 152 may communicate scheduled commands to various devices 102, 104 and 106 via the interface stack software 129 and the network I/O units 154 1 ...154 r .
  • Fig. 3 shows in general terms a process flow of an exemplary set of operations of the CPU 152 executing the user interface application framework 110.
  • Figs. 4A, 4B, 5, 6 and 7, discussed further below, show in further detail how the operations of Fig. 3 may be carried out.
  • the CPU 152 receives a user input signal identifying a first of the plurality of building automation system objects to be displayed.
  • the CPU 152 may receive a selection from a plurality of selectable objects in the selection area 215 of Fig. 2 or Fig. 2A.
  • the CPU 152 obtains a first set of object data regarding the selected building automation system object from one or more data records associated with the first building automation system object.
  • the CPU 152 may suitably obtain configuration data for the selected object (i.e. data records 1105, 1110 of Fig. 11) from the configuration object data 1030 of the database 124 2 , and obtain system values 1005 (see Fig. 10) for the selected object.
  • the first set of object data may suitably include selectable links to other objects, such as child objects of the selected object.
  • If the selected object is a floor of a building, the first set of building data may include a graphic for that floor, and links to temperature sensors located on that floor.
  • the CPU 152 displays (via the display 156) information regarding the first set of object data in the primary work area 220 of the display.
  • the CPU 152 may suitably display a graphic depicting or representing the object, or values associated with the object, in the primary work area 220 of Fig. 2.
  • the CPU 152 displays a graphic of the selected object room 002.
  • the CPU 152 reviews system data, including dynamic data, to determine a set of related objects corresponding to one or more elements of the first set of object data. For example, the CPU 152 may review schedule files or other files to determine whether any schedules implicate or relate logically to the selected building automation system object, or some child object of the selected building automation system object. The CPU 152 may execute step 320 before, contemporaneously with, or after step 315.
  • step 325 the CPU 152 displays information regarding the set of related objects in another portion of the display, while the information in the primary work area remains displayed. For example, referring to Fig. 2A, the CPU 152 may display selectable icons for the related objects in the related items area 230 while the graphic 222 remains displayed in the primary work area 220.
  • the above steps provide a functionality in which the user not only receives information relevant to a selected building automation system object, but further receives icons identifying selectable additional objects.
  • Related items may be identified from the related items property <RI> of the object configuration data 1030 (see Fig. 11).
  • the related items may be dynamically determined based on system data, such as schedules, reports or the like. This provides the user with more options to navigate intuitively through the system 100.
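The Fig. 3 flow described above (receive a selection, display the object's data, and separately determine related objects, here only from schedules) can be condensed into a short sketch. All helper and variable names are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch of the selection -> display -> related-items flow of Fig. 3.

def handle_selection(obj_id, config, schedules):
    record = config[obj_id]                           # obtain the object's data
    primary = f"display({obj_id}: {record['type']})"  # primary work area content
    # Related items determined dynamically: any schedule that implicates
    # one of the object's points is offered as a related item.
    related = [name for name, points in schedules.items()
               if set(points) & set(record["points"])]
    return primary, related

config = {"Room_002": {"type": "room", "points": ["T_32"]}}
schedules = {"weekday_comfort": ["T_32"], "garage_lights": ["L_07"]}
print(handle_selection("Room_002", config, schedules))
```

The key property illustrated is that the related-items list is computed from live system data rather than being hard-wired into the display, which is what lets the user navigate outward from any selected object.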
  • Figs. 4A and 4B show in further detail an exemplary embodiment of the operations of Fig. 3.
  • the CPU 152 in step 402 receives a request to start the application framework 110 via the user input 158.
  • the request input includes user login information, such as name and password or other authentication information.
  • the CPU 152 determines whether the user authorization value is valid. If not, the CPU 152 terminates operations of Fig. 4, or returns to step 402 to prompt for a new request. However, if the CPU 152 determines that the user authorization level corresponds to the required authorization value, then the CPU 152 proceeds to step 405.
  • step 405 the CPU 152 instantiates the application framework 110 as an operating execution sequence. To this end, the CPU 152 obtains the layout file 118 for the user from the corresponding user profile 1020. This user profile 1020 for the user identifies the snap-ins 1220 1 ...1220 m for the various windows 202, 204, 205, and for the various areas 215, 220, 225, 230 and 235 of the window 202. As discussed above, different user profiles may identify different snap-in tools for each of the windows and areas, and the same snap-in tools may be identified for multiple of the areas. The CPU 152 then continues (via the main executable 107) in step 406.
  • the CPU 152 receives, via the user input 158, a request to review a specific system or hierarchy file 1025 (see Fig. 10). To this end, the CPU 152 obtains the selected hierarchy file 1025 from the database 124 2 .
  • the user may request to retrieve a geographical hierarchical file of a specific building campus, such as the one illustrated in the selection area 215 of Fig. 2A.
  • a first of the hierarchy files 1025 may define a geographical hierarchy, such as the hierarchy shown in the selection area 215 of Fig. 2A.
  • a second of the hierarchy files 1025 may define a mechanical hierarchy, for example, corresponding to the flow path of chilled or heated air or water through the system 50.
  • a "building" object may be associated with several "child" air handling unit objects.
  • Each of the air handling units may in turn be associated with a plurality of "child” objects for ventilation dampers.
  • Other hierarchies may be defined for any given building automation system, and would be a matter of design choice.
  • the user may in some cases select from a plurality of defined hierarchy files 1025 that include any or all of the data objects, including but not limited to those associated with building spaces and automation system devices 102, 104 and 106.
  • the user in this embodiment is limited to a set of hierarchies based on the user's authorization level.
  • the CPU 152 in step 408 generates a default object selection value for generation of the initial display.
  • the default object selection value can identify one of the objects of the selected hierarchy.
  • the default value may suitably comprise the highest object in the selected hierarchy.
  • Alternatively, the CPU 152 may set the default object selection value to null, in which case no object information is displayed until a user choice is made. In either event, the selection value CUR OBJ is set equal to the generated default object selection value.
  • step 410 the CPU 152 displays, in the selection area 215 of the display screen 200 on the display 156, the hierarchy defined by the selected hierarchy file 1025.
  • the CPU 152 also enables each of the objects identified on the displayed hierarchy to be selectable using the user input 158.
  • the CPU 152 allows the user to select any of the list items of the hierarchical list 218 of Fig. 2A.
  • the CPU 152 employs the rules 119 to cause the selection area 215 to display the hierarchical information in the hierarchical file 1025 using the identified snap-in 1220 x for the selection area 215 as defined in the layout file 118 (see Figs. 2, 10 and 12).
  • step 412 the CPU 152 determines whether it has received an input from the user input device 158 identifying a new user selection in the selection area 215. If so, then the CPU 152 proceeds to step 414. If not, then the CPU 152 proceeds directly to step 416. In step 414, the CPU 152 sets CUR OBJ equal to the user selection. After step 414, the CPU 152 proceeds to step 416.
  • the CPU 152 populates the primary work area 220 of the display screen 200 based on current object CUR OBJ, the layout file 118, and system data from the database 124 2 .
  • the graphic and/or text information presented in the work area 220 can have several different types of appearance, ranging from a graphic with or without interactive elements, sets of values with dialog boxes for changing the values, selectable text icons within selectable drop down menus, simple text lists or tables, pdf image documents, and/or live video feed.
  • the display format is determined by the layout file 118 (see Fig. 12), and in particular, the one or more snap-in tools 1220 x identified within the layout file for the primary work area 220.
  • While the snap-in tools 1220 x define the format of the display, such as graphical, textual, video, and/or arrangements of pull-down menus and dialog boxes, the content or values within the display element depend on configuration data and/or system data (from the data image 124 2 ).
  • Fig. 5 shows in further detail a set of operations employed by the CPU 152 to generate the display element in any window or work area, such as in the primary work area 220.
  • the CPU 152 employs the same set of operations to generate the display element of the contextual work area 225 and the secondary work area 235, discussed further below.
  • the CPU 152 in step 505 obtains the object selection to be displayed, OBJ, and an identification of the window/area, PANE, in which it is to be displayed.
  • the object selection OBJ will be set equal to CUR OBJ
  • the value PANE is equal to the primary work area 220.
  • the object selection OBJ in step 505 will be set equal to CONT OBJ
  • PANE will be set to the contextual work area 225.
  • the object selection OBJ in step 505 will be set equal to the selected related object, and PANE will be set equal to the secondary work area 235.
  • the CPU 152 references the layout file 118 to determine all snap-in tools 1220 x ...1220 y that are identified for the current area being generated or populated, PANE.
  • PANE is the primary work area 220
  • the area definition 1210 1 of the layout file 118 of Fig. 12 corresponds to the primary work area 220.
  • the CPU 152 would determine (based on the definition 1210 1 ) that snap-in tools 1220 1 , 1220 2 , and 1220 3 are to be used to generate the display elements of the primary work area 220.
  • the CPU 152 then processes each of the identified snap-in tools in steps 515 to 525.
  • step 515 the CPU 152 determines, for one of the identified snap-in tools 1220 x , whether the object OBJ has properties or data appropriate for the snap-in tool definition.
  • a snap-in tool 1220 x may be a video image output. If the object OBJ is a room object for a room having a video camera, then the CPU 152 would determine that snap-in tool 1220 x was appropriate for the object OBJ. If, however, the object OBJ is a temperature sensor, then the CPU 152 may determine that the snap-in tool 1220 x is not appropriate for the object OBJ. In most cases, step 515 may be carried out by determining whether the configuration data record for the object OBJ has properties expected by the snap-in tool.
  • step 515 if the CPU 152 determines that the object OBJ has properties appropriate for the selected snap-in tool 1220 x , then the CPU 152 proceeds to step 520. If not, then the CPU 152 proceeds to step 525.
  • step 520 the CPU 152 adds the snap-in tool 1220 x to the list of snap-in tools to be executed in generating the display area for PANE. Thereafter, the CPU 152 proceeds to step 525. In step 525, the CPU 152 determines whether all snap-in tools identified in the layout file 118 corresponding to the area PANE have been processed. If so, then the CPU 152 proceeds to step 530. If not, then the CPU 152 returns to step 515 to process another of the snap-in tools identified in step 510.
  • In step 530, the CPU 152 generates a display element for the area PANE (e.g. primary work area 220, contextual work area 225 or related items area 230) using the primary snap-in tool on the generated list (step 520) of snap-in tools.
  • the layout file 118 further includes a prioritization of snap- in tools appropriate for each window or area. As a default, the snap-in tool on the generated list of snap-in tools with the highest priority constitutes the primary snap-in tool.
  • step 530 the CPU 152 employs the primary snap-in tool and the properties of the object OBJ to generate a display element in the area PANE. Further detail regarding the generation of a display element as per step 530 is provided below in connection with Fig. 6.
  • the CPU 152 proceeds to step 535.
  • the CPU 152 causes a selectable tab (e.g. 224 1 and 224 2 of Figs. 2 and 2A) to be displayed in the area PANE (e.g. area 220) for all other identified snap-in tools on the list generated in step 520.
  • Such tabs (e.g. 224 1 , 224 2 )
  • the primary work area 220 of Fig. 2A shows a floor plan graphic 222 of the object Room_002
  • the user may select the icon 224 2 to display a text description of the object Room_002.
  • the CPU 152 sets the primary snap-in tool equal to that corresponding to the selected tab, and executes steps 530 and 535 again. In this manner, the user is made aware of available, alternative display formats for properties and/or values of the object OBJ in the relevant area/window PANE.
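The Fig. 5 selection logic described above (keep only snap-in tools whose expected properties the object actually has, render the highest-priority survivor, and expose the rest as selectable tabs) can be sketched briefly. The priority numbers, tool names and required-property sets are illustrative assumptions.

```python
# Sketch of snap-in filtering and priority selection per Fig. 5.
# Each (priority, tool-name, required-properties) triple stands in for a
# snap-in tool identified in the layout file for the pane.
pane_snapins = [(1, "floor_plan", {"graphic"}),
                (2, "text_view", {"ID"}),
                (3, "video_view", {"camera"})]

def choose_snapins(obj):
    # Step 515: keep tools whose required properties the object has.
    applicable = [(prio, name) for prio, name, req in pane_snapins
                  if req <= obj.keys()]
    applicable.sort()                            # lowest number = highest priority
    primary = applicable[0][1]                   # step 530: render this tool
    tabs = [name for _, name in applicable[1:]]  # step 535: alternates as tabs
    return primary, tabs

room = {"ID": "Room_002", "graphic": "floor2.svg"}   # no camera property
print(choose_snapins(room))
```

A room without a camera thus never offers the video tab, while the floor-plan view wins over the plain text view by priority; selecting a tab simply swaps which applicable tool is treated as primary.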
  • Fig. 6 shows an exemplary operation of the CPU 152 in generating a display element for an object OBJ using a snap-in tool 1220 x .
  • the operations of Fig. 6 are generalized for all snap-in tools.
  • step 605 the CPU 152 executing a snap-in tool 1220 x obtains the object data record (e.g. data record 1105 or 1110 of Fig. 11) from the database 124 2 for the object OBJ.
  • the CPU 152 retrieves from the object data record any static properties to be used in generating the display.
  • each snap-in tool 1220 x references a set of property types used by various data objects in the system.
  • the CPU 152 retrieves the property values of OBJ for the property types required by the particular snap-in tool 1220 x .
  • step 610 the CPU 152 obtains whatever value, link or other information is stored in the <graphic> property of the data record 1105 or 1110 (see Fig. 11) for the selected object OBJ.
  • the CPU 152 retrieves from the database 124 2 any dynamic operating data required by the snap-in tool 1220 x that corresponds to the selected object OBJ.
  • the object OBJ were a temperature sensor
  • the CPU 152 in step 615 may suitably retrieve from the database 124 2 (via a reference within the object data record) the temperature value sensed by the corresponding physical sensor.
  • Once the CPU 152 has all of the information for the display element (i.e. the graphics and/or text to be displayed in the primary work area 220) that will be generated by the snap-in 1220 x , the CPU 152 in step 620 generates the actual display element using the retrieved configuration data and retrieved operating data.
  • the generated display element may be a graphic, a text table, an interactive set of text and pull-down menus, or any typical interactive screen elements.
  • step 416 of Fig. 4A (as well as any steps involving populating an area of the display 200) may be carried out.
  • the generation/population of other display areas is carried out in an analogous manner.
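The Fig. 6 procedure described above (fetch static configuration properties, then dynamic operating data referenced by the record, then assemble the element) can be sketched as follows. The data layout and label field are illustrative assumptions.

```python
# Sketch of display-element generation per Fig. 6: static configuration
# properties combined with a dynamic value looked up via a point reference.

config = {"TEMP_S_02": {"label": "Room 002 temperature", "POINT": "T_32"}}
live_values = {"T_32": 21.5}   # stands in for the database's current values

def generate_element(obj_id):
    record = config[obj_id]               # step 605: static properties
    value = live_values[record["POINT"]]  # step 615: dynamic operating data
    return f"{record['label']}: {value}"  # step 620: assembled display element

print(generate_element("TEMP_S_02"))
```

Separating the two fetches is what allows the refresh path described later to re-run only the dynamic lookup when a sensed value changes.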
  • step 418 the CPU 152 determines a default object selection for the context work area 225.
  • the context work area 225 provides select additional information regarding the selected object CUR OBJ from the selection area 215.
  • most displays in the primary work area 220 include additional links or selectable icons to other objects, such as "child" objects, or contained objects, of the selected object.
  • the CPU 152 causes such "child" objects to be selectable within the primary work area 220.
  • Icons 211 1 and 211 2 of Fig. 2 illustrate examples of such selectable object icons. In such cases, the user may select additional information (drill down) by selecting the object icon or link within the primary work area 220.
  • the selected object CUR OBJ is a room
  • the display element in the primary work area 220 includes selectable icons or text boxes identifying sensors and actuators within that room.
  • the user may select one of the sensors or actuators within the primary work area 220 in order to obtain additional information regarding the sensor in the contextual work area 225.
  • This object selected within the primary work area 220 by the user is referred to herein as the contextual object, CONT OBJ.
  • the CPU 152 determines a default contextual object CONT OBJ to display in the contextual work area 225. Accordingly, in step 418, the CPU 152 determines this default contextual object CONT OBJ based on the selected object CUR OBJ and the snap-in program 1220 x used for the generation of the display element in the primary work area 220.
  • step 420 the CPU 152 populates the contextual work area 225 of the display screen 200 based on the current contextual object CONT OBJ, the layout file 118, and system data from the database 124 2 .
  • the graphic and/or text information presented in the contextual work area 225 can have several different types of appearance, ranging from a graphic with or without interactive elements, sets of values with dialog boxes for changing the values, text with selectable drop down menus, simple text lists or tables, pdf image documents, and live video feed.
  • the display element type within the contextual work area 225 depends on the object selected (CONT OBJ) and the layout file 118 obtained from user profile 1020.
  • the content and values within the display element in the contextual work area 225 depend on configuration data and/or system data (from the data image 124 2 ).
  • the CPU 152 carries out the operations of Figs. 5 and 6, similar to step 416.
  • step 422 the CPU 152 determines the related items for the related items area 230 based on the contextual object that has been selected from within the primary work area 220, in other words, the object CONT OBJ.
  • the CPU 152 furthermore displays information and/or links corresponding to the determined related items.
  • the CPU 152 preferably carries out the operations of Fig. 7.
  • the CPU 152 identifies the related items as items that bear a relationship to the object
  • the related items include any schedules on which points or data values of the object CONT OBJ appear, any existing reports for the CONT OBJ, and new reports for the object CONT OBJ.
  • the related items include static elements listed in the properties of the object configuration data for the object CONT OBJ, as well as dynamic elements such as schedules generated in subsystems of devices 102, 104, 106 and other non-property elements involving CONT OBJ.
  • step 424 the CPU 152 determines whether the user has provided, via the user input 158, a new selection of an object from within the primary work area 220. In other words, the CPU 152 determines whether it has received an input identifying a new contextual object.
  • the primary work area 220 which displays information about CUR OBJ, displays selectable icons or links to further information regarding CUR OBJ, such as "child" objects or related files. Such objects are typically defined or referenced in the configuration properties of the data object CUR OBJ.
  • the CPU 152 determines whether it has received an input selecting a link or icon from the primary work area 220.
  • step 426 the CPU 152 sets CONT OBJ equal to the new selection.
  • step 426 the CPU 152 returns to steps 420 and 422 in order to update the contextual work area 225 and related items area 230 accordingly.
  • the CPU 152 determines whether any of the related items has been selected from the related items area 230. If not, then the CPU 152 proceeds to step 436 to determine whether other inputs have been received. If so, however, then the CPU 152 proceeds to step 430 to process the selected related item.
  • step 430 the CPU 152 first determines whether a selected toggle/button is in the "on" state.
  • the selected toggle/button is a user graphic control (see toggle graphic control 238 of Figs. 2 and 2A) that allows the user to dictate whether information pertaining to a selected related item is to be displayed in the primary work area 220 or the secondary work area 235. If the CPU 152 determines that the toggle/button 238 is in the on-state, then the CPU 152 proceeds to step 432. If not, then the CPU 152 proceeds to step 434.
  • step 432 the CPU 152 populates the secondary work area 235 with information related to the selected related object. Similar to the generation of displays of other objects, such as CUR OBJ and CONT OBJ, the CPU 152 in step 432 performs the operation of Figs. 5 and 6 to populate the secondary work area 235. After step 432, the CPU 152 proceeds to step 436.
  • step 434 the CPU 152 sets CUR OBJ to the selected related item.
  • the CPU 152 thereafter returns to step 416 to display information related to the newly defined CUR OBJ within the primary work area 220.
  • the CPU 152 thereafter proceeds as described above after step 416.
  • the CPU 152 determines whether any user input data has been received (via input device 158) in any of the work areas, such as the primary work area 220, the secondary work area 235, and the contextual work area 225. If so, then the CPU 152 proceeds to step 438. If not, then the CPU 152 proceeds to step 440. In step 438, the CPU 152 processes the input. If the input relates to a commanded point, such as a temperature set point, a camera control, or other command value, then the CPU 152 provides the value to the data image 124 2 . It will be appreciated that the data server 124 1 thereafter causes the command value to be forwarded to the appropriate device of the devices 102, 104 and 106 via the interface 129.
  • a commanded point such as a temperature set point, a camera control, or other command value
  • If the input relates to a schedule or report, then the CPU 152 causes the relevant data record relating to said schedule or report to be updated. Other methods of processing input data relating to building automation systems may readily be implemented. After processing the input in step 438, the CPU 152 proceeds to step 440.
  • step 440 the CPU 152 determines whether any updates have been received to the information displayed in any of the areas 220, 225 and 235.
  • the data displayed in the areas 220, 225, and 235 may comprise or relate to active sensors, cameras or controlled devices (of the devices 102, 104, 106), the outputs of such devices can change.
  • Such changes propagate to the database 124 2 , as well as to various reporting functions and other software programs (e.g. software extensions 126 1 ...126 p and/or control manager 128).
  • the CPU 152 obtains notification of any changes to values displayed (or affecting the appearance of) any of the display elements in the area 220, 225 and 235. If the CPU 152 determines that a relevant value has changed, then the CPU 152 executes step 442. If not, then the CPU 152 proceeds directly to step 444.
  • step 442 the CPU 152 refreshes or updates the display elements in the areas 220, 225 and 235 using any changed values. To this end, the CPU 152 may simply perform the operations of Fig. 5 and 6 for each of the areas 220, 225 and 235. After step 442, the CPU 152 proceeds to step 444.
  • the CPU 152 determines whether any other inputs received via the user input device 158 need to be processed.
  • the display screen 200 may suitably include multiple other selectable features that enhance the utility of the display screen 200.
  • Such features may take a plurality of formats.
  • Such features can include selectable tabs (e.g. 224 1 and 224 2 ) for each of the work areas 220, 225 and 235, which allow the user to select from among a plurality of display formats (corresponding to appropriate snap-in tools) for the relevant object in each work area.
  • Other examples can include inputs generating a contextual work area, not shown, for the secondary work area 235. If such other inputs have been received, then the CPU 152 proceeds to step 446 to process the inputs in an appropriate manner.
  • step 448 If no other inputs have been received, then the CPU 152 proceeds directly to step 448.
  • step 448 the CPU 152 determines whether the user has selected a new selection from the selection area 215. If so, then the CPU 152 proceeds to step 414 to set CUR OBJ to the new value. As discussed above, the CPU 152 after step 414 executes step 416 and proceeds accordingly. If, however, the CPU 152 determines in step 448 that it has not received a new selection from the selection area 215, then the CPU 152 returns to step 424 and proceeds accordingly.
Among the operations of Figs. 4A and 4B are the determination of the related items for a selected object within the primary work area 220, and the display of selectable icons or other information corresponding thereto, in the related items area 230. Fig. 7 shows an exemplary set of operations that may be performed by the CPU 152 in determining the related items corresponding to the object CONT OBJ, represented as step 422 of Figs. 4A and 4B.

In step 705, the CPU 152 determines all the schedule files that include any points identified with or associated with the CONT OBJ. In particular, each data object record 1105, 1110 may have associated with it points or active system data values. Such points may include control points, such as set points or other command values, that are adjusted according to one or more schedules defined by a schedule file. Such schedule files can include the schedule files 1015 in the database 1242, or can include schedules executing separately on one or more controller devices of the devices 102, 104 and 106. In step 705, the CPU 152 determines all schedules associated with the point properties of the selected contextual object CONT OBJ and displays information, such as a selectable icon, representative of those schedules in the related items area 230.
To make this determination, the CPU 152 reviews a stored output file (in the data model 1242) that lists all schedule files associated with each point of the system 50. In this embodiment, the stored output file is a table of points. The table entry for each point lists a set of schedule files that correspond to the point. Fig. 9 shows an exemplary relationship finder output file. In that output file, which may suitably be stored in the primary memory 160 of the system 100, three points TEMP SP 03, TEMP SP 08 and CHILL PWR are listed as table entries. For each of the table entries there is a set of schedule identifiers, some subset of the identifiers SCH. 1, SCH. 2, SCH. 3, and/or SCH. 4, that affect the point or value in question. Thus, in step 705, the CPU 152 determines the schedules listed in the stored output file for all point properties of the CONT OBJ. The CPU 152 furthermore causes the related items area 230 to include selectable icons that link to such schedules.
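The step 705 look-up against such a table can be sketched as follows. The mapping mirrors the structure of Fig. 9, but the particular point-to-schedule assignments and the Python representation are assumptions made purely for illustration:

```python
# Illustrative relationship finder output file (cf. Fig. 9): each table
# entry maps a point to the set of schedule identifiers that affect it.
# Which schedules belong to which point is assumed here, not specified.
relationship_finder_output = {
    "TEMP SP 03": {"SCH. 1", "SCH. 2"},
    "TEMP SP 08": {"SCH. 2", "SCH. 3"},
    "CHILL PWR": {"SCH. 4"},
}

def schedules_for_object(point_properties):
    """Collect every schedule listed for any point property of the
    selected contextual object (per step 705)."""
    related = set()
    for point in point_properties:
        # Points absent from the table contribute no schedules.
        related |= relationship_finder_output.get(point, set())
    return related
```

A selectable icon would then be generated in the related items area for each identifier returned.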
In step 710, the CPU 152 generates a link to new standard reports for the CONT OBJ based on the object type of the CONT OBJ. In particular, each object type has a set of predetermined standard reports, for example, trend reports, that can be maintained by the system 100. In step 710, the CPU 152 identifies the standard reports for the CONT OBJ based on its object type. The CPU 152 furthermore causes the related items area 230 to include selectable icons that link to new standard reports. The user may subsequently select such icons to set up new reports involving the object CONT OBJ.

In step 715, the CPU 152 causes a selectable icon linking to a new, unidentified report selection to be displayed in the related items area 230. The user may select this icon to generate some other report involving the object CONT OBJ.
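Step 710 amounts to a look-up from object type to its predetermined standard reports. In the following sketch, the type names and report names are assumptions for illustration, not values defined by the system:

```python
# Assumed mapping from object type to its predetermined standard
# reports (step 710); the keys and report titles are illustrative.
STANDARD_REPORTS = {
    "temperature_sensor": ["Trend Report", "Out-of-Range Report"],
    "chiller": ["Trend Report", "Energy Report"],
}

def standard_report_links(object_type):
    """Return the standard-report links to display in the related
    items area 230 for an object of the given type."""
    # Types with no predetermined reports yield no links.
    return STANDARD_REPORTS.get(object_type, [])
```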
In step 720, the CPU 152 identifies and displays information for any static related items for the object CONT OBJ. Specifically, the CPU 152 determines static related items based on the data record (e.g. 1105, 1110) for the CONT OBJ in the database 1242. To this end, the CPU 152 reviews a predetermined set of properties in the configuration data (i.e. the related items property <RI> of the object records 1105, 1110) for the CONT OBJ to identify any static related items. These static properties can, for example, include a link to a video feed from a video camera, existing reports, and the like. The CPU 152 further causes the related items area 230 to include selectable icons that link to such static related items.

The operations of Fig. 7 thus show how the CPU 152 may generate selectable related items for an object already displayed in the other areas 220 and/or 225. These related items include those defined as properties for the object, as well as those dynamically created based on system data (such as schedules) and/or to allow new objects (i.e. reports) to be generated.
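Step 720 can be sketched as a direct read of the related items property from the object's configuration record. The dictionary layout and the "RI" key below are assumptions standing in for the <RI> property of the records 1105, 1110:

```python
def static_related_items(object_record):
    """Return the statically configured related items (e.g. a camera
    feed link or an existing report) for one object record."""
    # Records without an "RI" property have no static related items.
    return list(object_record.get("RI", []))

# Hypothetical configuration record; the field names are illustrative.
room_002 = {
    "name": "Room 002",
    "RI": ["video feed: CAM 12", "report: Room 002 comfort trend"],
}
```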
Fig. 8 shows the operations that may be performed by the CPU 152 (or another processing unit) to generate a relationship finder output file that relates data points to schedules. As noted above, the relationship finder output file in this embodiment is a table of points. For each point in the table, a list of schedules that involve or relate to the point is stored (see Fig. 9). The CPU 152 performs the steps of Fig. 8 to generate such a table.
In step 805, the CPU 152 selects a schedule file from the schedule files 1015 that has not yet been processed. Alternatively, or in addition, the CPU 152 obtains one or more schedules maintained on controllers of the devices 102, 104 and 106. In step 810, the CPU 152 selects a point that is identified within the selected schedule, and has not yet been processed. In step 815, the CPU 152 determines whether a table entry exists for the selected point. If so, the CPU 152 proceeds directly to step 825. If not, then the CPU 152 in step 820 creates a table entry for the selected point, and then proceeds to step 825.

In step 825, the CPU 152 stores an identification of the selected schedule in the table entry for the selected point. In this manner, any subsequent look-up of the selected point (per step 705 of Fig. 7) in the table will identify, among other things, the particular schedule currently being processed. After step 825, the CPU 152 proceeds to step 830.

In step 830, the CPU 152 determines whether all points in the selected schedule have been processed. If so, then the processing of the selected schedule is complete and the CPU 152 proceeds to step 835. If not, then the processing of the selected schedule is not complete and the CPU 152 returns to step 810 to select and process another point of the selected schedule. In step 835, the CPU 152 determines whether all of the schedules maintained by the system 100 have been processed. If so, then the CPU 152 has completed the relationship finder output file and ends the process. If not, then the CPU 152 returns to step 805 to select and process another schedule.
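The loop of steps 805 through 835 amounts to inverting a schedule-to-points listing into the point-to-schedules table of Fig. 9. A minimal sketch, assuming each schedule is represented simply as a list of the points it commands (an assumption about the representation, not the system's actual file format):

```python
def build_relationship_finder_table(schedules):
    """Invert schedule -> points into point -> schedules (cf. Fig. 8).

    `schedules` maps a schedule identifier to the list of points the
    schedule commands; this flat representation is illustrative.
    """
    table = {}
    for sched_id, points in schedules.items():  # steps 805 / 835
        for point in points:                    # steps 810 / 830
            # Steps 815/820: create the table entry if it is absent.
            entry = table.setdefault(point, set())
            entry.add(sched_id)                 # step 825
    return table
```

Any subsequent look-up of a point in the returned table (per step 705) then yields every schedule that involves that point.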

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Testing And Monitoring For Control Systems (AREA)

Abstract

An arrangement for use in a building automation system includes a memory, a display, a user input device and a processing circuit. The arrangement also includes at least a first building automation system device. The memory stores programming instructions, and a plurality of data records corresponding to building automation system objects. The processing circuit is operably coupled to the memory, the input device, and the display. The processing circuit is configured, when executing the programming instructions, to obtain a data record corresponding to the first building automation system object, and display information regarding the first building automation system object using the data record in a first portion of the display. The processing circuit is further configured to add information corresponding to at least one property in the data record to a set of related objects. The processing circuit is also configured to identify system data associated with the first building automation system object wherein the system data is other than the data record, and to add information regarding the identified system data to the set of related objects. The processing circuit is further configured to display information representative of the set of related objects in a second area of the display. The first building automation system device is operably coupled to the processing circuit. The processing circuit is further configured to provide signals altering the operation of the first building automation system device.

Description

MANAGEMENT SYSTEM WITH VERSATILE DISPLAY
Field of the Invention
The present invention relates to building automation systems, and more particularly, to user interfaces for building automation systems that allow for monitoring and control of building automation system devices.
Background of the Invention
Building control systems encompass a wide variety of systems that aid in the monitoring and control of various aspects of building operation. Building control systems include security systems, fire or life safety systems, lighting systems, and comfort systems, sometimes referred to as heating, ventilation, and air conditioning ("HVAC") systems. In large commercial and industrial facilities, such systems have an extensive number of elements and are highly automated.
The elements of building control systems are widely dispersed throughout a facility. For example, a comfort or HVAC system typically includes large numbers of temperature sensors and ventilation damper controls, as well as other elements, which are located in virtually every area of a facility. Similarly, a security system may have intrusion detection, motion sensors and alarm actuators dispersed throughout an entire building or campus. Fire safety systems also include widely dispersed devices in the form of smoke alarms, pull stations and controllers. To achieve efficient and effective building control system operation, there is a need to monitor the operation of, and often communicate with, the various dispersed elements of a building control system. To this end, building control systems typically have one or more centralized control stations in which data from the system may be monitored, and in which various aspects of system operation may be controlled and/or monitored. The control station typically includes a computer having processing equipment, data storage equipment, and a user interface. To allow for monitoring and control of the dispersed control system elements, building control systems often employ multi-level communication networks to communicate operational and/or alarm information between operating elements, such as sensors and actuators, and the centralized control station.
In older systems, control stations provided building control data in a cumbersome, text- oriented format. The control station presented data in a way that typically required intimate system knowledge to interpret and understand. As building control systems become more complex, it has become increasingly advantageous to present building system data in a more intuitive manner. To address this issue, control stations of building control systems now generally employ graphical user interfaces that combine text information with representative graphics to illustrate the context of the system data being displayed. Graphics can include graphically displayed maps, floor plans, diagrams of complex equipment, and even graphic display of controlled or sensed values.
An example of the use of representative graphics may be the use of a thermometer shaped graphic to represent a temperature reading, as opposed to a simple text value. Similarly, the alarm status for a floor of a building may be represented on a graphical display of the building floor plan, as opposed to a simple text list of alarm locations.
While the use of graphics and other advanced interface features has enhanced access and monitoring of building system data, one limitation on control stations is the manner in which information about complex building operations is accessed by a technician. Maneuvering among displays of various elements of a building system control station can be a daunting task. To this end, many building systems contain thousands of points and hundreds of objects which may be communicated with, which may be monitored, and which in some cases may be controlled. Because building systems are by nature largely unique, the systems used to display and access data have historically been relatively rigid in their user interface architecture.
There is a need, therefore, for a more intuitive interface that allows for easier maneuvering among large numbers of objects and/or points in building systems.
Summary of the Invention
At least some embodiments of the present invention address the above need, as well as others, by developing an interactive interface in which, when an object is selected by a user, links to related information are automatically generated based on the object, and made available for selection by the user, preferably presented in a multi-area display.
A first embodiment is an arrangement for use in a building automation system that includes a memory, a display, a user input device and a processing circuit. The arrangement also includes at least a first building automation system device. The memory stores programming instructions, and a plurality of data records corresponding to building automation system objects. The processing circuit is operably coupled to the memory, the input device, and the display. The processing circuit is configured, when executing the programming instructions, to obtain a data record corresponding to the first building automation system object, and display information regarding the first building automation system object using the data record in a first portion of the display. The processing circuit is further configured to add information corresponding to at least one property in the data record to a set of related objects. The processing circuit is also configured to identify system data associated with the first building automation system object wherein the system data is other than the data record, and to add information regarding the identified system data to the set of related objects. The processing circuit is further configured to display information representative of the set of related objects in a second area of the display.
The first building automation system device is operably coupled to the processing circuit. The processing circuit is further configured to provide signals altering the operation of the first building automation system device.
The above described features and embodiments, as well as others, will become more readily apparent to those of ordinary skill in the art by reference to the following detailed description and accompanying drawings.
Brief Description of the Drawings
Fig. 1 is a functional block diagram of the global management system in use with BAS (HVAC) devices, fire safety devices, and security system devices;
Fig. 1A is a functional block diagram of a computing device that carries out the functions of the global management system of Fig. 1, as well as a graphical user interface according to embodiments of the invention;
Fig. 2 is a representative block diagram of a screen display generated by the graphical user interface function carried out by the computing device of Fig. 1A;
Fig. 2A shows a screen capture of the exemplary screen display of Fig. 2 populated by data for a specific building system;
Fig. 3 illustrates a flow diagram of an exemplary set of operations executed by a processing unit as part of the graphical user interface function;
Figs. 4A and 4B illustrate in further detail a first embodiment of the operations of Fig. 3 executed by a processing unit as part of the graphic user interface function;
Fig. 5 illustrates a flow diagram of an exemplary set of operations that may be executed by a processing unit to generate a display element within the operations of Figs. 4A, 4B;
Fig. 6 illustrates in further detail a set of operations performed by a processing unit to carry out one of the operations of Fig. 5 in a first embodiment;
Fig. 7 illustrates a flow diagram of an exemplary set of operations that may be executed by a processing unit to generate related item information for display in accordance with the operations of Figs. 4A and 4B;
Fig. 8 illustrates a flow diagram of an exemplary set of operations that may be executed by a processing unit to determine relationships between objects within a building automation system;
Fig. 9 illustrates a representative diagram of an exemplary output file generated by the process of Fig. 8;
Fig. 10 illustrates a representative diagram of an exemplary data image stored in memory of the system of Figs. 1 and 1A;
Fig. 11 illustrates a representative diagram of an exemplary configuration database of the system of Figs. 1 and 1A; and
Fig. 12 illustrates a representative diagram of elements of an application framework for a user interface function according to an embodiment of the invention.
Detailed Description
Fig. 1 illustrates functional blocks of a first embodiment of a global management system 100 implemented in connection with comfort system (HVAC) devices 102, life safety devices 104, and security system devices 106. Together, the management system 100, comfort system devices 102, life safety devices 104, and security system devices 106 form a comprehensive building system 50. It will be appreciated that the comfort system devices 102 preferably cooperate to form a building comfort system, the life safety system devices 104 cooperate to form a building life safety system, and the security system devices 106 cooperate to form a building security system. The management system 100 allows for such diverse systems and devices to be managed, monitored and controlled from a single point, and in a uniform manner.
In Fig. 1, the management system 100 is shown in functional blocks that are representative of executed software programs and stored data. A block diagram of a computer device 150 implementing the management system 100 of Fig. 1 is shown in Fig. 1A, and discussed further below.
As shown in Fig. 1, the management system 100 includes an application unit or application framework 110, a core engine 112, and a data repository 114. Fig. 1A shows the application framework 110, the core engine 112 and data repository 114 disposed on a single computer workstation. It will be appreciated, however, that any or all of the application framework 110, the core engine 112 and data repository 114 may suitably be distributed on separate computing devices.
The application framework 110 is a collection of software and associated data files that enable a client application. In this embodiment, the application framework 110 enables a system manager application that provides a user interface for monitoring, reviewing and controlling various points and devices in the system 50. The application framework 110 includes, among other things, a main executable 117, a user layout definition file 118, a set of rules 119, a common graphics control module 120, and an infrastructure interface 121. The core engine 112 includes a model/repository 124, a number of software extensions 126i ...126p, a control manager 128, and device interface stacks 129. The data repository 114 includes, among other things, a historical database 130.
Referring first to the core engine 112, the model/repository (MR) 124 includes a data server 1241 and a system database 1242. The system database 1242 includes, among other things, a data model of all data points and all (or most) devices and other objects in the system 50. In particular, as is known in the art, each value of an active building system (temperature, alarm status, humidity) can be referred to as a point or data point. In this embodiment, objects of the system 50 include anything that creates, processes or stores information regarding data points, such as physical devices (BAS controllers, field panels, sensors, actuators, cameras, etc.), and maintained data files such as control schedules, trend reports, defined system hierarchies, and the like.
Accordingly, the system database 1242 includes, among other things, current values of the various points in the system 50, and configuration information for the various objects in the system 50. The MR 124 is the mechanism by which application framework 110, as well as other applications, can access data generated by the various system devices 102, 104, and 106, and provide data (i.e. commands) to such devices.
As will be discussed below in detail, one type of objects maintained in the system database 1242 consists of hierarchy definitions that identify relationships between objects in the system. These relationships are preferably hierarchical, as will be discussed further below. In particular, it is known to organize objects in building automation systems as a hierarchy. For example, a system may define an object "floor" with multiple child objects in the form of "rooms". Each "room" object, in turn, may have several child objects such as "ventilation damper", "smoke detector", and "temperature sensor". Such hierarchy definitions among objects are conventional in nature, and may take many forms. It will be appreciated that the use of hierarchical files in the system 100 allows for technicians to define nearly any desirable hierarchy, the result of which is stored as one of the defined hierarchical files, discussed further below. In this embodiment, the MR 124 maintains files (i.e. objects) identifying different versions of hierarchies between the objects of the system, including those representative of the devices 102, 104, 106.
The software extensions 126i ...126p are sets of software services that provide core operations of the management system 100 via the model repository 124. The software extensions 126i ...126p are preferably composed in source code, compiled and linked in a manner known in the art. The software extensions 126i ...126p may suitably include a print manager, a reporting subsystem, and a status propagation manager. For example, a reporting subsystem is a system that manages the acquisition of data values from the MR 124 for the generation of various reports. Such reports may include, for example, trends for a temperature of a room or the like. The generation of reports and methods of managing the same using a data model such as the MR 124 are conventional and the details thereof are omitted for clarity of exposition. In another example, the status propagation manager is a service that propagates alarm status information, among other things, to various other data objects in the system. An example of a suitable alarm propagation system is provided in U.S. Patent Application Serial No. 12/566,891, filed September 25, 2009, which is assigned to the assignee of the present invention and is incorporated by reference herein.
The control manager 128 is another software service that enables use of system data via the MR 124. Specifically, the control manager 128 facilitates use of high level scripts to provide services within the management system 100. In other words, in contrast to the software extensions 126i ...126p, the control manager provides an execution environment for high level scripts. In particular, the control manager 128 is configured to execute software scripts that perform various services. In this embodiment, the control manager 128 executes a script to carry out the scheduling function of the management system 100. The scheduling function is used to control points in the various systems based on a time-based schedule. For example, the scheduling function may be used to command temperature set points based on the time of day, and the day of the week, within the comfort system devices 102. It will be appreciated that the scheduling function in other embodiments may simply be implemented as another software extension 126x. However, in this embodiment, the control manager 128 allows for functionality to be added to the management system 100 via scripts, as opposed to low level source code that must be compiled and linked.
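As a rough illustration of the scheduling function just described, a time-based schedule might command a temperature set point by hour of day and day of week. The names, values and representation below are illustrative assumptions, not the actual script format executed by the control manager 128:

```python
# Hypothetical weekday schedule: (start hour, set point in deg F).
# Before the first entry, and on weekends, a setback value applies.
WEEKDAY_SETPOINTS = [
    (6, 72.0),   # from 06:00, occupied set point
    (18, 65.0),  # from 18:00, night setback
]

def setpoint_for(hour, is_weekend):
    """Return the commanded set point for the given hour of day."""
    if is_weekend:
        return 65.0  # weekend setback
    value = 65.0     # before 06:00, still set back
    for start_hour, sp in WEEKDAY_SETPOINTS:
        if hour >= start_hour:
            value = sp  # latest entry whose start hour has passed wins
    return value
```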
The interface stacks 129 are a set of functional modules that act as an interface between the core 112 and the various comfort system devices 102, the various life safety devices 104, and the various security system devices 106. The comfort system devices 102 can suitably include field controllers, actuators, sensors, and other devices normally associated with HVAC systems, all of which may communicate on a suitable network, as is known in the art. The life safety system devices 104 can suitably include notification appliance circuits (NACs), NAC control panels, other controllers, and other devices normally associated with fire safety and/or life safety systems, all of which may communicate on a suitable network, as is known in the art. Similarly, the security system devices 106 can suitably include field controllers, cameras, sensors, and other devices normally associated with security systems, all of which may communicate on a suitable network, as is known in the art. One or more of the devices 102, 104 and 106 may operate on a specific network protocol, such as BACnet or LonTalk. The interface stacks 129 provide access to data within such network protocols by services within the management system 100.
Referring now to the application framework 110, in the embodiment described herein, the application framework 110 is an application and other software components that cooperate to provide a multi-area or multi-pane display window on a computer display, such as the display 156 of Fig. 1A discussed further below. The multi-pane display (see e.g. Figs. 2 and 2A) includes an object selection area 215, primary and context display areas 220, 225 for displaying information about a selected object and elements within the object, and an automatically generated "related items" area 230 that displays other objects that bear a predefined relationship to the selected object. Many or most areas have selectable links to additional information. Fig. 12 shows in further detail the application framework 110. With reference to Figs. 1 and 12, the infrastructure 121 is a software module that acts as an interface, similar to an application programming interface (API), to the various elements of the core engine 112, including the MR 124. The main executable 117 includes programming instructions that coordinate the actions of the other modules of the application framework 110. The common graphic controls 120 include graphic libraries for various objects and points. For example, the graphics controls 120 may include a thermometer graphic definition for temperature sensors, or a "speedometer" type graphic definition for pressure sensors. The graphics controls 120 are common files (or copies of common files) that are used by multiple applications, and not just the application framework 110.
The layout 118 is a file that defines the display format, such as whether graphical displays, text displays, camera images or the like are to be displayed. In the embodiment described herein, the layout 118 is defined in connection with each user's profile, or in the alternative, with an authorization level or other user selection. Thus, the appearance of the display can vary from user to user, even though such users employ the same application framework 110.
The rules 119 define how the application framework 110 populates the various areas of the display based on the defined layout 118. It will be appreciated that while the layout 118 can vary by user, the rules 119 do not. However, in contrast to the common graphic controls 120, the rules 119 are specific to the application framework 110.
Referring now to Fig. 2, there is shown a diagram of a display screen 200 generated by the application framework 110. As briefly mentioned above, the display screen 200 includes a first window 202, a second window 204 and a third window 205. The first window 202 includes a selection area 215, also referred to as a system browser, a primary work area 220, a contextual work area 225, a related items area 230, and a secondary work area 235. In this embodiment, the second window 204 includes an alarm notification area 210. Other windows may be incorporated. While outside the scope of the current invention, the alarm notification area 210 may suitably appear as, and be generated as, shown in European Patent Specification EP 1515289 B1, which is assigned to the Assignee of the present invention and is incorporated herein by reference. The third window 205 may include details of certain events, and is also outside the scope of the present disclosure.
Reference is also made to Fig. 2A, which shows an example of the display screen as populated by data from an exemplary system, with the exception of the third window 205. As shown in Fig. 2A, the alarm notification area 210 includes a plurality of icons 2101, 2102, 2103, 2104, 2105, 2106 and 2107, each indicating a type of fault or alarm, and the quantity of the given fault in the current state of the system. For example, the icon 2101 illustrates that five severe conditions exist, the icon 2102 shows that five alarm conditions exist, and the icon 2107 shows that six advisory notifications exist. In general, the user may drill down to each type of notification by selecting one of the icons 2101, 2102, 2103, 2104, 2105, 2106 and 2107. As discussed above, however, the details of the operation of the alarm notification area 210 are beyond the scope of the present disclosure.
Referring still to the exemplary screen capture of Fig. 2A, the selection area 215 includes a hierarchical list 218 of objects (e.g. objects 2181, 2182, 2183, and 2184) for a building campus. The hierarchical list 218 is based on a hierarchical definition file stored in memory, as will be discussed further below. The hierarchy logic employed in the list 218 may take a plurality of different forms. In the example of Fig. 2A, the hierarchy logic is geographical, or space-based. Accordingly, the first or highest level of the hierarchical list 218 includes buildings such as "Headquarters" and "Main Building". The next highest or second level of the hierarchical list 218 includes floors and/or large areas of the buildings, such as "Floor 1", "Floor 2", "Auditorium", and "Floor 4". The third or next highest level of the hierarchy includes rooms and/or smaller divisions of the floors/large areas of the second level. For example, the icon 2184 is a room "Room 002", which is part of the sub-list for (sometimes referred to as a "child" of) icon 2183, which represents "Floor 4". The icon 2183, moreover, is part of the sub-list for (or child of) icon 2182, which represents the object "Main Building". This particular hierarchical string illustrates that "Room 002" is a child of "Floor 4", which in turn is a child of "Main Building".
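Such a parent-child hierarchy can be sketched as follows. The dictionary representation and the traversal helper are illustrative assumptions, not the system's actual hierarchical definition file format:

```python
# Illustrative geographic hierarchy: each object maps to its children.
hierarchy = {
    "Main Building": ["Floor 1", "Floor 2", "Auditorium", "Floor 4"],
    "Floor 4": ["Room 002"],
}

def path_to(obj, tree, root):
    """Return the chain of parents from `root` down to `obj`,
    or an empty list if `obj` is not under `root`."""
    if root == obj:
        return [root]
    for child in tree.get(root, []):
        sub = path_to(obj, tree, child)
        if sub:
            return [root] + sub
    return []
```

For the example above, the path to "Room 002" runs through "Floor 4" and "Main Building", mirroring the hierarchical string traced in the text.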
Referring again generally to Figs. 2 and 2A, the user may select any object from the selection area 215. As will be discussed below in further detail, the system 100 thereafter causes the various areas 220, 225, 230 to be populated with data corresponding to the selected object. To this end, as will be discussed below, the rules 119 of the application framework 110 in conjunction with the layout 118 cooperate to define the appearance of the display elements in each of the areas 220, 225, and 230, as well as the selection area 215 and the other windows 204, 205.
In general, the primary work area 220 includes information relating directly to the selected object from the selection area 215. As shown in Fig. 2A, the icon 2184 for the Room 002 has been selected, and primary work area 220 shows a perspective floor plan 222 for the selected object "Room 002". The primary work area 220 can alternatively display textual data, drop down lists, and even a document (e.g. a pdf formatted file). As will be discussed further below in detail, the format of the data that is presented in the primary work area 220 will depend upon, among other things, the layout file 118 of the application framework.
The contextual work area 225 is an area of the display 200 that contains information regarding a specific element within the primary work area 220. For example, if the primary work area 220 includes several selectable icons or menu entries, then the contextual work area 225 is used to provide further information relating to the user selection from the primary work area 220. In Fig. 2, for example, the primary work area 220 includes selectable icons 2111 and 2112. If the user were to select the icon 2112, then the contextual work area 225 would provide further information regarding the object associated with the icon 2112. By contrast, in the example of Fig. 2A, the primary work area 220 does not include selectable icons. In such a case, the contextual work area 225 is used to simply provide more information about the Room 002. Thus, the contextual work area 225 in Fig. 2A shows the properties of the object 222 shown in the primary work area 220. As a consequence, the contextual work area 225 provides an ability to "drill down" on a selected element (e.g. elements 2111 or 2112 of Fig. 2) displayed in the primary work area 220, or, as shown in Fig. 2A, provide further information about the object's properties that cannot be shown in the graphic format of the primary work area 220.
The related items work area 230 is a portion of the display screen 200 that includes selectable icons corresponding to other "objects" of the system 100 that relate in a predetermined manner to the selected icon within the primary work area 220. For example, the related items area 230 of Fig. 2 includes selectable icons 2321, 2322, and 2323. The icons 2321, 2322, and 2323 link to objects related in some manner to the object represented by whichever of the icons 2111 or 2112 is selected. The related objects can include schedules that affect or involve the selected object, reports relating to the object, and other objects identified in properties for the selected object. Further detail regarding the generation of the related items area is provided below in connection with Figs. 4-7.
By way of example, consider that the icon 2112 has been selected in the primary work area 220 of Fig. 2. In such a case, the related items icons 2311, 2312 and 2313 comprise links to information about objects related to the object of the selected icon 2112. If, as in Fig. 2A, the primary work area 220 does not have selectable icons, then, as shown in Fig. 2A, the related items area 230 includes selectable icons for objects related to the Room_002 of the primary work area 220. In the example of Fig. 2A, the related items can include floor plan graphics for areas close to or involving the object Room 002, as well as a set of reports relating to Room 002. The secondary area 235 is an area in which information pertaining to a selected second object may be displayed. For example, the system 100 allows the user to select an object from the related items area 230. Information related to the selected related item is then displayed, in some cases, in the secondary area 235. It will be appreciated that the areas 215, 220, 225, 230 and 235 can be re-sized by the user. Thus, to at least some degree, the user may adjust the relative sizes of the various areas. Suitable graphic tools that facilitate such scalability are known in the art. Accordingly, for example, the secondary area 235 may be collapsed completely to maximize the primary work area, as shown in Fig. 2A.
It will further be appreciated that the areas 215, 220, 225, 230 and 235 also employ standard scrolling tools. In particular, to the extent that all of the information to be displayed does not fit within any of the areas 215, 220, 225, 230 and 235, standard scroll bars are employed to allow the user to maneuver to different information within the corresponding area. For example, Fig. 2A shows a standard vertical scroll bar 255 for the contextual work area 225 and a standard vertical scroll bar 260 for the related items area 230. In the conventional manner, the vertical scroll bar 255 allows the user to access further information that is currently hidden in the area 225. Similarly, the vertical scroll bar 260 allows the user to access additional related items information in the area 230. Though not shown in Fig. 2A, horizontal scroll bars may similarly be used when warranted.
Accordingly, referring again to Fig. 1, the application framework 110, when executed by a suitable computer processing circuit, facilitates user interaction through the display screen 200 of Figs. 2 and 2A. Other applications may be used to facilitate other actions by users of the system 100.
As discussed above, the application framework 110 provides for different appearances of the display elements that appear in the windows 202, 204 and 205 based on a user profile, or an authorization level of a user. The layout file 118 contains the display format specific to the user. In particular, Fig. 12 shows the application framework 110, and specifically, the layout file 118, in further detail. The layout file 118 defines a display format including a plurality of window definitions 12051, 12052, a plurality of panes or area definitions 12101, 12102 within one or more of the windows 12051, 12052, and a plurality of snap-in tools 12201...1220m employed within the window and/or area definitions.
The window definitions 12051, 12052 define the appearance of the windows, e.g.
windows 202, 204 and 205 of Fig. 2. To this end, within each window definition 1205x there may be identified one or more areas 1210x, and/or one or more snap-in tools 1220y. The area definitions 12101, 12102 define the appearance of areas within the windows defined by the definitions 12051, 12052, for example, the areas 220, 225 and 230 of Fig. 2. The snap-in tools 12201...1220m provide the actual format of the display elements within each of the areas 12101, 12102, and/or windows 12051, 12052.
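As a rough illustration, the window/area/snap-in structure of the layout file 118 might be modeled as a nested mapping. All keys and tool names below are hypothetical placeholders, not the actual layout file 118 syntax:

```python
# Hypothetical encoding of a per-user layout file: windows contain panes
# (areas), each of which lists its snap-in tools in priority order.
layout = {
    "main_window": {
        "selection_area":  ["hierarchy_tree"],
        "primary_work":    ["floor_plan_graphic", "text_summary", "property_dialogs"],
        "contextual_work": ["property_dialogs", "text_summary"],
        "related_items":   ["icon_list"],
    },
}

def snapins_for(layout, window, area):
    """Look up the ordered snap-in tool list for one pane of one window."""
    return layout.get(window, {}).get(area, [])
```

Under this sketch, a different user profile would simply carry a different `layout` mapping, yielding different tools for the same panes.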
To this end, a snap-in tool is a software script or program that generates a predetermined layout of data, menus, graphic controls, and the like. The snap-in tools 12201...1220m are configured such that, when the layout file 118 identifies a snap-in tool 1220y for display of a particular set of object data, the snap-in tool 1220y (when executed by a processor) displays the object data in a predetermined layout. In this embodiment, a first snap-in tool 12201 may be used to generate graphic floor plan views, such as that shown in the primary work area 220 of Fig. 2A. Another snap-in tool 12202 may be used to generate ordered sets of text data. Another snap-in tool 12203 may be used to generate an arrangement of dialog boxes and other interactive widgets or elements, such as those shown in the contextual work area 225 of Fig. 2A. Still another snap-in tool 1220m may be used to display video data from a camera object. It will be appreciated that the same snap-in tools 1220y may be used in multiple panes or areas 12101, 12102 and multiple windows 12051, 12052. Typically, as will be discussed below, when multiple snap-in tools are defined for an area, only one generates the display, while the others can be accessed by selectable tabs, such as tabs 2241, 2242 and 2243 of Fig. 2.
In general, the snap-in tools 12201...1220m in this embodiment are configured to generate displays by accessing various properties of objects. The snap-in tools 12201...1220m are modular library tools that may be implemented by any running instance of the application framework 110. As discussed above, the layout file 118 may identify that a particular data "object" is to be displayed using a particular snap-in tool 1220x. In such a case, the snap-in tool 1220x generates the display by accessing predetermined sets of properties of the data object, and then using that data to construct details of the display. Further detail regarding the operation of the snap-in tools 12201...1220m is provided below in connection with Figs. 4A, 4B, 5 and 6.
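The modular, library-like character of the snap-in tools can be sketched as a registry of render functions, each of which consumes a dictionary of object properties. The decorator, names, and output strings below are illustrative assumptions only:

```python
# Sketch of modular snap-in tools as a registry of render functions.
SNAPINS = {}

def snapin(name):
    """Register a render function under a snap-in tool name."""
    def register(fn):
        SNAPINS[name] = fn
        return fn
    return register

@snapin("text_summary")
def render_text(obj):
    # Render a one-line text description from the object's properties.
    return f"{obj['id']} ({obj['type']})"

@snapin("floor_plan_graphic")
def render_graphic(obj):
    # Render a placeholder graphic element from the object's graphic property.
    return f"<graphic src={obj['graphic']}>"
```

Any pane whose layout entry names "text_summary" would then dispatch through `SNAPINS["text_summary"]`, regardless of which window it sits in — mirroring the reuse of the same snap-in across multiple panes.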
As discussed above, the elements of the management system 100 of Fig. 1 are shown as functional units. Fig. 1A shows an exemplary embodiment of the management system 100 implemented in a commercially available general purpose computer 150. In this embodiment, the management system 100 includes a central processing unit and associated support circuitry (CPU) 152, a plurality of network input/output (I/O) units 1541...154r, a display 156, a user input device 158, a primary memory 160, and a secondary storage device 162. The CPU 152 is configured to execute programming instructions stored in the memory 160 to carry out various operations as described herein. In accordance with such instructions, the CPU 152 is configured to receive data inputs from the user input 158 and to generate display screens to be displayed on the display 156. The CPU 152 is also configured to communicate with external devices, such as the system devices 102, 104, 106, via one or more of the network I/O units 1541...154r. To facilitate the above-described functionality, as well as other functionality, the CPU 152 is operably connected in a conventional manner to each of the network I/O units 1541...154r, the display 156, the user input 158, the primary memory 160, and the secondary storage 162 via a system bus. In this embodiment, the primary memory 160 stores programming instructions for the application framework 110, the extensions 1261...126p, the control manager 128, and software elements of the stack interface 129. The primary memory 160 also stores the elements of the model/repository, which includes the data server 1241 and the database 1242. To this end, the primary memory 160 may include volatile memory such as random access memory, as well as other types of readable and writeable memory.
The database 1242 is a database containing active system values and data, as well as configuration data for the elements of the system. Fig. 10 shows a functional diagram of contents of the database 1242. For example, the database 1242 includes present (or latest) values 1005 for the various points of the system 50, including values (e.g. temperatures, set points, fan speed, etc.) of the devices 102, 104 and 106. The database 1242 also includes alarms or notifications 1010 and their corresponding statuses. The database 1242 further includes schedule files 1015 identifying control schedules. As discussed above, the schedules define a set of timed command values to be communicated to various elements in the systems 102, 104 and 106. In a simple example, a schedule may command the comfort system to employ one set of temperature set points during work hours, and another set of temperature set points on evenings and weekends. In the embodiment described herein, the schedules 1015 are in the form of scripts that are implemented by the control manager software 128. However, it will be appreciated that in other embodiments, the schedules 1015 may be implemented as a software component and set of corresponding schedule data files.
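The work-hours/weekend schedule example can be expressed as a small script of the kind the control manager 128 might implement. The function name, hour convention, and set point values are assumptions for illustration:

```python
# Illustrative schedule script: choose a temperature set point from the
# weekday and hour, mirroring the "work hours vs. evenings and weekends"
# example. Weekday convention assumed: 0=Monday .. 6=Sunday; hour: 0..23.
def scheduled_setpoint(weekday, hour, occupied=21.0, unoccupied=16.0):
    is_workday = weekday < 5          # Monday through Friday
    is_work_hours = 8 <= hour < 18    # assumed occupied window
    return occupied if (is_workday and is_work_hours) else unoccupied
```

The timed command values communicated to the comfort system devices 102 would then be the successive outputs of such a function as time advances.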
The database 1242 further stores user profile information 1020. The user profile information 1020 includes, for each authorized user, a specific layout file that is to be used as the layout file 118 when that user runs the application framework 110. The database 1242 also includes hierarchical files 1025 defining one or more sets of hierarchical relationships between data objects within the system. In particular, as discussed above, the "objects" of the system 50 may be defined within a hierarchy. These "objects" can include various devices 102, 104, 106
(discussed further below), the schedule files 1015, one or more stored reports, and various rooms, floors and buildings in which the system 50 is located. Accordingly, hierarchical files 1025 can identify hierarchical relationships between buildings, devices, and even schedules and reports.
The database 1242 also includes object configuration data 1030. Object configuration data 1030 includes a data record for each object of the system. Thus, for example, each room, floor, building, sensor, camera and field controller has its own object configuration data record. Fig. 11 shows in further detail a representative diagram of the object configuration data 1030 maintained in the database 1242.
As shown in Fig. 11, the object configuration data 1030 includes a set of data object records 1105 associated with each of the devices 102, 104 and 106 in the system 50, and a set of data object records 1110 associated with each room, space and building in which the system 50 is located, among other things. The object configuration data 1030 may further include object records associated with other logical entities, such as reports, not shown.
Each object record 1105, 1110 includes a set of predetermined properties, including unique identifying information <ID> and an object type <OBJECT_TYPE>. Several objects may be of the same object type. For example, an object type may be "sensor", "controller", "floor", "room", "hierarchy" etc. The number and type of properties of each object record 1105, 1110 depends on the object type. Each object record 1105, 1110 may also contain one or more point properties <POINT> identifying a point value corresponding to the object. As is known in the art, "points" are used to describe operating values of the system, such as temperatures at particular sensors, set points for various actuators or air handling units, and the like. Each object may be associated with one or more points. The same point may be associated with a plurality of objects. Thus, for example, the point T 32, which may represent a temperature sensed by a sensor TEMP S 02 that is located within ROOM 002, may be a point property for both the object record 1105 for TEMP S 02 and the object record 1110 for ROOM 002. (See Fig. 11).
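The sharing of a point between a sensor record and a room record, as in the T 32 example above, can be sketched as follows (the record layout and field names are hypothetical, not the actual record 1105/1110 format):

```python
# Hypothetical object records: the point "T 32" appears in both the sensor
# record and the room record, as in the Fig. 11 example.
records = {
    "TEMP S 02": {"type": "sensor", "points": ["T 32"]},
    "ROOM 002":  {"type": "room",   "points": ["T 32"], "graphic": "room002.svg"},
}

def objects_for_point(records, point):
    """All object IDs whose records reference the given point, sorted."""
    return sorted(oid for oid, rec in records.items()
                  if point in rec.get("points", []))
```

A query for the point "T 32" would thus return both the sensor object and the room object, reflecting the many-to-many relationship between points and objects.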
In addition to identification information, object type information, and point properties, the object records 1105, 1110 may suitably have many other properties, including references to a graphic element, a pdf document, manufacturing information, maintenance information, and the like.
The object records 1105, 1110 further include a related items property <RI> that identifies related items for the objects represented by records 1105, 1110. The related items for an object can include a reference or link to a video image associated with the object (i.e. from a video camera in the room ROOM 002), a trend report associated with the object, and the like.
Referring again to Fig. 1, the system database 1242 is operably accessed and maintained by the data server 1241. More specifically, the data server 1241 is a software program that, when executed by the CPU 152, manages the data in the system database 1242, including the management of the service that obtains system data from the devices 102, 104 and 106, and communicates changes or commands to the devices 102, 104 and 106.
The secondary storage 162, which may suitably be non-volatile storage, stores the system historical data 130 and other reference information, such as a pdf document library 168.
Referring again to Fig. 1A, the document library 168 may suitably be a set of pdf files that are associated with various of the devices 102, 104 and 106. It will be appreciated that the secondary storage 162 may also store other files typical of a building control system, such as, for example, the historical database 130.
In general, the CPU 152 executes the operations of the software elements 110, 1241, 1261...126p, 128 and 129 to perform the operations of the management system 100 as described herein. Specifically, the CPU 152 performs the operations of Figs. 3, 4A, 4B, 5, 6 and 7, as discussed further below, to carry out a system manager application of the application framework 110. The CPU 152 may also suitably perform the operations of Fig. 8, discussed further below. Before discussion of the specific operations of the system 100 of Figs. 1 and 1A, the general operation of the system 50 will be described. In the general operation of the system 50, the comfort system devices 102 operate to provide heating, ventilation and air conditioning to the building in accordance with normal practices using any suitable conventional techniques.
Similarly, the life safety devices 104 operate to provide monitoring for, and if necessary, notification of, a hazardous condition such as fire, smoke or toxic gas release. Finally, the security system devices 106 operate to provide motion sensing, video surveillance information, door position monitoring, and the like in accordance with normal security system practices.
In general, the CPU 152 employs the data server 1241 to exchange data with at least some of the devices 102, 104 and 106 (directly or indirectly) via the interface stack software 129 and network I/O units 1541...154r. The CPU 152 maintains the system database 1242 based on, among other things, the received data from the devices 102, 104 and 106. In another aspect of operation, the CPU 152 conveys command values from various elements in the management system 100 to the various devices 102, 104, 106, also via the interface software 129 and network I/O units 1541...154r. For example, by executing various scheduling scripts 1015 via the control manager 128, the CPU 152 may communicate scheduled commands to various devices 102, 104 and 106 via the interface stack software 129 and the network I/O units 1541...154r.
Fig. 3 shows in general terms a process flow of an exemplary set of operations of the CPU 152 executing the user interface application framework 110. Figs. 4A, 4B, 5, 6 and 7, discussed further below, show in further detail how the operations of Fig. 3 may be carried out.
Referring to Fig. 3, in step 305, the CPU 152 receives a user input signal identifying a first of the plurality of building automation system objects to be displayed. For example, the CPU 152 may receive a selection from a plurality of selectable objects in the selection area 215 of Fig. 2 or Fig. 2A. Thereafter, the CPU 152 obtains a first set of object data regarding the selected building automation system object from one or more data records associated with the first building automation system object. To this end, the CPU 152 may suitably obtain configuration data for the selected object (i.e. data records 1105, 1110 of Fig. 11) from the object configuration data 1030 of the database 1242, and obtain system values 1005 (see Fig. 10) related to the selected object from the database 1242. The first set of object data may suitably include selectable links to other objects, such as child objects of the selected object. For example, if the selected object is a floor of a building, the first set of object data may include a graphic for that floor, and links to temperature sensors located on that floor. Once the first set of object data is obtained, the CPU 152 thereafter proceeds to step 315.
In step 315, the CPU 152 displays (via the display 156) information regarding the first set of object data in the primary work area 220 of the display. By way of example, the CPU 152 may suitably display a graphic depicting or representing the object, or values associated with the object, in the primary work area 220 of Fig. 2. For example, in the example of Fig. 2A, the CPU 152 displays a graphic of the selected object Room 002.
In addition, in step 320, the CPU 152 reviews system data, including dynamic data, to determine a set of related objects corresponding to one or more elements of the first set of object data. For example, the CPU 152 may review schedule files or other files to determine whether any schedules implicate or relate logically to the selected building automation system object, or some child object of the selected building automation system object. The CPU 152 may execute step 320 contemporaneous to, before, or after step 315.
After step 320, the CPU 152 executes step 325. In step 325, the CPU 152 displays information regarding the set of related objects in another portion of the display, while the information in the primary work area remains displayed. For example, referring to Fig. 2A, the CPU 152 may display selectable icons for the related objects in the related items area 230 while the graphic 222 remains displayed in the primary work area 220. The above steps provide a functionality in which the user not only receives information relevant to a selected building automation system object, but further receives icons identifying selectable additional objects. Related items may be identified from the related items property <RI> of the object configuration data 1030 (see Fig. 11). Moreover, the related items may be dynamically determined based on system data, such as schedules, reports or the like. This provides the user with more options to navigate intuitively through the system 100.
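The combination of statically configured related items (the <RI> property) and dynamically determined ones (e.g. schedules that implicate the object) can be sketched as below. The record shapes and names are assumptions for illustration:

```python
# Condensed sketch of the Fig. 3 related-items derivation: static links
# from the object's <RI> property, plus any schedules whose targets
# mention the object. Field names are hypothetical.
def related_items(obj_id, records, schedules):
    rec = records.get(obj_id, {})
    related = list(rec.get("RI", []))                  # static <RI> links
    related += [name for name, targets in schedules.items()
                if obj_id in targets]                  # dynamic: schedules
    return related

records = {"ROOM 002": {"RI": ["camera_feed", "trend_report"]}}
schedules = {"weekend_setback": ["ROOM 002", "ROOM 003"]}
```

The resulting list would drive the selectable icons placed in the related items area 230 while the primary work area remains displayed.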
Figs. 4A and 4B show in further detail an exemplary embodiment of the operations of Fig. 3. Initially, the CPU 152 in step 402 receives a request to start the application framework 110 via the user input 158. The request input includes user login information, such as name and password or other authentication information. The CPU 152 determines whether the user authorization value is valid. If not, the CPU 152 terminates the operations of Fig. 4A, or returns to step 402 to prompt for a new request. However, if the CPU 152 determines that the user authorization level corresponds to the required authorization value, then the CPU 152 proceeds to step 405.
In step 405, the CPU 152 instantiates the application framework 110 as an operating execution sequence. To this end, the CPU 152 obtains the layout file 118 for the user from the corresponding user profile 1020. This user profile 1020 for the user identifies the snap-ins 12201...1220m for the various windows 202, 204, 205, and for the various areas 215, 220, 225, 230 and 235 of the window 202. As discussed above, different user profiles may identify different snap-in tools for each of the windows and areas, and the same snap-in tools may be identified for multiple of the areas. The CPU 152 then continues (via the main executable 107) in step 406.
In step 406, the CPU 152 receives, via the user input 158, a request to review a specific system or hierarchy file 1025 (see Fig. 10). To this end, the CPU 152 obtains the selected hierarchy file 1025 from the database 1242. For example, the user may request to retrieve a geographical hierarchical file of a specific building campus, such as the one illustrated in the selection area 215 of Fig. 2A. Accordingly, in one example, a first of the hierarchy files 1025 may define a geographical hierarchy, such as the hierarchy shown in the selection area 215 of Fig. 2A. A second of the hierarchy files 1025 may define a mechanical hierarchy, for example, corresponding to the flow path of chilled or heated air or water through the system 50. In such a mechanical hierarchy, for example, a "building" object may be associated with several "child" air handling unit objects. Each of the air handling units may in turn be associated with a plurality of "child" objects for ventilation dampers. Other hierarchies may be defined for any given building automation system, and would be a matter of design choice. In this embodiment, the user may in some cases select from a plurality of defined hierarchy files 1025 that include any or all of the data objects, including but not limited to those associated with building spaces and automation system devices 102, 104 and 106.
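Two hierarchy files over the same set of objects — one geographical, one mechanical — can be sketched as simple parent-to-children mappings. The object names below are illustrative only:

```python
# Two hypothetical hierarchy files 1025 over overlapping object IDs: one
# geographical (spaces), one mechanical (air-flow path).
geographical = {"Main Building": ["Floor 1", "Floor 4"],
                "Floor 4": ["Room 002"]}
mechanical   = {"Main Building": ["AHU 1", "AHU 2"],
                "AHU 1": ["Damper 1A", "Damper 1B"]}

def children(hierarchy, obj):
    """Child objects of `obj` under the chosen hierarchy, if any."""
    return hierarchy.get(obj, [])
```

The same "Main Building" object thus has different children depending on which hierarchy file the user selects in step 406.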
It will be appreciated that the user in this embodiment is limited to a set of hierarchies based on the user's authorization level.
Referring again to Fig. 4A, the CPU 152 in step 408 generates a default object selection value for generation of the initial display. The default object selection value can identify one of the objects of the selected hierarchy. The default value may suitably comprise the highest object in the selected hierarchy. In other cases, the CPU 152 may set the default object selection value to null, in which case no object information is displayed until a user choice is made. In either event, the selection value CUR_OBJ is set equal to the generated default object selection value.
In step 410, the CPU 152 displays, in the selection area 215 of the display screen 200 on the display 156, the hierarchy defined by the selected hierarchy file 1025. Using standard graphic user interface techniques, the CPU 152 also enables each of the objects identified on the displayed hierarchy to be selectable using the user input 158. For example, the CPU 152 allows the user to select any of the list items of the hierarchical list 218 of Fig. 2A. It will be appreciated that to execute step 410, the CPU 152 employs the rules 119 to cause the selection area 215 to display the hierarchical information in the hierarchical file 1025 using the identified snap-in 1220x for the selection area 215 as defined in the layout file 118 (see Figs. 2, 10 and 12).
Thereafter in step 412, the CPU 152 determines whether it has received an input from the user input device 158 identifying a new user selection in the selection area 215. If so, then the CPU 152 proceeds to step 414. If not, then the CPU 152 proceeds directly to step 416. In step 414, the CPU 152 sets CUR_OBJ equal to the user selection. After step 414, the CPU 152 proceeds to step 416.
In step 416, the CPU 152 populates the primary work area 220 of the display screen 200 based on the current object CUR_OBJ, the layout file 118, and system data from the database 1242. As discussed above, the graphic and/or text information presented in the work area 220 can have several different types of appearance, ranging from a graphic with or without interactive elements, sets of values with dialog boxes for changing the values, selectable text icons within selectable drop down menus, simple text lists or tables, pdf image documents, and/or live video feed. The display format is determined by the layout file 118 (see Fig. 12), and in particular, the one or more snap-in tools 1220x identified within the layout file for the primary work area 220. While the snap-in tools 1220x define the format of the display, such as graphical, textual, video, and/or arrangements of pull-down menus and dialog boxes, the content or values within the display element depend on configuration data and/or system data (from the database 1242).
In particular, Fig. 5 shows in further detail a set of operations employed by the CPU 152 to generate the display element in any window or work area, such as in the primary work area 220. As will be noted below, the CPU 152 employs the same set of operations to generate the display element of the contextual work area 225 and the secondary work area 235, discussed further below.
Referring briefly to Fig. 5, the CPU 152 in step 505 obtains the object selection to be displayed, OBJ, and an identification of the window/area, PANE, in which it is to be displayed. In the case of step 416, the object selection OBJ will be set equal to CUR_OBJ, and the value PANE is equal to the primary work area 220. In the case of step 422, discussed further below, the object selection OBJ in step 505 will be set equal to CONT_OBJ, and PANE will be set to the contextual work area 225. In the case of step 432, also discussed further below, the object selection OBJ in step 505 will be set equal to the selected related object, and PANE will be set equal to the secondary work area 235.
In any event, in step 510, the CPU 152 references the layout file 118 to determine all snap-in tools 1220x...1220y that are identified for the current area being generated or populated, PANE. Consider an example wherein PANE is the primary work area 220, and wherein the area definition 12101 of the layout file 118 of Fig. 12 corresponds to the primary work area 220. In such a case, the CPU 152 would determine (based on the definition 12101) that snap-in tools 12201, 12202, and 12203 are to be used to generate the display elements of the primary work area 220.
Once the snap-in tools 1220x identified in the area definition 1210x corresponding to the area PANE have been identified, the CPU 152 then processes each of the identified snap-in tools in steps 515 to 525.
In step 515, the CPU 152 determines, for one of the identified snap-in tools 1220x, whether the object OBJ has properties or data appropriate for the snap-in tool definition. To this end, it will be appreciated that all objects do not have properties or data appropriate for all display formats. For example, a snap-in tool 1220x may be a video image output. If the object OBJ is a room object for a room having a video camera, then the CPU 152 would determine that the snap-in tool 1220x was appropriate for the object OBJ. If, however, the object OBJ is a temperature sensor, then the CPU 152 may determine that the snap-in tool 1220x is not appropriate for the object OBJ. In most cases, step 515 may be carried out by determining whether the configuration data record for the object OBJ has properties expected by the snap-in tool.
Referring generally to step 515, if the CPU 152 determines that the object OBJ has properties appropriate for the selected snap-in tool 1220x, then the CPU 152 proceeds to step 520. If not, then the CPU 152 proceeds to step 525.
In step 520, the CPU 152 adds the snap-in tool 1220x to the list of snap-in tools to be executed in generating the display area for PANE. Thereafter, the CPU 152 proceeds to step 525. In step 525, the CPU 152 determines whether all snap-in tools identified in the layout file 118 corresponding to the area PANE have been processed. If so, then the CPU 152 proceeds to step 530. If not, then the CPU 152 returns to step 515 to process another of the snap-in tools identified in step 510.
In step 530, the CPU 152 generates a display element for the area PANE (e.g. primary work area 220, contextual work area 225 or related items area 230) using the primary snap-in tool on the generated list (step 520) of snap-in tools. In particular, although multiple snap-in tools may be on the generated list for the display in the area PANE, the CPU 152 only displays one of the snap-in tools. To this end, the layout file 118 further includes a prioritization of snap-in tools appropriate for each window or area. As a default, the snap-in tool on the generated list of snap-in tools with the highest priority constitutes the primary snap-in tool. In step 530, the CPU 152 employs the primary snap-in tool and the properties of the object OBJ to generate a display element in the area PANE. Further detail regarding the generation of a display element as per step 530 is provided below in connection with Fig. 6. After step 530, the CPU 152 proceeds to step 535. In step 535, the CPU 152 causes a selectable tab (e.g. 2241 and 2242 of Figs. 2 and 2A) to be displayed in the area PANE (e.g. area 220) for all other identified snap-in tools on the list generated in step 520. Such tabs (e.g. 2241, 2242) allow the user to select another display format for the same display area PANE and the same object OBJ. For example, while the primary work area 220 of Fig. 2A shows a floor plan graphic 222 of the object Room_002, the user may select the icon 2242 to display a text description of the object Room_002.
To this end, at any time such a tab is selected, the CPU 152 sets the primary snap-in tool equal to that corresponding to the selected tab, and executes steps 530 and 535 again. In this manner, the user is made aware of available, alternative display formats for properties and/or values of the object OBJ in the relevant area/window PANE.
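The selection logic of steps 510-535 — keep only the snap-in tools whose required properties the object actually has, render with the highest-priority survivor, and offer the rest as tabs — can be sketched as follows. The REQUIRES table and tool names are assumptions, not the actual snap-in definitions:

```python
# Sketch of the Fig. 5 selection logic. REQUIRES maps each hypothetical
# snap-in name to the object property it needs to render.
REQUIRES = {"floor_plan_graphic": "graphic",
            "video_feed": "camera",
            "text_summary": "id"}

def choose_snapins(pane_snapins, obj):
    """Filter the pane's (priority-ordered) snap-ins by applicability;
    return (primary snap-in, tab snap-ins)."""
    usable = [s for s in pane_snapins if REQUIRES.get(s) in obj]
    return (usable[0], usable[1:]) if usable else (None, [])
```

For a room object with a graphic but no camera, the video snap-in drops out, the floor plan renders as primary, and the text summary becomes a selectable tab.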
Fig. 6 shows an exemplary operation of the CPU 152 in generating a display element for an object OBJ using a snap-in tool 1220x. The operations of Fig. 6 are generalized for all snap-in tools.
Referring now to Fig. 6, in step 605, the CPU 152 executing a snap-in tool 1220x obtains the object data record (e.g. data record 1105 or 1110 of Fig. 11) from the database 1242 for the object OBJ. In step 610, the CPU 152 retrieves from the object data record any static properties to be used in generating the display. To this end, each snap-in tool 1220x references a set of property types used by various data objects in the system. In step 610, the CPU 152 retrieves the property values of OBJ for the property types required by the particular snap-in tool 1220x. For example, if the snap-in tool requires the associated graphic (if any) property of the object OBJ, then in step 610, the CPU 152 obtains whatever value, link or other information is stored in the <graphic> property of the data record 1105 or 1110 (see Fig. 11) for the selected object OBJ.
In step 615, the CPU 152 retrieves from the database 1242 any dynamic operating data required by the snap-in tool 1220x that corresponds to the selected object OBJ. For example, if the object OBJ were a temperature sensor, then the CPU 152 in step 615 may suitably retrieve from the database 1242 (via a reference within the object data record) the temperature value sensed by the corresponding physical sensor.
Once the CPU 152 has all of the information for the display element (i.e. the graphics and/or text to be displayed in the primary work area 220) that will be generated by the snap-in 1220x, the CPU 152 in step 620 generates the actual display element using the retrieved configuration data and retrieved operating data. As discussed above, the generated display element may be a graphic, a text table, an interactive set of text and pull-down menus, or any typical interactive screen elements.
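The flow of Fig. 6 (steps 605 through 620) may be sketched, purely for illustration, as a function that merges static configuration properties with dynamic operating data. The record layout, property names and point references below are assumptions, not part of the described system.

```python
# Hypothetical sketch of Fig. 6: a snap-in tool retrieves the static
# properties it requires from the object data record (step 610), resolves
# any dynamic operating data referenced by the record (step 615), and
# assembles both into a display element (step 620).

def generate_display_element(object_record, required_props, live_values):
    # Step 610: retrieve static configuration properties (e.g. <graphic>)
    static = {p: object_record.get(p) for p in required_props}
    # Step 615: resolve dynamic operating data via references in the record
    dynamic = {ref: live_values[ref]
               for ref in object_record.get("points", [])
               if ref in live_values}
    # Step 620: generate the actual display element from both sources
    return {"static": static, "dynamic": dynamic}

element = generate_display_element(
    {"name": "Room_002", "graphic": "floor2.svg", "points": ["TEMP_01"]},
    ["name", "graphic"],
    {"TEMP_01": 21.5},   # e.g. a sensed temperature from the data image
)
# element["dynamic"]["TEMP_01"] == 21.5
```

The same sketch applies to any pane, since the snap-in tool, not the pane, determines which property types and operating data are retrieved.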
Accordingly, the steps of Figs. 5 and 6 present one way in which step 416 of Fig. 4A (as well as any steps involving populating an area of the display 200) may be carried out. The generation/population of other display areas is carried out in an analogous manner.
Returning again to Fig. 4A, once the display element for the primary work area 220 has been generated, the CPU 152 proceeds to step 418. In step 418, the CPU 152 determines a default object selection for the contextual work area 225. In particular, as discussed above, the contextual work area 225 provides select additional information regarding the selected object CUR OBJ from the selection area 215. Moreover, most displays in the primary work area 220 include additional links or selectable icons to other objects, such as "child" objects, or contained objects, of the selected object. The CPU 152 causes such "child" objects to be selectable within the primary work area 220. Icons 2111 and 2112 of Fig. 2 illustrate examples of such selectable object icons. In such cases, the user may select additional information (drill down) by selecting the object icon or link within the primary work area 220.
Consider an example wherein the selected object CUR OBJ is a room, and the display element in the primary work area 220 includes selectable icons or text boxes identifying sensors and actuators within that room. The user may select one of the sensors or actuators within the primary work area 220 in order to obtain additional information regarding the sensor in the contextual work area 225. This object selected within the primary work area 220 by the user is referred to herein as the contextual object, CONT OBJ.
However, prior to any user selection of a contextual object, the CPU 152 determines a default contextual object CONT OBJ to display in the contextual work area 225. Accordingly, in step 418, the CPU 152 determines this default contextual object CONT OBJ based on the selected object CUR OBJ and the snap-in program 1220x used for the generation of the display element in the primary work area 220.
After step 418, the CPU 152 executes step 420. In step 420, the CPU 152 populates the contextual work area 225 of the display screen 200 based on the current contextual object CONT OBJ, the layout file 118, and system data from the database 1242. Similar to the primary work area 220, the graphic and/or text information presented in the contextual work area 225 can have several different types of appearance, ranging from a graphic with or without interactive elements, sets of values with dialog boxes for changing the values, text with selectable drop-down menus, simple text lists or tables, pdf image documents, and live video feed. As in step 416, the display element type within the contextual work area 225 depends on the object selected (CONT OBJ) and the layout file 118 obtained from the user profile 1020. The content and values within the display element in the contextual work area 225 depend on configuration data and/or system data (from the data image 1242). To generate the contextual work area display in step 420, the CPU 152 carries out the operations of Figs. 5 and 6, similar to step 416.
After step 420, the CPU 152 executes step 422. In step 422, the CPU 152 determines the related items for the related items area 230 based on the contextual object that has been selected from within the primary work area 220, in other words, the object CONT OBJ. The CPU 152 furthermore displays information and/or links corresponding to the determined related items. To this end, the CPU 152 preferably carries out the operations of Fig. 7. However, as a general matter, the CPU 152 identifies the related items as items that bear a relationship to the object CONT OBJ.
In the embodiment described herein, the related items include any schedules on which points or data values of the object CONT OBJ appear, any existing reports for the CONT OBJ, and new reports for the object CONT OBJ. To this end, the related items include static elements listed in the properties of the object configuration data for the object CONT OBJ, as well as dynamic elements such as schedules generated in subsystems of devices 102, 104, 106 and other non-property elements involving CONT OBJ.
After step 422, the CPU 152 proceeds to step 424. In step 424, the CPU 152 determines whether the user has provided, via the user input 158, a new selection of an object from within the primary work area 220. In other words, the CPU 152 determines whether it has received an input identifying a new contextual object. As discussed above, the primary work area 220, which displays information about CUR OBJ, displays selectable icons or links to further information regarding CUR OBJ, such as "child" objects or related files. Such objects are typically defined or referenced in the configuration properties of the data object CUR OBJ. In step 424, the CPU 152 determines whether it has received an input selecting a link or icon from the primary work area 220.
If so, then the CPU 152 proceeds to step 426. If, however, the CPU 152 does not detect a new input selecting a link or icon from the primary work area 220, then the CPU 152 proceeds to step 428, discussed further below. In step 426, the CPU 152 sets CONT OBJ equal to the new selection. After step 426, the CPU 152 returns to steps 420 and 422 in order to update the contextual work area 225 and related items area 230 accordingly.
Referring to step 428, the CPU 152 determines whether any of the related items has been selected from the related items area 230. If not, then the CPU 152 proceeds to step 436 to determine whether other inputs have been received. If so, however, then the CPU 152 proceeds to step 430 to process the selected related item.
In step 430, the CPU 152 first determines whether a selected toggle/button is in the "on" state. The selected toggle/button is a user graphic control (see toggle graphic control 238 of Figs. 2 and 2A) that allows the user to dictate whether information pertaining to a selected related item is to be displayed in the primary work area 220 or the secondary work area 235. If the CPU 152 determines that the toggle/button 238 is in the on-state, then the CPU 152 proceeds to step 432. If not, then the CPU 152 proceeds to step 434.
In step 432, the CPU 152 populates the secondary work area 235 with information related to the selected related object. Similar to the generation of displays of other objects, such as CUR OBJ and CONT OBJ, the CPU 152 in step 432 performs the operations of Figs. 5 and 6 to populate the secondary work area 235. After step 432, the CPU 152 proceeds to step 436.
By contrast, in step 434, the CPU 152 sets CUR OBJ to the selected related item. The CPU 152 thereafter returns to step 416 to display information related to the newly defined CUR OBJ within the primary work area 220. The CPU 152 thereafter proceeds as described above after step 416.
Referring now to step 436, the CPU 152 determines whether any user input data has been received (via input device 158) in any of the work areas, such as the primary work area 220, the secondary work area 235, and the contextual work area 225. If so, then the CPU 152 proceeds to step 438. If not, then the CPU 152 proceeds to step 440. In step 438, the CPU 152 processes the input. If the input relates to a commanded point, such as a temperature set point, a camera control, or other command value, then the CPU 152 provides the value to the data image 1242. It will be appreciated that the data server 1241 thereafter causes the command value to be forwarded to the appropriate device of the devices 102, 104 and 106 via the interface 129. If the input relates to some other function, such as a schedule or report, the CPU 152 causes the relevant data record relating to the schedule or report to be updated. Other methods of processing input data relating to building automation systems may readily be implemented. After processing the input in step 438, the CPU 152 proceeds to step 440.
In step 440, the CPU 152 determines whether any updates have been received to the information displayed in any of the areas 220, 225 and 235. In particular, because some of the data displayed in the areas 220, 225 and 235 may comprise or relate to active sensors, cameras or controlled devices (of the devices 102, 104, 106), the outputs of such devices can change. Such changes propagate to the database 1242, as well as to various reporting functions and other software programs (e.g. software extensions 1261...126p and/or control manager 128). The CPU 152 obtains notification of any changes to values displayed in (or affecting the appearance of) any of the display elements in the areas 220, 225 and 235. If the CPU 152 determines that a relevant value has changed, then the CPU 152 executes step 442. If not, then the CPU 152 proceeds directly to step 444.
In step 442, the CPU 152 refreshes or updates the display elements in the areas 220, 225 and 235 using any changed values. To this end, the CPU 152 may simply perform the operations of Figs. 5 and 6 for each of the areas 220, 225 and 235. After step 442, the CPU 152 proceeds to step 444.
In step 444, the CPU 152 determines whether any other inputs received via the user input device 158 need to be processed. To this end, it will be appreciated that the display screen 200 may suitably include multiple other selectable features that enhance the utility of the display screen 200. Such features may take a plurality of formats. Such features can include selectable tabs (e.g. 2241 and 2242) for each of the work areas 220, 225 and 235, which allow the user to select from among a plurality of display formats (corresponding to appropriate snap-in tools) for the relevant object in each work area. Other examples can include inputs generating a contextual work area, not shown, for the secondary work area 235. If such other inputs have been received, then the CPU 152 proceeds to step 446 to process the inputs in an appropriate manner. The CPU 152 thereafter proceeds to step 448. If no other inputs have been received, then the CPU 152 proceeds directly to step 448.
In step 448, the CPU 152 determines whether the user has selected a new selection from the selection area 215. If so, then the CPU 152 proceeds to step 414 to set CUR OBJ to the new value. As discussed above, the CPU 152 after step 414 executes step 416 and proceeds accordingly. If, however, the CPU 152 determines in step 448 that it has not received a new selection from the selection area 215, then the CPU 152 returns to step 424 and proceeds accordingly.
As discussed above, one of the features of the operations of Figs. 4A and 4B is the determination of the related items for a selected object within the primary work area 220, and the display of selectable icons or other information corresponding thereto, in the related items area 230. Fig. 7 shows an exemplary set of operations that may be performed by the CPU 152 in determining the related items corresponding to the object CONT OBJ, represented as step 422 of Figs. 4A and 4B.
In step 705, the CPU 152 determines all the schedule files that include any points identified with or associated with the CONT OBJ. To this end, as discussed further above in connection with Fig. 11, each data object record 1105, 1110 may have associated with it points or active system data values. Such points may include control points, such as set points or other command values, that are adjusted according to one or more schedules defined by a schedule file. The schedules can include the schedule files 1015 in the database 1242, as well as schedules executing separately on one or more controller devices of the devices 102, 104 and 106. In step 705, the CPU 152 determines all schedules associated with the point properties of the selected contextual object CONT OBJ and displays information, such as a selectable icon, representative of those schedules in the related items area 230.
To this end, the CPU 152 reviews a stored output file (in the data model 1242) that lists all schedule files associated with each point of the system 50. In the embodiment described herein, the stored output file is a table of points. The table entry for each point lists a set of schedule files that correspond to the point.
For example, Fig. 9 shows an exemplary relationship finder output file. In that output file, which may suitably be stored in the primary memory 160 of the system 100, three points TEMP SP 03, TEMP SP 08 and CHILL PWR are listed as table entries. For each of the table entries, there is a set of schedule identifiers, some subset of the identifiers SCH. 1, SCH. 2, SCH. 3, and/or SCH. 4, that affect the point or value in question.
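Purely for illustration, the relationship finder output file of Fig. 9 and the look-up of step 705 may be sketched as follows. The table keys mirror the point names of Fig. 9, but the particular schedule subsets assigned to each point are hypothetical, as are all function and variable names.

```python
# Hypothetical sketch of the Fig. 9 output file: a table keyed by point,
# each entry listing the schedule identifiers that affect that point.
OUTPUT_FILE = {
    "TEMP_SP_03": ["SCH.1", "SCH.3"],
    "TEMP_SP_08": ["SCH.2"],
    "CHILL_PWR":  ["SCH.2", "SCH.4"],
}

def schedules_for_object(point_properties, table):
    """Step 705 look-up: union of the schedules listed in the output
    file over all point properties of the selected object."""
    found = []
    for point in point_properties:
        for sched in table.get(point, []):
            if sched not in found:   # avoid duplicate entries
                found.append(sched)
    return found

# e.g. an object whose point properties are TEMP_SP_08 and CHILL_PWR
result = schedules_for_object(["TEMP_SP_08", "CHILL_PWR"], OUTPUT_FILE)
# result == ["SCH.2", "SCH.4"]
```

The point-keyed table makes the per-object look-up a constant-time operation per point, rather than a scan of every schedule file at display time.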
Referring again to Fig. 7, in step 705, the CPU 152 determines the schedules listed in the stored output file for all point properties of the CONT OBJ. The CPU 152 furthermore causes the related items area 230 to include selectable icons or links to such schedules.
Thereafter, in step 710, the CPU 152 generates links to new standard reports for the CONT OBJ based on the object type of the CONT OBJ. In particular, each object type has a set of predetermined standard reports, for example, trend reports, that can be maintained by the system 100. The CPU 152 identifies the standard reports for the CONT OBJ based on its object type. The CPU 152 furthermore causes the related items area 230 to include selectable icons or links to new standard reports. The user may subsequently select such icons to set up new reports involving the object CONT OBJ.
Thereafter, in step 715, the CPU 152 causes a selectable icon or link to a new, unidentified report selection to be displayed in the related items area 230. The user may select this icon to generate some other report involving the object CONT OBJ. In step 720, the CPU 152 identifies and displays information for any static related items for the object CONT OBJ. Specifically, the CPU 152 determines static related items based on the data record (e.g. 1105, 1110) for the object CONT OBJ in the database 1242. To this end, the CPU 152 reviews a predetermined set of properties in the configuration data (i.e. the related items property <RI> of object records 1105, 1110) for the CONT OBJ to identify any static related items. These static properties can, for example, include a link to a video feed from a video camera, existing reports, and the like. The CPU 152 further causes the related items area 230 to include selectable icons or links to such static related items.
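The overall Fig. 7 flow may be summarized, again as a non-limiting sketch, as assembling the related items area 230 from four sources: schedules touching the object's points (step 705), new standard reports for the object type (step 710), a generic new-report entry (step 715), and static related items from the object's configuration data (step 720). All field names and the tuple representation below are assumptions for illustration.

```python
# Hypothetical sketch of Fig. 7 as a whole: assemble the related items
# for CONT OBJ from dynamic sources (schedules, report templates) and
# static sources (the <RI> property of the object's data record).

def related_items(cont_obj, point_schedule_table, standard_reports):
    items = []
    # Step 705: schedules listed for the object's point properties
    for point in cont_obj.get("points", []):
        for sched in point_schedule_table.get(point, []):
            if ("schedule", sched) not in items:
                items.append(("schedule", sched))
    # Step 710: new standard reports for the object's type
    for report in standard_reports.get(cont_obj["type"], []):
        items.append(("new_report", report))
    # Step 715: generic new, unidentified report selection
    items.append(("new_report", "other"))
    # Step 720: static related items from the <RI> property
    for ri in cont_obj.get("RI", []):
        items.append(("static", ri))
    return items

items = related_items(
    {"type": "room", "points": ["TEMP_SP_03"], "RI": ["camera_feed_12"]},
    {"TEMP_SP_03": ["SCH.1"]},
    {"room": ["trend_report"]},
)
# [("schedule", "SCH.1"), ("new_report", "trend_report"),
#  ("new_report", "other"), ("static", "camera_feed_12")]
```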
Thus, the operations of Fig. 7 show how the CPU 152 may generate selectable related items for an object already displayed in other areas 220 and/or 225. These related items include those defined as properties for the object, as well as those dynamically created based on system data (such as schedules) and/or to allow new objects (i.e. reports) to be generated.
Fig. 8, as discussed above, shows the operations that may be performed by the CPU 152 (or another processing unit) to generate a relationship finder output file that relates data points to schedules. As discussed above, the relationship finder output file in this embodiment is a table of points. For each point in the table, a list of schedules that involve or relate to the point is stored. (See Fig. 9). The CPU 152 performs the steps of Fig. 8 to generate such a table.
In step 805, the CPU 152 selects a schedule file from the schedule files 1015 that has not yet been processed. Alternatively, or in addition, the CPU 152 obtains one or more schedules maintained on controllers of the devices 102, 104 and 106. In step 810, the CPU 152 selects a point that is identified within the selected schedule and has not yet been processed. In step 815, the CPU 152 determines whether a table entry exists for the selected point. If so, the CPU 152 proceeds directly to step 825. If not, then the CPU 152 in step 820 creates a table entry for the selected point, and then proceeds to step 825. In step 825, the CPU 152 stores an identification of the selected schedule in the table entry for the selected point. In this manner, any subsequent look-up of the selected point (per step 705 of Fig. 7) on the table will identify, among other things, the particular schedule currently being processed. After step 825, the CPU 152 proceeds to step 830.
In step 830, the CPU 152 determines whether all points in the selected schedule have been processed. If so, then the processing of the selected schedule is complete and the CPU 152 proceeds to step 835. If not, then the processing of the selected schedule is not complete and the CPU 152 returns to step 810 to select and process another point of the selected schedule.
In step 835, the CPU 152 determines whether all of the schedules maintained by the system 100 have been processed. If so, then the CPU 152 has completed the relationship finder output file and ends the process. If not, then the CPU 152 returns to step 805 to select and process another schedule.
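The table-building process of Fig. 8 (steps 805 through 835) amounts to inverting the schedule files into the point-keyed table of Fig. 9. The following sketch is hypothetical; the schedule representation and all names are assumptions for illustration only.

```python
# Hypothetical sketch of Fig. 8: for each schedule (steps 805/835) and
# each point it references (steps 810/830), create the point's table
# entry if needed (steps 815/820) and record the schedule identifier in
# it (step 825), yielding the relationship finder output file of Fig. 9.

def build_relationship_table(schedules):
    """schedules: mapping of schedule identifier -> points it references."""
    table = {}
    for sched_id, points in schedules.items():
        for point in points:
            entry = table.setdefault(point, [])  # create entry if absent
            if sched_id not in entry:
                entry.append(sched_id)           # store schedule id
    return table

table = build_relationship_table({
    "SCH.1": ["TEMP_SP_03"],
    "SCH.2": ["TEMP_SP_08", "CHILL_PWR"],
})
# table["CHILL_PWR"] == ["SCH.2"]; table["TEMP_SP_03"] == ["SCH.1"]
```

Because the inversion runs ahead of time, the step 705 look-up at display time never has to open and scan the schedule files themselves.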
It will be appreciated that the above-described embodiments are merely exemplary, and that those of ordinary skill in the art may readily develop their own implementations and modifications that incorporate the principles of the present invention and fall within the spirit and scope thereof.

Claims

What is claimed is:
1. An arrangement for use in a building automation system, comprising:
a memory storing programming instructions and a hierarchical definition of at least a portion of the building automation system;
a display;
a user input device;
a processing circuit operably coupled to the memory and the display, the processing circuit configured, when executing the programming instructions, to
display at least a portion of the hierarchical definition in a first portion of the display, the displayed portion comprising a plurality of building automation system objects;
receive a user input signal identifying a first of the plurality of building automation system objects;
obtain a first set of object data regarding the first building automation system object from one or more data records associated with the first building automation system object;
display information regarding the first set of object data in a second portion of the display;
review system data associated with the first set of object data to determine a set of related objects;
display information regarding the set of related objects in a third portion of the display.
2. The arrangement of claim 1, wherein the first set of object data includes an element corresponding to a second building system object that is related to the first building system object.
3. The arrangement of claim 2, wherein the processing circuit is configured, when executing the programming instructions, to review system data associated with the first set of object data to determine the set of related objects by reviewing system data associated with the second building system object.
4. The arrangement of claim 3, wherein the memory further stores schedules for a plurality of building automation system devices, wherein the processing circuit is further configured, when executing the programming instructions, to:
review the schedules to determine a set of schedules associated with the second building automation system object;
add information corresponding to each schedule of the set of schedules to the determined set of related objects.
5. The arrangement of claim 3, wherein the processing circuit is further configured, when executing the programming instructions, to:
obtain a type identifier for the second building automation system object;
identify a set of standard reports corresponding to the type identifier; and
add the identified set of standard reports to the determined set of related objects.
6. The arrangement of claim 1, wherein the displayed information regarding the first set of object data in the second portion of the display includes interactive elements; and wherein the processing circuit is further configured, when executing the programming instructions, to: receive a second user input relating to an operational parameter of a building automation system;
provide a command value to a building automation system device based on the operational parameter.
7. The arrangement of claim 1, wherein the processing circuit is further configured, when executing the programming instructions, to:
receive operational data relating to a first building automation system from a building automation system device, and
provide the first set of object data in the second portion of the display such that the first set of object data includes information representative of the operational data.
8. The arrangement of claim 7, wherein the operational data includes a sensor value.
9. The arrangement of claim 1, wherein:
the memory further comprises a plurality of display format data elements, each of the display format data elements defining a format for displaying object information; and
the processing circuit is further configured, when executing the programming instructions, to:
identify at least a first display format data element from the plurality of format data elements; and display information regarding the first set of object data in a second portion of the display using the first display format data element.
10. The arrangement of claim 9, wherein the processing circuit is further configured, when executing the programming instructions, to identify the first display format data element based on configuration information associated with a user profile record.
11. The arrangement of claim 9, wherein the processing circuit is further configured, when executing the programming instructions, to:
obtain the first set of object data based on object data types identified in the first display format data element.
12. The arrangement of claim 9, wherein each of the display format data elements includes information on displaying object data based on object data type, such that the display format data elements may be employed by the processing circuit to display data for any of a plurality of objects having corresponding object data types.
13. The arrangement of claim 2, wherein the processing circuit is further configured, when executing the programming instructions, to:
obtain information regarding a set of schedules associated with the second building automation system object;
add information corresponding to each schedule of the set of schedules to the determined set of related objects.
14. The arrangement of claim 13, wherein the processing circuit is further configured, when executing the programming instructions, to:
obtain a data record for the second building automation system object, the data record including a plurality of properties for the second building automation system object;
add information corresponding to at least one property in the data record to the determined set of related objects.
15. The arrangement of claim 1, wherein the processing circuit is further configured, when executing the programming instructions,
to obtain a data record associated with the first set of object data, the data record including a plurality of properties of at least one object;
add information corresponding to at least one property in the data record to the determined set of related objects;
identify system data associated with the at least one object wherein the system data is other than the data record;
add information regarding the identified system data to the determined set of related objects.
16. The arrangement of claim 15, wherein the identified system data comprises at least one building automation system schedule associated with the at least one object.
17. An arrangement for use in a building automation system, comprising:
a memory storing programming instructions and a plurality of data records corresponding to building automation system objects;
a display; a user input device;
a processing circuit operably coupled to the memory and the display, the processing circuit configured, when executing the programming instructions, to
obtain a data record corresponding to the first building automation system object; display information regarding the first building automation system object using the data record in a first portion of the display;
add information corresponding to at least one property in the data record to a set of related objects;
identify system data associated with the first building automation system object wherein the system data is other than the data record;
add information regarding the identified system data to the set of related objects;
display information representative of the set of related objects in a second area of the display; and
at least a first building automation system device operably coupled to the processing circuit, and wherein the processing circuit is further configured to provide signals altering the operation of the first building automation system device.
18. The arrangement of claim 17, wherein the identified system data comprises at least one building automation system schedule associated with the first building automation system object.
19. The arrangement of claim 17, wherein the processing circuit is further configured, when executing the programming instructions, to: receive operational data relating to a first building automation system from a building automation system device, and
display information regarding the received operational data in the first portion of the display.
20. The arrangement of claim 19, wherein the display information regarding the first building automation system object includes a sensor value generated by the first building automation system object.
PCT/US2011/054141 2011-09-30 2011-09-30 Management system with versatile display Ceased WO2013048427A1 (en)

Priority Applications (10)

Application Number Priority Date Filing Date Title
CN201180073781.1A CN103827758B (en) 2011-09-30 2011-09-30 A kind of for the equipment in building automation system
PCT/US2011/054141 WO2013048427A1 (en) 2011-09-30 2011-09-30 Management system with versatile display
US13/538,275 US8933930B2 (en) 2011-09-30 2012-06-29 Navigation and filtering with layers and depths for building automation graphics
US13/537,975 US9542059B2 (en) 2011-09-30 2012-06-29 Graphical symbol animation with evaluations for building automation graphics
US13/538,073 US8854202B2 (en) 2011-09-30 2012-06-29 Unified display of alarm configurations based on event enrollment objects
US13/537,911 US20130086066A1 (en) 2011-09-30 2012-06-29 Automated discovery and generation of hierarchies for building automation and control network objects
US13/538,242 US9519393B2 (en) 2011-09-30 2012-06-29 Management system user interface for comparative trend view
US13/538,182 US9170702B2 (en) 2011-09-30 2012-06-29 Management system user interface in a building automation system
PCT/US2012/057063 WO2013049031A1 (en) 2011-09-30 2012-09-25 Unified display of alarm configurations based on event enrollment objects
PCT/US2012/057240 WO2013049138A1 (en) 2011-09-30 2012-09-26 Management system user interface in a building automation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2011/054141 WO2013048427A1 (en) 2011-09-30 2011-09-30 Management system with versatile display

Related Child Applications (5)

Application Number Title Priority Date Filing Date
US13/537,975 Continuation-In-Part US9542059B2 (en) 2011-09-30 2012-06-29 Graphical symbol animation with evaluations for building automation graphics
US13/538,275 Continuation-In-Part US8933930B2 (en) 2011-09-30 2012-06-29 Navigation and filtering with layers and depths for building automation graphics
US13/538,073 Continuation-In-Part US8854202B2 (en) 2011-09-30 2012-06-29 Unified display of alarm configurations based on event enrollment objects
US13/538,182 Continuation-In-Part US9170702B2 (en) 2011-09-30 2012-06-29 Management system user interface in a building automation system
US13/538,242 Continuation-In-Part US9519393B2 (en) 2011-09-30 2012-06-29 Management system user interface for comparative trend view

Publications (1)

Publication Number Publication Date
WO2013048427A1 true WO2013048427A1 (en) 2013-04-04

Family

ID=44789636

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2011/054141 Ceased WO2013048427A1 (en) 2011-09-30 2011-09-30 Management system with versatile display

Country Status (2)

Country Link
CN (1) CN103827758B (en)
WO (1) WO2013048427A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105095564B (en) * 2015-06-18 2018-05-08 沈阳恩派工程技术咨询有限公司 data processing method and device based on building information model
GB201711478D0 (en) * 2016-10-12 2017-08-30 Qcic Ltd Building control systems
AU2020402997B2 (en) * 2019-12-10 2023-11-30 Honeywell International Inc. Hierachical building performance dashboard with key performance indicators alongside relevant service cases

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1515289B1 (en) 2003-09-15 2008-03-05 Siemens Aktiengesellschaft User interface for a control station
WO2008040455A1 (en) * 2006-10-06 2008-04-10 Tac Ab Data structure & associated method for automation control system management
EP2073086A1 (en) * 2007-12-20 2009-06-24 Tac AB Method for generating documentation for a building control system
EP2246759A2 (en) * 2009-04-27 2010-11-03 Fisher-Rosemount Systems, Inc. Configuring animations and events for operator interface displays in a process control system
EP2343642A1 (en) * 2009-12-18 2011-07-13 Schneider Electric Buildings AB User interface panel

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6208905B1 (en) * 1991-12-20 2001-03-27 Honeywell International Inc. System and method for controlling conditions in a space
US5572438A (en) * 1995-01-05 1996-11-05 Teco Energy Management Services Engery management and building automation system
US6021403A (en) * 1996-07-19 2000-02-01 Microsoft Corporation Intelligent user assistance facility
CN1223428A (en) * 1997-07-17 1999-07-21 兰迪斯及斯特法有限公司 Method and apparatus for monitoring and controlling real-time information in building automation system
US7313819B2 (en) * 2001-07-20 2007-12-25 Intel Corporation Automated establishment of addressability of a network device for a target network environment
US8055787B2 (en) * 2004-09-10 2011-11-08 Invensys Systems, Inc. System and method for managing industrial process control data streams over network links
US20070282993A1 (en) * 2006-06-02 2007-12-06 Teletrol Systems Inc. Distribution of system status information using a web feed
EP2206041A4 (en) * 2007-10-01 2011-02-16 Iconics Inc Visualization of process control data

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180321644A1 (en) * 2017-05-02 2018-11-08 Siemens Industry, Inc. Smart replay in management systems
US10768587B2 (en) * 2017-05-02 2020-09-08 Siemens Industry, Inc. Smart replay in management systems

Also Published As

Publication number Publication date
CN103827758B (en) 2016-10-05
CN103827758A (en) 2014-05-28

Similar Documents

Publication Publication Date Title
EP2574999B1 (en) Management system using function abstraction for output generation
US11868104B2 (en) Dashboard and button/tile system for an interface
US8193917B2 (en) Arrangement for the propagation of alarm information in a building automation system that includes one or more applications that access building system data via a monitoring and control system
US8933930B2 (en) Navigation and filtering with layers and depths for building automation graphics
US10955801B2 (en) HVAC information display system
US10728053B2 (en) System and method for remote monitoring and controlling of building automation devices
US9274684B2 (en) Hierarchical navigation with related objects
US9946233B2 (en) Apparatus and methods for providing building automation system data updates to a web client
US9519393B2 (en) Management system user interface for comparative trend view
US9542059B2 (en) Graphical symbol animation with evaluations for building automation graphics
US10019129B2 (en) Identifying related items associated with devices in a building automation system based on a coverage area
US8854202B2 (en) Unified display of alarm configurations based on event enrollment objects
WO2013048427A1 (en) Management system with versatile display
US20240134366A1 (en) Building management system with intelligent fault visualization
US20240194054A1 (en) Building management system with intelligent visualization for fire suppression, fire prevention, and security integration

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11768250

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11768250

Country of ref document: EP

Kind code of ref document: A1