
WO2015051816A1 - Controlling a communication session between a local user and a remote user of a process control system - Google Patents

Controlling a communication session between a local user and a remote user of a process control system

Info

Publication number
WO2015051816A1
WO2015051816A1 (PCT/EP2013/070777, EP2013070777W)
Authority
WO
WIPO (PCT)
Prior art keywords
communication
participants
process control
media item
session
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2013/070777
Other languages
English (en)
Inventor
Elina Vartiainen
Martin Naedele
Mark Billinghurst
Seungwon Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ABB Technology AG
Original Assignee
ABB Technology AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ABB Technology AG filed Critical ABB Technology AG
Priority to PCT/EP2013/070777 priority Critical patent/WO2015051816A1/fr
Publication of WO2015051816A1 publication Critical patent/WO2015051816A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems electric
    • G05B 19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B 19/042 Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B 19/0428 Safety, monitoring
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/20 Pc systems
    • G05B 2219/24 Pc safety
    • G05B 2219/24048 Remote test, monitoring, diagnostic
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/32 Operator till task planning
    • G05B 2219/32014 Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/39 Robotics, robotics to robotics hand
    • G05B 2219/39449 Pendant, pda displaying camera images overlayed with graphics, augmented reality
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 80/00 Climate change mitigation technologies for sector-wide applications
    • Y02P 80/40 Minimising material used in manufacturing processes

Definitions

  • The present invention generally relates to process control systems. More particularly the present invention relates to a method, communication handling arrangement and computer program product for controlling a communication session between a local user and a remote user of a process control system.
  • Maintenance and repair of process equipment may thus be a critical part of production in order to ensure that the production is up and running at all times. Even a minor halt in production may waste great amounts of resources and cost large amounts of money.
  • In some situations external experts are needed to perform complex and advanced repair and testing work on the process equipment. In those cases, maintenance can become very expensive if these experts are brought to the site, which might be located on the other side of the world. The regular personnel might not have the expertise to handle the problem by themselves. Collaboration over a telephone line between the local worker and the expert is not effective enough in many cases. The expert may need to see what happens on site and may need to be able to instruct the personnel on site without the risk of any misinterpretation.
  • EP 1657610 for instance describes an asset management system where video and snapshots by a remote and a local user are provided in a communication session. The document also discusses the presentation of circuit diagrams. However, there is still room for improvement within the field.
  • The present invention is concerned with the problem of making it simpler for a remote user to give instructions to a local user of a process control system.
  • This object is according to a first aspect achieved through a method for controlling a communication session for a number of participants, where the participants comprise a local user at a location in an industrial site where a process control system is operated and a remote user at a remote location, the communication session comprising a first media item in the form of video captured by the local user, the method being performed by a communication handling arrangement and comprising: providing the video stream to the participants of the communication session for the communication of the first media item, fetching graphical representations of process control objects, which graphical representations are representations of process control objects present in the communication of the first media item, providing the graphical representations in a second media item for communication to the participants, receiving, via one of the media items, input from one of the participants concerning a process control object present in both media items, and visually informing the other participants of the registering of the input via the other media item.
  • This object is according to a second aspect achieved through a communication handling arrangement for controlling a communication session for a number of participants, where the participants comprise a local user at a location in an industrial site where a process control system is operated and a remote user at a remote location, the communication session comprising a first media item in the form of video captured by the local user, and the communication handling arrangement comprising a communication handling unit configured to: provide the video stream to the participants for the communication of the first media item, fetch graphical representations of process control objects, which graphical representations are representations of process control objects present in the communication of the first media item, provide the graphical representations in a second media item for communication to the participants, receive, via one of the media items, input from one of the participants concerning a process control object present in both media items, and visually inform the other participants of the registering of the input via the other media item.
  • This object is according to a third aspect achieved through a computer program product for controlling a communication session for a number of participants, where the participants comprise a local user at a location in an industrial site where a process control system is operated and a remote user at a remote location, the communication session comprising a first media item in the form of video captured by the local user, said computer program product being provided on a data carrier comprising computer program code which, when run in a communication handling unit of the communication handling arrangement, is configured to cause the communication handling unit to provide the video stream to the participants for the communication of the first media item, fetch graphical representations of process control objects present in the communication of the first media item, provide the graphical representations in a second media item for communication to the participants, receive, via one of the media items, input from one of the participants concerning a process control object present in both media items, and visually inform the other participants of the registering of the input via the other media item.
  • The invention has a number of advantages. It improves the ability to provide event specific and/or request dependent instructions from an expert in a fast and efficient way from virtually any place in the world without requiring travel. Thereby the downtime of a process control system can be considerably reduced, which results in cost savings. Also, safety is improved as issues can be fixed faster and more accurately.
  • Fig. 1 schematically shows an industrial plant with a process control system operating an industrial process together with a data presenting device
  • Fig. 2 schematically shows a block schematic of units inside a housing of the data presenting device
  • Fig. 3 shows a perspective view of the data presenting device in the form of the housing on a tripod
  • Fig. 4 schematically shows the data presenting device communicating with a computer of a remote user in a communication session controlled by a communication handling device of the process control system
  • Fig. 5 schematically shows the remote user with his computer on which video captured by the data presenting device is shown
  • Fig. 6 schematically shows a flow chart of method steps for mapping of images captured by the data presenting device to a model of the environment in which they are captured
  • Fig. 7 schematically shows a flow chart of method steps for associating real world data of a real world process control object to a graphical representation of the same
  • Fig. 8 schematically shows a session presentation area in which data of a communication session is displayed for a session participant
  • Fig. 9 schematically shows a flow chart of a method for controlling a communication session
  • Fig. 10 schematically shows a data carrier
  • This invention presents a way for a remote user to gather relevant data and provide instructions and directions for local engineers at a location of an industrial plant where a process control system is operated.
  • Fig. 1 schematically shows an industrial plant where a process control system 10 is provided.
  • The process control system 10 is a computerized process control system for controlling an industrial process.
  • The process can be any type of industrial process, such as electrical power generation, transmission and distribution.
  • A process may be monitored through one or more process monitoring computers, which communicate with a server handling monitoring and control of the process.
  • The process control system 10 therefore includes a number of process monitoring computers 12 and 14. These computers may here also be considered to form operator terminals and are connected to a first data bus B1. There is also a communication handling device 16 connected to this first data bus B1, which communication handling device 16 is connected to at least one wireless network WN. The communication handling device 16 is also connected to a public data communication network, which here is the internet IN.
  • The communication handling device 16 therefore has a wireless interface 17 for communicating with the wireless network WN, a computer network interface 18, for instance in the form of an Ethernet interface, for communicating with the Internet and a local interface 19, perhaps also in the form of an Ethernet interface, for communicating with other entities in the process control system 10.
  • The communication handling device 16 also has a communication handling unit 20 connected to all three interfaces 17, 18 and 19. The wireless interface 17 is thereby a first communication interface for communicating with the data presenting device 32, and the computer network interface 18 a second communication interface for communicating with the computer 51 of the remote user.
  • The communication handling unit 20 has the function of providing a gateway between the process control system and other networks, such as the Internet or the wireless network WN, if this is external to the process control system 10. However, in some variations of the invention it also has another function, and that is the function of controlling a communication session. This will be described in more detail later.
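  • As a rough illustration only, and not part of the original description, the gateway role of such a communication handling device with a wireless, an Internet and a local interface could be sketched as below. All names in the sketch (Interface, CommunicationHandlingDevice, route) are invented for this example.

```python
# Illustrative sketch only: a communication handling device with a wireless,
# an Internet and a local process-control interface that passes traffic between them.
from dataclasses import dataclass, field


@dataclass
class Interface:
    name: str                      # e.g. "wireless WN", "Internet IN", "local bus B1"
    peers: set = field(default_factory=set)


class CommunicationHandlingDevice:
    """Gateway between the process control system and external networks."""

    def __init__(self):
        self.wireless = Interface("wireless WN")   # towards the data presenting device
        self.internet = Interface("Internet IN")   # towards remote users
        self.local = Interface("local bus B1")     # towards process monitoring computers

    def route(self, source: Interface, destination: Interface, payload: bytes) -> bytes:
        """Pass a payload between two interfaces, acting as a plain gateway."""
        print(f"routing {len(payload)} bytes: {source.name} -> {destination.name}")
        return payload


if __name__ == "__main__":
    device = CommunicationHandlingDevice()
    device.route(device.wireless, device.internet, b"video frame")
```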
  • The wireless network WN may be a local network, such as a wireless local area network (WLAN). It may also be a Bluetooth network, i.e. a network with a number of interconnected Bluetooth nodes. It may also be a mobile communication network.
  • There is also a second data bus B2, and between the first and second data busses B1 and B2 there is connected a server 23 providing control and protection of the process and a database 22 where data relating to control and protection of the process is stored.
  • Data relating to control may here comprise process data such as measurements and control commands, while data relating to protection may comprise alarm and event data as well as data on which alarms and events can be generated.
  • There is also an object data server 21. The object data server 21 comprises data about all process control objects, such as blueprints, instructions and manuals regarding the process control objects. It may also comprise face plates of process control objects, which face plates may be set to access and communicate process control data from the process control object.
  • Further devices 24, 26, 28 and 30 are field devices, which are devices that are interfaces to the process being controlled.
  • A field device is typically a device comprising an interface via which measurements of the process are being made and to which control commands are given. Because of this the field devices are furthermore process control objects.
  • A first field device is a first process control object 24, a second field device is a second process control object 26 and a third field device is a third process control object 28.
  • The communication handling unit 20 is a part of a communication handling arrangement.
  • In the variation in fig. 1 this communication handling arrangement comprises the communication handling device 16, and in one realization the communication handling arrangement is the communication handling device 16.
  • In other variations the data presenting device 32 is a part of the communication handling arrangement, and the control and protection server 23, the database 22 and/or the object data server 21 may also be included in the communication handling arrangement.
  • The communication handling unit may as an alternative be provided in the data presenting device, in which case it is possible that the communication handling arrangement only comprises the data presenting device 32, although also here the communication handling arrangement may comprise the control and protection server 23, the database 22 and/or the object data server 21.
  • Fig. 2 shows a block schematic of a number of units that are provided in an exemplifying data presenting device 32.
  • The exemplifying data presenting device 32 is provided with a housing 49.
  • In the housing 49 there is provided a bus 33, and to this bus 33 there is connected an optional short range communication unit 46 or proximity sensor, a display 48, a camera 34, a recording controller 36, a program memory 39, a processor 40 and a radio communication circuit 42.
  • The radio communication circuit 42 is furthermore connected to an antenna 44, where the radio communication circuit 42 and antenna 44 are provided for communication with the wireless network WN.
  • The radio communication circuit 42 and antenna 44 together form one type of communication interface for communicating with the process control system as well as with other entities. It may for this reason be a WiFi or WLAN interface. It may also be a mobile communication interface. It should also be realized that there may be two communication interfaces in the data presenting device 32, one mobile communication interface and one WiFi interface.
  • The recording controller 36 is in turn connected to a microphone 35.
  • The recording controller 36 and microphone 35 together form a recording unit that may be used for recording sound in a location of the process control system.
  • The data presenting device 32 may also comprise sound emitting units such as speakers and earphones. It is also possible that a microphone and earphones are combined into a headset connected to the data presenting device.
  • The short range communication unit 46 may also be regarded as a type of sensor, an object sensor or proximity sensor, for sensing a process control object to be serviced. This sensor may be implemented through Near Field Communication (NFC) technology.
  • In the program memory 39 there is provided software code which, when being run by the processor 40, forms a control unit 38.
  • A local user equipped with the data presenting device 32 may move within the premises of the industrial plant. This means that the data presenting device 32 may be moved from one location to another location.
  • The data presenting device may also be considered to be a portable video recording device. It may also be placed so that it will be able to capture video images. For this reason the housing 49 may be placed on an optional tripod 50, which is schematically shown in fig. 3.
  • The camera 34 has a field of view, i.e. an area in which it detects its environment. This field of view may be changed in different ways. It may be increased through zoom out commands and it may be decreased through zoom in commands. The field of view may also be shifted or moved using various types of pan commands. In order to obtain panning, the orientation of the camera may be changed. It is of course also possible to omit the tripod and let the local user carry the data presenting device in his or her hands.
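  • Purely as an illustrative sketch, and under the assumption that the field of view can be modelled as a view angle with a centre direction, the zoom and pan commands mentioned above could look as follows; the class and method names are hypothetical.

```python
# Hypothetical sketch of a camera field of view adjusted by zoom and pan commands.
from dataclasses import dataclass


@dataclass
class FieldOfView:
    center_x: float = 0.0
    center_y: float = 0.0
    width: float = 90.0    # horizontal view angle in degrees (assumed)
    height: float = 60.0   # vertical view angle in degrees (assumed)

    def zoom_in(self, factor: float = 1.2) -> None:
        """Zooming in decreases the field of view."""
        self.width /= factor
        self.height /= factor

    def zoom_out(self, factor: float = 1.2) -> None:
        """Zooming out increases the field of view."""
        self.width *= factor
        self.height *= factor

    def pan(self, dx: float, dy: float) -> None:
        """Panning shifts the field of view, e.g. by reorienting the camera."""
        self.center_x += dx
        self.center_y += dy


if __name__ == "__main__":
    fov = FieldOfView()
    fov.zoom_in()
    fov.pan(10.0, 0.0)
    print(fov)
```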
  • The data presenting device may thus be moved from one location to another location and may be considered to be a portable video recording device.
  • The data presenting device may also be provided on wheels so that it can more easily be moved.
  • The data presenting device may also be a tablet or a smart phone carried by the local user.
  • The data presenting device 32 may connect to the communication handling device 16 via the wireless network WN.
  • The communication handling device 16 may also be accessed from outside of the process control system via the Internet IN. This allows the user of the data presenting device 32 to engage in a communication session with at least one other user, which other user may be a remote user. It is then possible that video captured by the data presenting device 32 is being communicated to other participants of the communication session and especially to the remote user.
  • The communication session will then be set up between the data presenting device 32 and a computer of the remote user via the communication handling device 16, thereby making the local user of the data presenting device 32 and the remote user into participants of the communication session.
  • A computer 51 of the remote user may communicate with the data presenting device 32 via the communication handling device 16.
  • The communication of the session thus passes through the communication handling device 16, which allows the communication handling device 16 to control the communication session.
  • The session comprises a number of channels between the session participants and the communication handling device 16.
  • The primary communication channels CHP1 and CHP2 may be unidirectional video channels in which a first media item in the form of a video stream VS is transferred from the data presenting device 32 of the local user to other session participants and in particular to the remote user computer 51.
  • The secondary communication channels CHS1 and CHS2, which may be bidirectional, are used for transferring of other media items as well as data used for controlling the session. The secondary communication channels CHS1 and CHS2 may for instance be used for conveying process graphics PG, a marker M and user inputs UI.
  • The data in the primary channels CHP1 and CHP2 may be transparently passed through the communication handling device 16, i.e. passed without the communication handling unit 20 acting on the data, while the communication handling unit 20 may actively act on at least some of the data of the secondary communication channels CHS1 and CHS2.
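  • The split between transparently relayed primary channels and actively handled secondary channels could, as an illustration only, be sketched as below. The code models one primary and one secondary channel per participant, mirroring CHP1/CHP2 and CHS1/CHS2, while the Session class and its methods are invented for this example.

```python
# Illustrative sketch: primary channels relay video untouched, while the handling
# unit may inspect or modify the data sent on the secondary channels.
from typing import Callable, Dict, List


class Session:
    def __init__(self, participants: List[str]):
        self.participants = participants
        # one primary and one secondary channel per participant
        self.primary: Dict[str, List[bytes]] = {p: [] for p in participants}
        self.secondary: Dict[str, List[dict]] = {p: [] for p in participants}

    def forward_video(self, sender: str, frame: bytes) -> None:
        """Primary channels: relay the video stream transparently to the others."""
        for p in self.participants:
            if p != sender:
                self.primary[p].append(frame)

    def handle_secondary(self, sender: str, message: dict,
                         processor: Callable[[dict], dict]) -> None:
        """Secondary channels: the handling unit may act on the data before relaying."""
        processed = processor(message)
        for p in self.participants:
            if p != sender:
                self.secondary[p].append(processed)


if __name__ == "__main__":
    session = Session(["local user", "remote user"])
    session.forward_video("local user", b"\x00" * 1024)
    session.handle_secondary("remote user", {"marker": (0.4, 0.7)}, lambda m: m)
    print(session.secondary["local user"])
```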
  • The remote user 52 may be able to obtain video images captured by the camera 34 of the data presenting device 32, which video images are then communicated to the remote user 52 and other session participants and presented via the display of his or her computer 51, which situation is shown in fig. 5.
  • The video may also be communicated to the local user via the data presenting device 32.
  • The capturing of images and their mapping to a model of the environment in which they are captured, as well as the association of real world data of a real world process control object with a graphical representation of the same, will now be described with reference to the flow charts of fig. 6 and fig. 7.
  • The remote user may also provide additional data about the scene shown in such a captured video stream, such as information about process control objects being shown in the video, as well as provide further guidance to the local user in order to allow the local user to perform maintenance as effectively as possible, for example without making time-consuming mistakes or wrongly identifying a process control object as the cause of an abnormal process.
  • In operation, i.e. when there is some kind of problem at a location in the plant, the data presenting device 32 is brought out to this location of the industrial site and placed at a position in the location where assistance is needed.
  • The device may for instance be placed in the centre of a room.
  • The data presenting device may be placed, such as on the tripod 50, at this location by the local user in order to be used for solving a problem at the location, for instance the fact that one or more of the machines or process control objects may be faulty or that the process has a strange behaviour at the location.
  • The data presenting device may be held by the local user at the location.
  • The control unit 38 of the data presenting device 32 may first make the data presenting device 32 scan the area at the location, step 54, fig. 6.
  • A three-dimensional space around the position of the data presenting device may be captured with different video images using the camera 34.
  • The images may thus be forwarded to the communication handling unit 20 of the communication handling device 16, which communication handling unit 20 analyses the captured images and investigates if it recognizes them with regard to a pre-existing three-dimensional model of the location and objects at this location, i.e. of the process control objects and possible other objects present at the location. If it recognizes the video images and therefore there is a pre-existing model, step 56, then this model is fetched, step 58.
  • The pre-existing three-dimensional model may be provided in the process control system.
  • The model may be obtained or fetched from a server, such as the object data server 21. If any three-dimensional model has been made of the location, then this is thus fetched. However, if there was no pre-existing model, step 56, a new three-dimensional model 3DM of the location and the various objects in it is created, step 60.
  • A model may for instance be created using augmented reality functionality. If the data presenting device comprises an infrared sensor it is also possible to use infrared technology, such as Microsoft Kinect. A 3D map of natural features at the location can be built using a variety of feature extraction methods such as corner or edge detection, both with 2D RGB data and 3D RGBD (Red, Green, Blue, Depth) data.
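  • A minimal sketch of the fetch-or-create flow of fig. 6 is given below, purely as an illustration. A real implementation would match corner or edge features of the captured images against stored models as described above; here a crude fingerprint of a downsampled image stands in for that matching, and all names are hypothetical.

```python
# Minimal sketch of "fetch the pre-existing model or create a new one" (cf. fig. 6).
import hashlib
from typing import Dict, List


def fingerprint(image: List[List[int]]) -> str:
    """Stand-in for feature extraction: hash a coarsely downsampled grey image."""
    coarse = bytes(row[col] // 32 for row in image[::8] for col in range(0, len(row), 8))
    return hashlib.sha1(coarse).hexdigest()


class ModelStore:
    """Could be backed by the object data server; here just an in-memory dict."""

    def __init__(self):
        self._models: Dict[str, dict] = {}

    def fetch_or_create(self, images: List[List[List[int]]]) -> dict:
        key = fingerprint(images[0])
        if key in self._models:                          # pre-existing model recognised
            return self._models[key]
        model = {"objects": [], "images": len(images)}   # a new 3D model would be built here
        self._models[key] = model
        return model


if __name__ == "__main__":
    scan = [[[100] * 64 for _ in range(64)]]             # one dummy 64x64 grey image
    store = ModelStore()
    print(store.fetch_or_create(scan))
```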
  • The model may be created in the control unit 38 of the data presenting device 32. As an alternative, the captured images may be forwarded from the data presenting device 32 to the communication handling unit 20 of the communication handling device 16, which unit 20 then creates the model.
  • The process control objects at the site, i.e. the real world objects, which may be objects such as valves, motors or other actuators, sensors or measuring instruments, controllers or other control units comprising a processor, may be provided with object identifiers, such as NFC tags or bar codes. If the object identifiers are read it is possible to obtain information about what types of objects they are, for example actuators or sensors or control units. The type may be identified through the camera 34 of the data presenting device 32 detecting a visual object identifier such as a bar code.
  • The short-range communication unit may be set to read a tag with the object identifier. Once such a code is detected it may then be sent from the data presenting device 32 to the communication handling unit 20 of the communication handling device 16. These codes may then be used to identify the process control objects, step 62. Such a code may thereby be used to associate the real process control object with a corresponding object in the 3D model, and thereby it is possible to know what process control objects are present in a video stream that is captured by the data presenting device. When a process control object has been identified in this way, it is then possible to subscribe to real time data of the process control object.
  • A subscription to real time data for a real world process control object may thus be initiated by the communication handling unit 20, step 64, which subscription may be activated at a later stage, for instance when a session is on-going.
  • A subscription to e.g. real time data for the specific process object may thus be established with the control system.
  • The real time data is furthermore associated with a graphical representation of the process control object, step 65. This association is made in order to enable joint display of the real time data and the graphical representation of a real world object.
  • The real time data may later be made available to the communication session and may be communicated to more than one user.
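  • The association between scanned object identifiers, model objects and prepared real time data subscriptions could, as a hypothetical sketch only, be organised as below; the registry, the object kinds and the callback are invented for this example.

```python
# Hypothetical sketch: associating scanned identifiers (NFC tags, bar codes) with
# process control objects and preparing real-time data subscriptions for them.
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class ProcessControlObject:
    object_id: str                 # e.g. the code read from an NFC tag
    kind: str                      # e.g. "valve", "motor", "sensor"
    subscription_active: bool = False
    latest_values: Dict[str, float] = field(default_factory=dict)


class ObjectRegistry:
    def __init__(self):
        self.objects: Dict[str, ProcessControlObject] = {}

    def register_scanned(self, object_id: str, kind: str) -> ProcessControlObject:
        """Called when the data presenting device reports a scanned identifier."""
        obj = ProcessControlObject(object_id, kind)
        self.objects[object_id] = obj
        return obj

    def activate_subscription(self, object_id: str,
                              read_values: Callable[[str], Dict[str, float]]) -> None:
        """Activate the prepared subscription, e.g. once a session is ongoing."""
        obj = self.objects[object_id]
        obj.subscription_active = True
        obj.latest_values = read_values(object_id)


if __name__ == "__main__":
    registry = ObjectRegistry()
    registry.register_scanned("nfc-0017", "valve")
    registry.activate_subscription("nfc-0017", lambda oid: {"position_percent": 42.0})
    print(registry.objects["nfc-0017"])
```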
  • The above mentioned steps may have been performed before the local user starts a communication session with the remote user 52 and optionally also with other users.
  • Such a session may be carried out using a TCP connection. The communication session may furthermore be set up using encryption and/or a virtual private network (VPN).
  • The controlling of a communication session will now be described with reference to fig. 8, which shows a session presentation area in which data of a communication session is displayed for a session participant, and fig. 9, which shows a flow chart of a method for controlling a communication session.
  • A communication session is set up between a number of participants, where the local user at a location in the industrial site where the process control system 10 is operated is one and the remote user 52 at the remote location is another.
  • The communication handling device 16 provides two channels between itself and each user, where a primary channel is a video channel for the video captured by the data presenting device 32 and conveyed as a video stream VS, and a secondary channel is provided for process graphics and other session data.
  • The primary channel may be a unidirectional channel in which the video is transmitted from the local user towards the other participants of the session, while the secondary channels may be bidirectional channels allowing the communication handling unit to transmit data to and from every participant.
  • The session also involves transmission of a live video stream VS, which may thus be a one way video stream from the data presenting device 32 in the process control system to the other participants of the session and in particular to the computer 51 of the remote user 52. In some instances it may involve a two-way video conference, i.e. a situation where video is also provided by the computer 51 of the remote user 52 and conveyed to the data presenting device 32.
  • The video stream is conveyed via the primary channels CHP1 and CHP2.
  • Video images captured by the camera 34 may in this way be communicated to the other session participants.
  • A session may be started by a user of the data presenting device 32. The control unit may comprise a communication session client, which connects to the communication handling unit 20 via the wireless communication network WN using the radio communication circuit 42 and antenna 44 when the local user wants to engage in a communication session.
  • The control unit 38 may send a request for a session and, when receiving such a request, the communication handling unit 20 of the communication handling device 16 may then set up the primary communication channels CHP1 and CHP2 to the various participants as well as the secondary communication channels CHS1 and CHS2.
  • The communication handling unit 20 thus sets up the first primary channel CHP1 to the data presenting device 32 and the second primary channel CHP2 to the computer 51 of the remote user 52.
  • The communication handling unit 20 also sets up the first secondary channel CHS1 to the data presenting device 32 and the second secondary channel CHS2 to the computer 51 of the remote user 52.
  • The first primary and secondary channels are set up by the data presenting device 32 to the communication handling device 16, which then sets up the rest of the primary and secondary channels.
  • A secondary channel is set up to all the participants of the session, and especially to the remote user.
  • The remote user may also desire other participants to engage in the communication session, in which case communication channels may be set up to these as well.
  • The primary communication channels CHP1 and CHP2 are provided for a first media item MI1, which is provided through the video stream VS, and the secondary communication channels for other data, such as for a second media item MI2 and further media items.
  • The video captured by the local user using the data presenting device 32 is provided by the communication handling unit 20 to all participants as a video stream VS, step 66.
  • The video stream VS is thus conveyed via the first primary channel CHP1 to the communication handling device 16, which forwards it in the second primary channel CHP2 to the computer 51 of the remote user 52.
  • The video stream VS is thus sent from the data presenting device 32 of the local user to the computer 51 of the remote user 52 as well as to any other party interested in viewing the video. This video is then communicated as the first media item MI1, fig. 8. As all participants have the same session presentation area, they are all presented with the same media items.
  • The first media item MI1 is also communicated to the local user locally at the data presenting device 32. It should here be realized that optionally a video stream may also be transmitted from the remote user to the local user. If the camera view is shown on the display of the data presenting device, it is here possible that at least some of the various session presentation area items are provided on top of this camera view. As the camera view depicts the scene of the captured video, it is clear that the first media item may then be omitted.
  • The marker M may also be provided to all participants via the secondary communication channels CHS1 and CHS2. The marker M is thus provided from the communication handling unit 20 of the communication handling device 16 in the first secondary communication channel CHS1 to the data presenting device 32 and in the second secondary communication channel CHS2 to the computer 51 of the remote user 52.
  • The marker M may be sent together with the media items.
  • Each participant is thus provided with a device having a screen where the presentation area SPA is shown, and in this presentation area the video and the marker M are presented.
  • The marker, which is selectable by all participants of the communication session, may be used for pointing at process control objects in media items being presented.
  • The communication handling unit 20 investigates if there are any user inputs via the secondary communication channels, which user inputs may be made by any of the session participants.
  • A user input may for instance be made via the marker M, which is common to all participants. If no one selects an object in the video stream, the communication handling unit 20 returns and waits for an object to be selected.
  • If an object is selected, the communication handling unit 20 investigates which object it was. Any of the participants of the session, and especially the remote user, may thus want to obtain some more data about the process control objects that are visible in the video stream VS.
  • As the objects may have been identified using a virtual model, an object may easily be identified through a participant pointing at the representation of the object in the presented video, for instance using the marker M.
  • A participant may for instance desire to obtain data of the temperature in a tank or the voltage of a transformer. In order to do this he may then select an object in the presented video. The selection of the object may then be detected through the communication handling unit 20 detecting the position of the marker in relation to the video. If this position is the position of an object in the 3D model for which an association to a corresponding process control object exists, data of this process control object is then fetched, for instance in the form of one or more face plates from the server 21.
  • The data being fetched may also comprise the above-mentioned real time data for which a subscription exists.
  • A face plate may be able to display a number of different types of real time data, which is why it may also be possible for the session participants to choose between different types of real time data to be communicated via a face plate of a process control object. It is also possible that, if a selection is detected, all process graphics of all the process control objects covered in the video are fetched. Process graphics of all the process control objects shown in the video may thus be fetched automatically through a selection of the first media item MI1 as such. When a process object has been identified, the real time data of the process control object may thus be fetched from the process control system according to the subscription. The real time data is then made available to the communication session and may be communicated to more than one user. It is also possible that no user inputs are needed.
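  • As an illustration of steps 66 to 69, a marker position could be mapped to a modelled process control object and combined with subscribed real time data roughly as in the sketch below. The hit boxes, object identifiers and values are invented example data, not taken from the patent.

```python
# Hypothetical sketch: from a marker position in the video to a fetched face plate
# with subscribed real-time data for the selected process control object.
from typing import Dict, Optional, Tuple

# marker position in normalised video coordinates -> object in the 3D model;
# in the real system this lookup would go through the three-dimensional model
MODEL_HITBOXES: Dict[str, Tuple[float, float, float, float]] = {
    "valve-1": (0.10, 0.20, 0.30, 0.45),   # (x_min, y_min, x_max, y_max), assumed layout
    "pump-2": (0.55, 0.40, 0.80, 0.70),
}

REAL_TIME_DATA: Dict[str, Dict[str, float]] = {
    "valve-1": {"opening_percent": 37.5},
    "pump-2": {"speed_rpm": 1480.0},
}


def object_at(marker: Tuple[float, float]) -> Optional[str]:
    """Find which modelled process control object the marker points at, if any."""
    x, y = marker
    for object_id, (x0, y0, x1, y1) in MODEL_HITBOXES.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return object_id
    return None


def build_face_plate(object_id: str) -> dict:
    """Combine the graphical representation with the subscribed real-time data."""
    return {"object": object_id, "graphic": f"faceplate:{object_id}",
            "values": REAL_TIME_DATA.get(object_id, {})}


if __name__ == "__main__":
    selected = object_at((0.6, 0.5))
    if selected is not None:
        print(build_face_plate(selected))   # would go into the second media item MI2
```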
  • Process graphics may be fetched automatically by the communication handling unit 20 based on the objects being covered by the camera of the data presenting device 32. It can in this way be seen that the communication handling unit 20 fetches graphical representations of at least one process control object, which graphical representations are representations of process control objects present in the presentation of the first media item, step 68.
  • The fetching may in some instances furthermore be made based on the communication handling unit receiving a selection of the first media item from one of the participants.
  • The selection may furthermore be the selection of at least one of the process control objects shown in the presentation of the first media item. It can also be seen that the above mentioned user inputs may comprise participant selections made in relation to media items and/or process control objects.
  • The process graphics PG of one or more process control objects are then transferred in a second media item MI2 in the secondary communication channels CHS1 and CHS2 to all session participants.
  • Also the real time data being associated with the process graphics may be transferred.
  • The real time data may be transferred as a part of the second media item MI2 or separately.
  • The graphical representations are thereby provided in the second media item MI2, step 69, for being communicated to the participants of the session.
  • The second media item may comprise a graphical representation of the selected process control object.
  • The second media item MI2 is furthermore provided to all the participants in the communication session.
  • In fig. 8 the session presentation area SPA is shown with the first media item MI1, i.e. the video of the video stream VS, and the second media item MI2 with process graphics PG, where a first piece of process graphics PG1 represents a first process control object, a second piece of process graphics PG2 a second process control object, a third piece of process graphics PG3 a third process control object and a fourth piece of process graphics PG4 a fourth process control object.
  • The process graphics may here be realized in the form of face plates, which in addition to graphically representing the process control object also comprise, access or subscribe to the real-time data of the process control objects.
  • The communication handling unit 20 may also fetch measurement data from the process control system, being actually measured by the process control object or being determined based on data measured by the process control object, and communicate this real time data together with the process graphics of the process control object, the process graphics possibly including data measured by or determined based on measurements made by the process control object being represented by a process graphic item.
  • The marker M may also be provided to all of the session participants. This marker M is also selectable by all participants. However, since there may be more than one participant, simultaneous control may have to be avoided.
  • A participant wanting to use the marker M may send a request to control the marker M, which request is thus sent as user inputs UI in a secondary communication channel.
  • The participant may select a marker control command, for instance in a menu of the presentation area.
  • The communication handling unit 20 therefore controls the use of the marker M. It investigates the user inputs UI with regard to whether there are any requests for controlling the marker M, step 70. If there is not, step 70, it returns and waits. However, if there is, step 70, the communication handling unit 20 then investigates if the control is to be granted or not. In case there is no competition for the marker, control may be directly handed over to the participant that requests control. However, if there is contention, the communication handling unit 20 may determine which participant is to be a controlling participant, step 72. It thus determines which participant, out of the ones that simultaneously want to be controlling participant, is to be made into the controlling participant, i.e. the person that is in control of the marker M. In doing this it employs a priority scheme.
  • The remote user is given the highest priority in this scheme, with the local user having the second highest priority.
  • If there are other participants, these may be given priority according to the order in which they join the session, with a priority that is below the priority of the local and remote users of the process control system.
  • The participant having the highest priority is then given control of the marker M, while the other participants are denied control.
  • A priority may also be given to a user based on the user identity; the priority may be linked to this user identity.
  • The communication handling unit 20 may thus receive at least one request for controlling the marker M from the participants of the session, and if more than one request is received from more than one participant, it may determine which participant is to receive control according to the priority scheme and provide control to the selected participant, where the priority scheme indicates that the remote user has priority over other participants.
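  • The priority scheme for marker control described above (remote user first, local user second, other participants by join order) could be expressed, as a sketch only, as follows; the participant roles and names are invented for the example.

```python
# Hypothetical sketch of the marker-control arbitration described above.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Participant:
    name: str
    role: str        # "remote", "local" or "other"
    join_order: int  # lower means joined earlier


def priority(p: Participant) -> tuple:
    role_rank = {"remote": 0, "local": 1, "other": 2}[p.role]
    return (role_rank, p.join_order)


def grant_marker_control(requests: List[Participant]) -> Optional[Participant]:
    """Return the requesting participant that gets control of the marker M."""
    if not requests:
        return None
    return min(requests, key=priority)


if __name__ == "__main__":
    requests = [
        Participant("field engineer", "local", 0),
        Participant("expert", "remote", 1),
        Participant("observer", "other", 2),
    ]
    winner = grant_marker_control(requests)
    print(winner.name if winner else "no requests")   # -> "expert"
```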
  • The communication handling unit 20 investigates the use of the marker. It more particularly investigates if the controlling participant marks an object or not in one of the media items. If no object is marked, step 74, then the communication handling unit 20 returns and investigates if there is a request for control. However, if an object is marked, step 74, fig. 9, the object is then emphasized in the media item in question, step 76. This may be done by adding a visual emphasis such as a graphic border or ring to an image of a process object. Through this marking, the communication handling unit receives, and registers via the marked media item, input from one of the participants concerning a process control object, step 74, where this process control object is present in both media items MI1 and MI2.
  • The media item in which the marking of the object by the controlling participant is detected is thus changed, which change is an indication of the selection of the object by the controlling participant, and which may be communicated to other users by recording and distributing the marked up media item.
  • The form of emphasis used is optional. What is also done is that the object is emphasized in the other media item, step 78. It is also possible to use sound to indicate the selection to the other session participants. In this way the other participants are visually informed of the input via the other media item, i.e. via the non-marked media item. The object is thus emphasized in the non-marked media item.
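  • The propagation of an emphasis from the marked media item to the non-marked media item could, purely as an illustration, be kept track of as below; the media item labels MI1 and MI2 follow the description, while the class itself is hypothetical.

```python
# Hypothetical sketch: an object marked in one media item is emphasised in both.
from typing import Dict, Set


class SessionPresentation:
    def __init__(self):
        # objects currently emphasised per media item
        self.emphasis: Dict[str, Set[str]] = {"MI1": set(), "MI2": set()}

    def mark(self, marked_item: str, object_id: str) -> None:
        """Emphasise the object in the marked media item and in the other one."""
        other_item = "MI2" if marked_item == "MI1" else "MI1"
        self.emphasis[marked_item].add(object_id)   # e.g. a border around the image
        self.emphasis[other_item].add(object_id)    # e.g. a ring around the face plate


if __name__ == "__main__":
    spa = SessionPresentation()
    spa.mark("MI1", "valve-1")
    print(spa.emphasis)
```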
  • The participants may also make annotations to the media items being presented in the session presentation area SPA, which annotations are also conveyed in the secondary communication channels.
  • The communication handling unit may thus receive an annotation from a participant made in relation to at least one of the media items and communicate the annotation to the participants of the session. Two such annotations are shown in fig. 8, one in the first media item MI1 and one in the second media item MI2.
  • This invention presents a way to allow an expert to remotely guide personnel on site via a live data stream.
  • The system may use a mobile device/tablet with an integrated camera, for instance in the form of the previously described data presenting device, for the local user, connected wirelessly via the communication handling device to a PC for the remote user, who may be an expert.
  • The expert may be located in his office in front of the computer and the local field worker may be carrying the mobile device.
  • The local worker can then stream a live video broadcast from the site by using the camera of his mobile device. Both the expert and the local worker can make annotations on top of this video stream. They can also share still photos and process graphics values with each other.
  • The system thus allows both the expert and the local worker to make annotations.
  • The data presenting device of the local worker or local user uses its camera to record the scene and streams the video to the computer of the remote expert.
  • The session may also comprise dual audio streams so that the expert and the local worker can communicate verbally.
  • Both the expert and the local worker or local user can draw and write on top of the video stream and the other party can see these annotations done in real-time. Both parties can also take high resolution snapshots of the video, draw on them and share these snapshots with each other. (Note: when taking still images, these can be of substantially higher resolution than the live video.) Furthermore, the participants can point out objects on the shared picture or video stream. Both sides can also initiate taking still pictures or capture video sequences and recall earlier (annotated) still images.
  • The communication handling unit may be provided with an image cache memory that can be used for this sharing.
  • The expert can add pictures or video sequences from external sources into a shared cache (e.g. instructional videos or "look for this part" type of pictures).
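  • A shared image cache of the kind mentioned above could, as a hypothetical sketch, be structured as follows; the item kinds and methods are invented for this example.

```python
# Hypothetical sketch of a shared image cache in the communication handling unit:
# snapshots, annotated images and externally added pictures are stored once and
# can be recalled by any participant.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class CachedItem:
    added_by: str
    kind: str                      # "snapshot", "annotated snapshot", "external"
    data: bytes
    annotations: List[str] = field(default_factory=list)


class SharedImageCache:
    def __init__(self):
        self._items: Dict[str, CachedItem] = {}

    def add(self, item_id: str, item: CachedItem) -> None:
        self._items[item_id] = item

    def annotate(self, item_id: str, note: str) -> None:
        self._items[item_id].annotations.append(note)

    def recall(self, item_id: str) -> CachedItem:
        return self._items[item_id]


if __name__ == "__main__":
    cache = SharedImageCache()
    cache.add("snap-1", CachedItem("local worker", "snapshot", b"...jpeg bytes..."))
    cache.annotate("snap-1", "check this flange")
    print(cache.recall("snap-1").annotations)
```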
  • The remote expert and the local worker or local user can also share a live view of the process graphics with each other through a remote access functionality of the control system. They can annotate the process graphics view in a similar way as the live video stream.
  • The remote expert can also take screenshots from the process graphics and send these screenshots to the local worker. Naturally, they can both make annotations to these screenshots, and the screenshots are shared with both users.
  • The process graphics can be shared and annotated in this way as well.
  • The live video stream can also be used for guidance when the local user is moving within the plant.
  • The remote expert can then explain how the local worker should navigate within the plant.
  • The remote expert does not need to sit next to a desktop computer but might access the tool from a mobile device as well.
  • 1. A paint factory unexpectedly experiences severe trouble at one of its production lines.
  • 2. The problem is rare and technically complex, and the operators on site need support from an expert in order to restore production.
  • 3. The operator contacts a support company where an expert in this area is available to instantly help the factory with the problem.
  • 4. The operators discuss with the expert, and the expert instructs the operators on site to bring the data presenting device, which has the client software or remote collaboration tool installed, so that the remote expert can look at the problem through the live video stream.
  • 5. The remote expert observes the situation by looking at the video stream and asking the local operators to show specific parts of the equipment.
  • 6. Based on the information, the remote expert can now point at specific devices in the video stream and instruct the operators on site how to perform certain operations to correct the problem.
  • 7. The expert also looks at the process graphics together with the operators and annotates which values the operators should follow later on.
  • The expert may also select one object in the process graphics and then the same object will be highlighted in both the process graphics and in the live video, in order to communicate a selection and to better explain the actions that the local user is to take.
  • The communication handling unit may be provided in the form of a processor together with memory including computer program code for performing its functions.
  • This computer program code may also be provided on one or more data carriers which perform the functionality of the communication handling unit when the program code thereon is being loaded into the memory and run by the processor.
  • One such data carrier 80 with computer program code 82, in the form of a CD ROM disc, is schematically shown in fig. 10.
  • The invention can be varied in many more ways than the ones already mentioned. It should therefore be realized that the present invention is only to be limited by the following claims.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present invention relates to a communication handling arrangement for controlling a communication session for a number of participants, the participants comprising a local user at a location in an industrial site where a process control system is operated and a remote user at a remote location. The communication session comprises a first media item (MI1) in the form of video captured by the local user. The arrangement comprises a communication handling unit which provides the video stream to the participants of the communication session for the communication of the first media item (MI1), fetches graphical representations (PG1, PG2, PG3, PG4) of process control objects present in the communication of the first media item, provides the graphical representations in a second media item (MI2) for communication to the participants, then receives, via one of the media items, input from one of the participants concerning a first process control object present in both media items, and visually communicates the registering of the input to the other participants via the other media item.
PCT/EP2013/070777 2013-10-07 2013-10-07 Controlling a communication session between a local user and a remote user of a process control system Ceased WO2015051816A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2013/070777 WO2015051816A1 (fr) 2013-10-07 2013-10-07 Controlling a communication session between a local user and a remote user of a process control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2013/070777 WO2015051816A1 (fr) 2013-10-07 2013-10-07 Controlling a communication session between a local user and a remote user of a process control system

Publications (1)

Publication Number Publication Date
WO2015051816A1 (fr) 2015-04-16

Family

ID=49378245

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2013/070777 Ceased WO2015051816A1 (fr) 2013-10-07 2013-10-07 Controlling a communication session between a local user and a remote user of a process control system

Country Status (1)

Country Link
WO (1) WO2015051816A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020009774A1 (fr) * 2018-07-04 2020-01-09 Carrier Corporation Building management system and method for positioning within a building

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1657610A2 (fr) 2004-11-12 2006-05-17 Mitsubishi Heavy Industries, Ltd. Facility management and inspection system and method
US20080030575A1 (en) * 2006-08-03 2008-02-07 Davies Paul R System and method including augmentable imagery feature to provide remote support
WO2009036782A1 (fr) * 2007-09-18 2009-03-26 Vrmedia S.R.L. Information processing device and method for remote technical assistance
US20090319058A1 (en) * 2008-06-20 2009-12-24 Invensys Systems, Inc. Systems and methods for immersive interaction with actual and/or simulated facilities for process, environmental and industrial control
US20100315416A1 (en) * 2007-12-10 2010-12-16 Abb Research Ltd. Computer implemented method and system for remote inspection of an industrial process
SE1300138A1 (sv) 2013-02-21 2013-02-25 Abb Technology Ltd Method and data presenting device for assisting a user to service a process control object

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1657610A2 (fr) 2004-11-12 2006-05-17 Mitsubishi Heavy Industries, Ltd. Facility management and inspection system and method
US20080030575A1 (en) * 2006-08-03 2008-02-07 Davies Paul R System and method including augmentable imagery feature to provide remote support
WO2009036782A1 (fr) * 2007-09-18 2009-03-26 Vrmedia S.R.L. Information processing device and method for remote technical assistance
US20100315416A1 (en) * 2007-12-10 2010-12-16 Abb Research Ltd. Computer implemented method and system for remote inspection of an industrial process
US20090319058A1 (en) * 2008-06-20 2009-12-24 Invensys Systems, Inc. Systems and methods for immersive interaction with actual and/or simulated facilities for process, environmental and industrial control
SE1300138A1 (sv) 2013-02-21 2013-02-25 Abb Technology Ltd Method and data presenting device for assisting a user to service a process control object

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NAVAB N: "Industrial augmented reality (IAR): challenges in design and commercialization of killer apps", MIXED AND AUGMENTED REALITY, 2003. PROCEEDINGS. THE SECOND IEEE AND ACM INTERNATIONAL SYMPOSIUM ON 7-10 OCT. 2003, PISCATAWAY, NJ, USA, IEEE, 7 October 2003 (2003-10-07), pages 2 - 6, XP010662790, ISBN: 978-0-7695-2006-3, DOI: 10.1109/ISMAR.2003.1240682 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020009774A1 (fr) * 2018-07-04 2020-01-09 Carrier Corporation Building management system and method for positioning within a building

Similar Documents

Publication Publication Date Title
US9628772B2 (en) Method and video communication device for transmitting video to a remote user
US9829873B2 (en) Method and data presenting device for assisting a remote user to provide instructions
US11619924B2 (en) Combined visualization thin client HMI system and method
KR101099838B1 (ko) Remote after-sales service method using video communication between a computer and a mobile terminal
EP3965085A1 (fr) Real-time crime center solution with dispatch-directed digital media payloads
WO2013178248A1 (fr) Controlling objects in an industrial plant
JP2020091530A (ja) Equipment management system
WO2015051816A1 (fr) Controlling a communication session between a local user and a remote user of a process control system
AU2014346878B2 (en) Digital glass enhanced media system
KR102467017B1 (ko) Augmented reality communication method between multiple users
CN113852517A (zh) AR-based signal strength visualization system and method
TWI801958B (zh) Equipment maintenance system and method
CN106662849A (zh) Mobile human-machine interface for a control device
TW202031033A (zh) Equipment status monitoring method and system
SE1500055A1 (sv) Method and data presenting device for facilitating work at an industrial site assisted by a remote user and a process control system
JP7726966B2 (ja) Image processing system and image processing method
CN103839508A (zh) Electronic device, image display method and system
SE1300676A1 (sv) Method and data presenting arrangement for assisting a remote user to give instructions to a local user
US20250108511A1 (en) Methodology for safe remote humanoid takeover for multi-user telexistence with minimal hardware user requirements
JP7589970B2 (ja) Sharing method
SE1400157A1 (sv) Combining video with electronic notes
Vartiainen et al. Challenges of using mobile devices in process industry
JPWO2018078817A1 (ja) Intercom system and setting device
CN118575471A (zh) Virtual reality conference system
KR20170143371A (ko) Apparatus for setting multiple displays of an HMI

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13776995

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13776995

Country of ref document: EP

Kind code of ref document: A1