
WO2017033605A1 - Touch panel device and endoscope system - Google Patents

Touch panel device and endoscope system

Info

Publication number
WO2017033605A1
WO2017033605A1, PCT/JP2016/070652, JP2016070652W
Authority
WO
WIPO (PCT)
Prior art keywords
touch panel
button
operation button
range
panel device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2016/070652
Other languages
English (en)
Japanese (ja)
Inventor
剛 浦崎
亜紀 松元
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Priority to JP2016564109A priority Critical patent/JPWO2017033605A1/ja
Publication of WO2017033605A1 publication Critical patent/WO2017033605A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to a touch panel device and an endoscope system having a reaction area that is larger than a display area of operation buttons.
  • Endoscope systems, which include an endoscope that images the interior of a subject, a video processor that generates an observation image from the signal captured by the endoscope, and a monitor that displays the observation image generated by the video processor, are widely used in the medical field, the industrial field, and the like.
  • a video processor of an endoscope system is provided with a touch panel for performing various settings and the like.
  • the user can change various settings by touching operation items (operation buttons) on the touch panel.
  • For example, Japanese Unexamined Patent Application Publication No. 2009-37344 proposes a processor for an endoscope system that detects the position pressed by the user and changes the range of the reaction region in accordance with that pressed position.
  • However, this conventional touch panel only detects how far the pressed position is from the center of the reaction area and expands the reaction area in that direction. When the user presses outside the reaction area in the first place, for example because the reaction area is small, the operation is still invalidated.
  • Accordingly, an object of the present invention is to provide a touch panel device and an endoscope system that can prevent an operation from being invalidated when the user fails to touch precisely inside a reaction region.
  • A touch panel device of one embodiment of the present invention includes a display unit, a touch panel provided on the display unit, a detection unit that detects a touch operation on the touch panel, a display control unit that displays on the display unit an operation screen including an operation button to be operated by the touch operation, and a detection range setting unit that sets the operation detection range, in which the operation button is recognized as having been operated, to an expanded range that includes the operation button region and extends beyond it by a predetermined range.
  • An endoscope system of one embodiment of the present invention includes such a touch panel device, that is, a display unit, a touch panel provided on the display unit, a detection unit that detects a touch operation on the touch panel, a display control unit that displays on the display unit an operation screen including an operation button operated by the touch operation, and a detection range setting unit that sets an expanded range that is a predetermined range larger than the operation button region; the touch panel device is provided in a casing device to which an endoscope is detachably connected.
  • FIG. 3 is a diagram for explaining an example of an operation screen displayed on the touch panel 25, FIG. 4 is a diagram for explaining an example of an operation screen for memory information, and FIG. 5 is a diagram for explaining an example of the reaction areas of various operation buttons.
  • FIG. 1 is a diagram illustrating a schematic configuration of the endoscope system according to the first embodiment.
  • FIG. 2 is a diagram illustrating a functional configuration of a main part of the endoscope system according to the first embodiment.
  • As shown in FIG. 1, an endoscope system 1 includes an endoscope 2 that captures an in-vivo image of a subject by inserting its distal end portion into a body cavity and outputs an image signal of the subject image, a video processor 3 that performs predetermined signal processing on the image signal output from the endoscope 2 and comprehensively controls the operation of the entire endoscope system 1, a display device 4 that displays the image processed by the video processor 3, and a keyboard 5 for inputting operation instructions and character information. The display device 4 and the keyboard 5 are connected to the video processor 3 via cables (not shown).
  • the video processor 3 is a light source-integrated video processor that generates illumination light to be emitted from the distal end of the endoscope 2 and supplies the illumination light to the endoscope 2.
  • a touch panel 25 is provided on the front surface of the video processor 3.
  • the video processor 3 may have a configuration in which a light source device that supplies illumination light to the endoscope 2 is provided separately.
  • the keyboard 5 is provided on a pedestal 6 that can be pulled out. When the user uses the keyboard 5, the user can pull out the pedestal 6 from the medical trolley to input characters, and when not using the keyboard 5, the user can store the pedestal 6 in the medical trolley.
  • peripheral devices such as a printer and a water feeding device may be mounted on the medical trolley.
  • The endoscope 2 includes an imaging element 11, such as a CCD, provided at the distal end of an insertion portion to be inserted into a body cavity of a patient and used to image the subject, a light guide 12 for guiding illumination light to the distal end of the insertion portion, an operation switch 13 provided in an operation unit for operating the endoscope 2, and an electrical connector 14a provided in a connector unit 14 for connecting to the video processor 3.
  • the insertion portion of the endoscope 2 may be flexible or rigid (a rigid endoscope used for surgery).
  • In the present embodiment, the imaging element 11 is provided at the distal end of the insertion portion, but the configuration is not limited to this. The image sensor 11 may instead be provided in the operation unit (the portion gripped by the user) in which the operation switch 13 is provided, with an optical image transmitted from the distal end of the insertion portion to the image sensor 11 in the operation unit through an image guide fiber.
  • In the present embodiment, the video processor 3 is connected to the endoscope 2, but the present invention is not limited to this; a configuration may be adopted in which a camera head attached to the eyepiece of an optical endoscope inserted into a body cavity (for example, a fiberscope or a surgical optical tube) is connected.
  • the endoscope 2 and the video processor 3 are connected by the electrical connector 14a and the connector 23 and are configured to transmit electrical signals by wire, but the present invention is not limited to this.
  • the configuration may be such that the signal is transmitted wirelessly.
  • the video processor 3 includes a light source 21 such as a lamp that generates illumination light, and a condenser lens 22 that condenses the illumination light from the light source 21 on the incident end face of the light guide 12.
  • The video processor 3 further includes a connector 23 connected to the electrical connector 14a of the endoscope 2, a control unit 24 such as a CPU that drives and controls the imaging element 11 of the endoscope 2 via the connector 23, a touch panel 25 for performing various operations and settings, a memory 26 for storing endoscopic images and various information, a video processing unit 27 that performs predetermined signal processing on the imaging signal received from the imaging element 11 via the connector 23, a display controller 28 that changes the character size of character information, and a superimposing circuit 29 that superimposes the character information on the signal from the video processing unit 27.
  • A recording medium 30 can be detachably attached to the video processor 3, and endoscopic images and various types of information can be stored in the recording medium 30.
  • In the present embodiment, the light source 21 of the video processor 3 is a lamp or the like, but it is not limited to this; for example, it may be a semiconductor light-emitting element (semiconductor light source) such as an LED or a laser diode. When a semiconductor light source is used, a semiconductor light source that emits white light may be used, or a semiconductor light source may be provided for each of the R (red), G (green), and B (blue) color components and the light emitted from these sources combined to obtain white light. The semiconductor light source may also be provided at the distal end of the insertion portion of the endoscope 2.
  • the video processing unit 27 performs predetermined video signal processing such as noise reduction processing, white balance processing, and color correction on the video signal from the image sensor 11, and outputs the obtained video signal to the superimposing circuit 29.
  • the display controller 28 outputs character information related to the examination displayed on the display device 4 to the superimposing circuit 29 based on the control of the control unit 24. At this time, the display controller 28 changes the size of the character information based on the control of the control unit 24 and outputs it to the superimposing circuit 29.
  • When the user operates (presses) the operation switch 13 of the endoscope 2, an operation signal is supplied to the control unit 24. When this operation signal is input, the control unit 24 controls the display controller 28 to change the size of the character information.
  • the operation switch 13 constitutes an operation signal output unit that outputs an operation signal in accordance with a user operation.
  • the superimposing circuit 29 generates an endoscopic inspection image in which the character information from the display controller 28 is superimposed on the video signal (endoscopic image) from the video processing unit 27 and outputs the endoscopic inspection image to the display device 4. Thereby, an endoscopic examination image is displayed on the display device 4.
  • the touch panel 25 is, for example, a touch panel display in which a liquid crystal display (display unit 25a) and an electrostatic touch panel sensor arranged so as to overlap the liquid crystal display are integrated.
  • On the touch panel 25, operation screens for various settings of the endoscope system 1 and for browsing recorded images are displayed.
  • The user can change settings of the endoscope system 1 and view images by touching (pressing) the touch panel 25 with, for example, a finger.
  • An operation signal for operating the touch panel 25 is input to the control unit 24.
  • The control unit 24 performs various settings of the endoscope system 1 according to the operation signals input from the touch panel 25. More specifically, when the position at which the user presses an operation button on the touch panel 25 with a finger or the like and the position at which the finger or the like is released from the operation button both lie within the reaction area of that operation button, the control unit 24 determines that the release operation (button operation) is valid and executes the function of the pressed operation button. When the release position lies outside that reaction area, the control unit 24 determines that the release operation (button operation) is invalid and does not execute the function of the operation button. A minimal sketch of this press-and-release check is shown below.
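  • The following sketch illustrates the press-and-release validation just described. It is a simplified, hypothetical Python model; the names Rect, Button, find_pressed_button, and handle_release are illustrative and do not come from the patent. The check is simply that the press coordinates and the release coordinates fall inside the same button's reaction area before that button's function is executed.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Rect:
    x: float  # left edge (screen coordinates, y grows downward)
    y: float  # top edge
    w: float  # width
    h: float  # height

    def contains(self, px: float, py: float) -> bool:
        # A touch at (px, py) reacts when it lies inside this rectangle.
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

@dataclass
class Button:
    name: str
    display_area: Rect           # what is drawn on the screen
    reaction_area: Rect          # where touches are recognized
    action: Callable[[], None]   # function executed when the button operation is valid

def find_pressed_button(buttons: list[Button], x: float, y: float) -> Optional[Button]:
    # Return the button whose reaction area contains the press coordinates, if any.
    for b in buttons:
        if b.reaction_area.contains(x, y):
            return b
    return None

def handle_release(pressed: Optional[Button], rx: float, ry: float) -> bool:
    # The release is valid only if it lands in the same reaction area as the press.
    if pressed is not None and pressed.reaction_area.contains(rx, ry):
        pressed.action()
        return True
    return False  # pressed outside any reaction area, or released outside it: invalidated
```

  • In this model, pressing inside a reaction area and releasing just outside it returns False; the pre-expanded reaction areas of the first embodiment and the dynamically expanded areas of the second embodiment are both aimed at avoiding that failure mode.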
  • The control unit 24 constitutes a detection unit that detects a touch operation on the touch panel 25.
  • The control unit 24, as a display control unit, also performs overall control of the touch panel 25, such as displaying an operation screen (menu screen) including operation buttons to be operated by a touch operation on the touch panel 25.
  • The control unit 24 and the touch panel 25 constitute a touch panel device.
  • the touch panel 25 is a capacitive touch panel, but is not limited to a capacitive touch panel, and may be another type.
  • FIG. 3 is a diagram for explaining an example of an operation screen displayed on the touch panel 25.
  • The operation screen displayed on the touch panel 25 includes a home button 40 for displaying a home screen, an image browsing button 41 for displaying an image browsing screen, and a setting button 42 for displaying a setting change screen.
  • the operation screen shown in FIG. 3 is an image browsing screen displayed when the image browsing button 41 is pressed.
  • the internal memory button 43 is a button for displaying endoscopic image information recorded in the internal memory 26 of the video processor 3.
  • the information on the endoscopic image is information such as the date and time when the endoscopic image is acquired, the patient ID, the patient name, and the transfer state.
  • the portable memory button 44 is a button for displaying information of an endoscopic image recorded in an external memory that is detachable from the video processor 3, that is, the recording medium 30.
  • the information of the endoscopic image is information such as the date and time when the endoscopic image is acquired, the patient ID, the patient name, and the transfer state, as in the case of the internal memory button 43.
  • the information button 45 is a button for displaying memory information of the internal memory (memory 26) and the portable memory (recording medium 30). The user can display the memory information operation screen shown in FIG. 4 by pressing the information button 45.
  • FIG. 4 is a diagram for explaining an example of an operation screen for memory information.
  • The memory information includes, for the internal memory and the portable memory, information such as the number of examinations, the number of recorded SD (Standard Definition) images, the number of recorded HD (High Definition) images, and the number of images that can still be recorded.
  • the user can return to the image browsing screen of FIG. 3 by pressing the close button 46.
  • The touch panel 25 has reaction areas in which the various operation buttons are recognized as having been operated. These reaction areas will be described with reference to FIG. 5, which is a diagram for explaining an example of the reaction areas of various operation buttons according to the first embodiment.
  • the response area of an operation button is set to approximately the same size as the display area of the operation button. That is, as shown in FIG. 5, the reaction area of the home button 40 is set to a reaction area 47 that is substantially the same size as the display area of the home button 40.
  • The reaction area of the internal memory button 43, however, is preset by the control unit 24 to a reaction area 48 that is larger than the display area of the internal memory button 43.
  • Likewise, the reaction areas of the portable memory button 44 and the information button 45 are preset by the control unit 24 to a reaction area 49 and a reaction area 50, each larger than the display area of the respective button.
  • In other words, the control unit 24, as the detection range setting unit, sets the reaction area (operation detection range) in which an operation button is recognized as having been operated by a touch operation to an expanded range that includes the operation button area plus a predetermined range around it.
  • To prevent erroneous operation, an operation button normally needs a predetermined height or more; in consideration of the size of a typical human finger, this predetermined height of the reaction region is, for example, 10 mm or more.
  • When an operation button is lower than the predetermined height, its reaction region is therefore enlarged in advance in a direction in which no other operation button is arranged. Since the internal memory button 43, the portable memory button 44, and the information button 45 are lower than the predetermined height and no other operation buttons (and therefore no other reaction areas) are arranged above them, the reaction regions 48, 49, and 50 enlarged on the upper side are set in advance.
  • By contrast, the reaction region 47 is not enlarged. Even when an operation button is lower than the predetermined height, if another operation button is arranged adjacent to it, expanding its reaction area would overlap the reaction area of the other operation button, so the reaction area is not enlarged.
  • In this way, the reaction area of an operation button is expanded in advance, which prevents an operation from being invalidated when the user fails to touch precisely inside the original reaction area. A sketch of this pre-expansion rule follows below.
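  • The pre-expansion rule of the first embodiment can be sketched as follows, reusing the Rect and Button types from the earlier sketch. This is a simplified model built on assumptions: the 10 mm minimum height is taken from the text above, but the pixel scale, the expansion margin, and the helper names (overlaps, expanded, preset_reaction_area) are hypothetical. A button lower than the minimum height has its reaction area grown only in directions where the enlarged area would not overlap another button's reaction area.

```python
# Illustrative constants; the 10 mm figure comes from the description above,
# the pixel density is an assumption made only for this sketch.
MIN_BUTTON_HEIGHT_MM = 10.0
PX_PER_MM = 4.0
MIN_BUTTON_HEIGHT_PX = MIN_BUTTON_HEIGHT_MM * PX_PER_MM

def overlaps(a: Rect, b: Rect) -> bool:
    # True when the two rectangles share any area.
    return not (a.x + a.w <= b.x or b.x + b.w <= a.x or
                a.y + a.h <= b.y or b.y + b.h <= a.y)

def expanded(rect: Rect, top: float = 0, bottom: float = 0,
             left: float = 0, right: float = 0) -> Rect:
    # Screen coordinates: y grows downward, so expanding upward lowers y.
    return Rect(rect.x - left, rect.y - top,
                rect.w + left + right, rect.h + top + bottom)

def preset_reaction_area(button: Button, others: list[Button], margin_px: float) -> Rect:
    """Enlarge the reaction area in advance, but only for buttons lower than the
    minimum height and only in directions where no other button's reaction area
    would be overlapped (upward for buttons 43, 44 and 45 in FIG. 5)."""
    if button.display_area.h >= MIN_BUTTON_HEIGHT_PX:
        return button.display_area                  # tall enough: no pre-expansion
    area = button.display_area
    for grow in ({"top": margin_px}, {"bottom": margin_px},
                 {"left": margin_px}, {"right": margin_px}):
        candidate = expanded(area, **grow)
        if not any(overlaps(candidate, o.reaction_area) for o in others):
            area = candidate                         # grow only where nothing is adjacent
    return area
```

  • Under this rule, a button with other buttons arranged adjacent to it, such as the home button 40, would keep its original reaction area 47, which matches the behaviour described above.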
  • FIGS. 6 and 7 are diagrams for explaining another example of the reaction region enlarged in advance.
  • FIG. 6 shows a selection screen for a function to be assigned to a switch operated by the operator, such as the operation switch 13; a button 60 for selecting a lamp ON/OFF switching function is arranged as one of the assignable functions.
  • Next to the button 60, a character string describing the button (here, the character string "lamp") is displayed.
  • For the button 60, a reaction region 61 larger than the button 60 is set in advance. Specifically, the reaction region 61 is enlarged so as to cover the button 60 and react up to the first character of the character string describing the button 60.
  • The switch information screen also includes a button for assigning a white balance adjustment function to the operation switch, a button for assigning a function that brightens the image, and a button for assigning a function that darkens the image. For these buttons as well, enlarged reaction regions are set in advance in the same manner.
  • The reaction region enlarged in advance is not limited to the example of FIG. 6.
  • For example, a reaction area 62 may be set in advance so as to react to the whole of the button 60 and the character string describing the button 60.
  • Alternatively, a reaction area 63 may be set in advance so as to react only to the character string describing the button 60.
  • In either case, the user can perform an intuitive touch operation, and operability is improved. A sketch of such a button-plus-label reaction area follows below.
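  • The label-inclusive variants of FIGS. 6 and 7 amount to taking the bounding box of the button rectangle together with some or all of the character boxes of its descriptive string. The sketch below reuses the hypothetical Rect type from the earlier sketches; label_chars stands for per-character bounding boxes, which a real GUI toolkit would supply in its own way, so the helper names here are assumptions rather than anything from the patent.

```python
from typing import Optional

def bounding_union(rects: list[Rect]) -> Rect:
    # Smallest rectangle that covers every rectangle in the list.
    x1 = min(r.x for r in rects)
    y1 = min(r.y for r in rects)
    x2 = max(r.x + r.w for r in rects)
    y2 = max(r.y + r.h for r in rects)
    return Rect(x1, y1, x2 - x1, y2 - y1)

def label_reaction_area(button_rect: Rect, label_chars: list[Rect],
                        n_chars: Optional[int] = None) -> Rect:
    """n_chars=1    -> button plus the first character of its label (like reaction area 61),
       n_chars=None -> button plus the whole label (like reaction area 62)."""
    chars = label_chars if n_chars is None else label_chars[:n_chars]
    return bounding_union([button_rect, *chars])
```

  • A label-only area in the spirit of the reaction area 63 would simply be bounding_union(label_chars), without the button rectangle.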
  • As described above, in the touch panel device of the present embodiment, when an operation button (or its reaction region) is lower than the predetermined height and no other operation button is arranged adjacent to it, the touch panel 25 enlarges the reaction region in advance in the direction in which no other operation button is arranged.
  • The touch panel 25 also expands the reaction area in advance so as to include an operation button and part or all of a character string related to the operation button. As a result, an operation is not invalidated even when the user presses slightly outside a reaction area that is lower than the predetermined height, or presses the displayed character string related to the operation button rather than the button itself.
  • Therefore, the touch panel device according to the present embodiment, and the endoscope system including the touch panel device, can prevent an operation from being invalidated when the user fails to touch precisely inside the reaction region.
  • In the first embodiment, a touch panel device and an endoscope system in which the reaction area of an operation button is enlarged in advance have been described.
  • In the second embodiment, a touch panel device and an endoscope system in which the reaction area is enlarged after the user presses the operation button will be described. The overall configuration of the endoscope system 1 is the same as in the first embodiment, and only the control that differs from the first embodiment is described below.
  • FIG. 8 is a diagram for explaining an example of the reaction areas of various operation buttons according to the second embodiment, and FIG. 9 is a diagram for explaining an example of a reaction area on which the enlargement process has been executed. In FIGS. 8 and 9, the same components as those in FIG. 5 are denoted by the same reference numerals and their description is omitted.
  • the internal memory button 43 is set with a reaction area 80 having substantially the same size as the display area of the internal memory button 43.
  • Similarly, the portable memory button 44 is set with a reaction area 81 having approximately the same size as the display area of the portable memory button 44, and the information button 45 is set with a reaction area 82 having approximately the same size as the display area of the information button 45.
  • As in the first embodiment, the touch panel 25 validates an operation when the position pressed by the user's finger or the like and the position at which the finger is released lie in the same reaction area. That is, if the user presses the reaction area 80 of the internal memory button 43 with a finger or the like and then, while still pressing, slides the finger outside the reaction area 80 before releasing, the operation would be invalidated.
  • When the user presses the internal memory button 43, the control unit 24 detects from the operation signal (pressed-coordinate signal) that the reaction region 80 has been pressed.
  • On detecting the press, the control unit 24 expands the reaction area 80 of the internal memory button 43 to a reaction area 84, as shown in FIG. 9. In the example of FIG. 9, the control unit 24 enlarges the reaction region 80 upward to obtain the reaction region 84.
  • The control unit 24 then validates the pressing of the internal memory button 43 when the coordinates at which the release is detected are within the reaction region 84.
  • In FIG. 9, the control unit 24 enlarges the reaction area only on the upper side, but the enlargement is not limited to this.
  • The reaction area of the pressed operation button may be enlarged so as to overlap the reaction areas of other operation buttons.
  • For example, the control unit 24 may change the reaction area 82 of the information button 45 shown in FIG. 8 to a reaction area 85 that is enlarged in the vertical and horizontal directions, as shown in FIG. 9.
  • The reaction area to be expanded is not limited to that of an operation button lower than the predetermined height, such as the internal memory button 43; the control unit 24 may also enlarge the reaction area of an operation button having the predetermined height or more.
  • For example, the control unit 24 may perform a process of expanding the reaction area 47 of the home button 40 in the left and downward directions.
  • Likewise, when the button 60 of FIG. 6 is pressed, the control unit 24 may perform a process of expanding its reaction area.
  • FIG. 10 is a flowchart for explaining an example of the flow of the reaction region enlargement process.
  • First, the control unit 24 detects pressing of an operation button (step S1) and expands the reaction area of the operation button (step S2). Next, the control unit 24 detects release of the operation button (step S3) and determines whether the coordinates at which the release was detected are within the enlarged reaction region (step S4).
  • If the release coordinates are within the expanded reaction region (YES in step S4), the control unit 24 determines that the release operation is valid and starts executing the button function (step S5). If they are not within the enlarged reaction region (NO in step S4), the control unit 24 determines that the release operation is invalid and does not execute the button function (step S6).
  • After step S5 or step S6, the control unit 24 returns the enlarged reaction area of the operation button to its original size and ends the process. A minimal sketch of this flow follows below.
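  • The press-expand-release flow of FIG. 10 can be sketched as follows, reusing the hypothetical Rect and Button types and the expanded helper from the earlier sketches. The expansion amount is an illustrative assumption, and press and release are simply (x, y) coordinate pairs.

```python
def press_expand_release(button: Button,
                         press: tuple[float, float],
                         release: tuple[float, float],
                         expand_px: float = 40.0) -> bool:
    """Steps S1-S6 of FIG. 10: expand the reaction area when the button is pressed,
    judge the release against the expanded area, then restore the original area."""
    if not button.reaction_area.contains(*press):             # S1: press inside the button?
        return False
    original = button.reaction_area
    button.reaction_area = expanded(original, top=expand_px)  # S2: enlarge (upward here)
    try:
        if button.reaction_area.contains(*release):           # S3/S4: release inside enlarged area?
            button.action()                                    # S5: release valid, run the function
            return True
        return False                                           # S6: release invalid, do nothing
    finally:
        button.reaction_area = original                        # after S5/S6: restore the size
```

  • With this flow, a press inside the original reaction area 80 followed by a release slightly above it (inside the enlarged area 84) is still treated as a valid button operation, and the original area is restored afterwards.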
  • As described above, when an operation button is pressed, the touch panel device of the present embodiment enlarges the reaction area of the operation button beyond the display area of the operation button.
  • Therefore, the touch panel device according to the present embodiment, and the endoscope system including the touch panel, can prevent an operation from being invalidated even when the user presses an operation button and then slides the finger or the like outside the original reaction area while still pressing.
  • The steps in the flowcharts in this specification may be executed in a different order, or a plurality of steps may be executed simultaneously, as long as doing so does not contradict their nature; the execution order may also differ for each execution.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Astronomy & Astrophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)
  • Endoscopes (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A touch panel device is provided with a display unit, a touch panel (25) arranged on the display unit, and a control unit (24) that detects a touch operation on the touch panel (25). The control unit (24) displays, on the display unit, a menu screen that includes an operation button to be operated by a touch operation, and sets the operation detection range, within which the operation button is recognized as having been operated by a touch operation, to an expanded range that includes the operation button region and extends beyond that region by a prescribed range.
PCT/JP2016/070652 2015-08-26 2016-07-13 Touch panel device and endoscope system Ceased WO2017033605A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016564109A JPWO2017033605A1 (ja) 2015-08-26 2016-07-13 Touch panel device and endoscope system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-167083 2015-08-26
JP2015167083 2015-08-26

Publications (1)

Publication Number Publication Date
WO2017033605A1 true WO2017033605A1 (fr) 2017-03-02

Family

ID=58099833

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/070652 Ceased WO2017033605A1 (fr) 2015-08-26 2016-07-13 Touch panel device and endoscope system

Country Status (2)

Country Link
JP (1) JPWO2017033605A1 (fr)
WO (1) WO2017033605A1 (fr)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4037378B2 (ja) * 2004-03-26 2008-01-23 Sharp Corp Information processing apparatus, image output apparatus, information processing program, and recording medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008257629A (ja) * 2007-04-09 2008-10-23 Matsushita Electric Ind Co Ltd Touch input device
JP2009037344A (ja) * 2007-07-31 2009-02-19 Hoya Corp Touch panel and processor of endoscope apparatus
JP2012133718A (ja) * 2010-12-24 2012-07-12 Sharp Corp Input display device and heating cooker
JP2012247833A (ja) * 2011-05-25 2012-12-13 Pioneer Electronic Corp Information processing apparatus and method, and computer program
JP2014016714A (ja) * 2012-07-06 2014-01-30 Sharp Corp Information display device, information display method, information display program, and program recording medium
JP2015064875A (ja) * 2013-08-30 2015-04-09 Canon Marketing Japan Inc Information processing apparatus, information processing method, and program

Also Published As

Publication number Publication date
JPWO2017033605A1 (ja) 2017-08-31


Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2016564109

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16838952

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16838952

Country of ref document: EP

Kind code of ref document: A1