WO2016017879A1 - Smart device and method of controlling the same - Google Patents
Smart device and method of controlling the same
- Publication number
- WO2016017879A1 (PCT/KR2015/000330)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lines
- signals
- driving
- sensing
- gesture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3234—Power saving characterised by the action undertaken
- G06F1/325—Power saving in peripheral device
- G06F1/3265—Power saving in display device
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/04166—Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
- G06F3/0446—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
Definitions
- Embodiments of the present invention relate to a smart device and a method of controlling the same. More particularly, the present invention relates to a smart device including a touch panel and a display panel and a method of controlling the same.
- The touch panel may be used as an input device of a smart device that includes a display panel and an application processor.
- Touch panels may be classified into a resistive type, a capacitive type, an electromagnetic type, and the like.
- A mutual capacitance type touch panel may detect a touch position by measuring a mutual capacitance that is changed when a conductor touches the crossing of a driving line and a sensing line.
- The mutual capacitance touch panel may include a plurality of driving lines and a plurality of sensing lines crossing the driving lines. When driving signals are sequentially applied to the driving lines, the mutual capacitance generated by the application of the driving signals may be detected from the sensing lines.
- The driving lines and the sensing lines may be connected to a touch integrated circuit, and the touch integrated circuit may apply the driving signals and obtain touch signals from the sensing lines.
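- The following Python sketch illustrates the scanning principle just described; it is a minimal illustration only, and the array sizes, the `measure` callback, and all names are assumptions made for illustration rather than anything specified in this publication.

```python
# Minimal sketch of mutual-capacitance scanning; sizes and names are illustrative.
from typing import Callable, List, Optional, Tuple

def scan_frame(num_drive: int, num_sense: int,
               measure: Callable[[int, int], float]) -> List[List[float]]:
    """Sequentially apply a driving signal to each driving line and sample every sensing line."""
    return [[measure(d, s) for s in range(num_sense)] for d in range(num_drive)]

def locate_touch(frame: List[List[float]], baseline: List[List[float]],
                 threshold: float = 1.0) -> Optional[Tuple[int, int]]:
    """Return the (driving, sensing) crossing whose mutual capacitance dropped most below baseline."""
    best, where = threshold, None
    for d, row in enumerate(frame):
        for s, value in enumerate(row):
            delta = baseline[d][s] - value   # a touching conductor reduces the mutual capacitance
            if delta > best:
                best, where = delta, (d, s)
    return where

# Example with a fake measurement function (no touch anywhere):
flat = scan_frame(8, 8, lambda d, s: 10.0)
print(locate_touch(scan_frame(8, 8, lambda d, s: 10.0), flat))   # -> None
```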
- To reduce power consumption of the smart device, a standby mode that turns off the display panel and the touch panel may be used when no application program is running or when a sound source playback program such as a music player is running.
- The standby mode may be released by inputting a password or a preset touch pattern.
- Korean Patent Laid-Open Publication No. 10-2009-0058875 discloses a method of releasing a standby mode and thereby turning on the display panel by inputting a preset touch pattern.
- However, when operation of the display panel is unnecessary, for example, when a sound source playback program such as a music player is used or when a ringing mode or a vibration mode is selected, the display panel is turned on unnecessarily, thus increasing power consumption.
- In addition, the touch panel must be maintained in an operating state in order to detect the touch pattern.
- To solve the above problems, embodiments of the present invention provide a smart device capable of reducing power consumption and a method of controlling the same.
- According to an embodiment for achieving the above object, a smart device may include a touch panel including a plurality of driving lines and a plurality of sensing lines crossing the driving lines, a display panel coupled to the touch panel, a touch integrated circuit configured to provide driving signals to the driving lines and to receive sensing signals from odd-numbered or even-numbered sensing lines among the sensing lines, and an application processor configured to operate an application program according to the sensing signals.
- The touch integrated circuit may provide driving signals to the driving lines and obtain first scan data from the odd-numbered sensing lines, and may again provide driving signals to the driving lines and obtain second scan data from the even-numbered sensing lines.
- When a gesture is input to the touch panel while the display panel is off, an event corresponding to the input gesture may be executed, and the display panel may remain off while the event is executed.
- The touch integrated circuit may include a touch driver configured to provide the driving signals to the driving lines, a signal processor configured to receive the sensing signals from the sensing lines and convert the sensing signals into digital signals, and a controller configured to transmit a command corresponding to the event to the application processor according to the digital signals.
- The controller may compare the input gesture with preset gestures, select a gesture corresponding to the input gesture from among the preset gestures, and transmit the command corresponding to the selected gesture to the application processor in order to execute the event.
- The touch panel may be divided into a plurality of regions, and the touch integrated circuit may provide the driving signals to the driving lines passing through any one of the regions.
- According to another embodiment for achieving the above object, a smart device may include a touch panel including a plurality of driving lines and a plurality of sensing lines crossing the driving lines, a display panel coupled to the touch panel, a touch integrated circuit configured to divide the touch panel into a plurality of regions, select any one of the regions, provide driving signals to driving lines passing through the selected region, and receive sensing signals from sensing lines passing through the selected region, and an application processor configured to operate an application program according to the sensing signals.
- A method of controlling a smart device for achieving the above object may include inputting a gesture on the touch panel, detecting the gesture, and executing an event corresponding to the gesture.
- The gesture may be detected by applying driving signals to the driving lines of the touch panel and receiving sensing signals from the odd-numbered or even-numbered sensing lines of the touch panel.
- The detecting of the gesture may include providing driving signals to the driving lines and acquiring first scan data from the odd-numbered sensing lines, and providing driving signals to the driving lines and acquiring second scan data from the even-numbered sensing lines.
- The touch panel may be divided into a plurality of regions, and the driving signals may be provided to the driving lines passing through any one of the regions.
- The gesture input step and the event execution step may be performed while the display panel of the smart device is turned off.
- In addition, operation control of an application program or an environment setting program of the smart device may be performed while the display panel is turned off.
- For example, the application program may include a sound source playback program, a flash on/off program, a music search program, a weather search program, a recording program, a radio, a voice navigation, and the like.
- The environment setting program may include a telephone call, a WiFi mode, a Bluetooth mode, a short range wireless communication mode, a ringtone mode, and a vibration mode.
- The touch panel may be divided into a plurality of areas, and the application program and the environment setting program may be allocated to the areas, respectively.
- The method may further include comparing the input gesture with preset gestures, selecting a gesture corresponding to the input gesture from among the preset gestures, and transmitting a command corresponding to the selected gesture to the application processor of the smart device in order to execute the event.
- The driving signals may be sequentially provided to the driving lines.
- Alternatively, a predetermined number of driving signals may be simultaneously provided to the driving lines.
- The gesture may be input while the conductor is in contact with the touch panel or while the conductor is spaced apart from the touch panel by a predetermined distance.
- According to the embodiments described above, a gesture may be input to the touch panel while the display panel is turned off, the touch integrated circuit may determine the event corresponding to the input gesture, and the command corresponding to the event may be transmitted to the application processor.
- The display panel may remain off while the event is executed by the application processor, so power consumption of the smart device can be sufficiently reduced.
- In addition, the touch panel may be operated in a partial scan mode while the display panel is kept off.
- In the partial scan mode, driving signals may be selectively provided to the odd-numbered and/or even-numbered driving lines, and sensing signals may be obtained from the odd-numbered or even-numbered sensing lines.
- Alternatively, a partial area of the touch panel may be used as the touch area, and thus the number of driving signals and/or the number of sensing signals used to obtain the scan data constituting each frame may be reduced. As a result, power consumption of the touch panel may be greatly reduced compared with the case in which the touch panel is operated in the full scan mode.
- FIG. 1 is a block diagram illustrating a smart device according to an embodiment of the present invention.
- FIG. 2 is a schematic diagram illustrating the touch panel and the touch integrated circuit shown in FIG. 1.
- FIGS. 3 and 4 are schematic diagrams for describing a partial scan mode of a touch panel.
- FIG. 5 is a schematic diagram illustrating a full scan mode of a touch panel.
- FIGS. 6 to 9 are schematic diagrams for describing another example of the partial scan mode.
- FIGS. 10 and 11 are schematic diagrams for describing another example of the partial scan mode.
- FIGS. 12 and 13 are schematic diagrams for describing still other examples of the partial scan mode.
- FIGS. 14 to 26 are schematic diagrams for describing events operable by a gesture input.
- FIG. 27 is a schematic diagram for describing programs allocated to touch areas.
- FIG. 28 is a flowchart illustrating a control method of a smart device according to an embodiment of the present invention.
- When an element is described as being disposed or connected on another element or layer, the element may be disposed or connected directly on the other element, or other elements or layers may be interposed therebetween. Alternatively, when an element is described as being directly disposed or connected on another element, no other element is interposed between them. Terms such as first, second, and third may be used to describe various items such as elements, compositions, regions, layers, and/or portions, but the items are not limited by these terms.
- Embodiments of the invention are described with reference to schematic illustrations of idealized embodiments of the invention. Accordingly, variations from the illustrated shapes, for example due to manufacturing methods and/or tolerances, are to be expected. Embodiments of the invention therefore should not be construed as limited to the particular shapes of the regions illustrated herein but are to include deviations in shape; the regions illustrated in the figures are entirely schematic, and their shapes are not intended to depict the precise shape of a region nor to limit the scope of the invention.
- FIG. 1 is a block diagram illustrating a smart device according to an embodiment of the present invention.
- A smart device 100 may include a touch panel 110, a display panel 120 coupled to the touch panel 110, a touch integrated circuit 130 for sensing electrical signals from the touch panel 110, an application processor 140 for operating an application program according to the electrical signals, and the like.
- The touch panel 110 may be disposed above or below the display panel 120.
- A mutual capacitance type touch panel may be used as the touch panel 110.
- FIG. 2 is a schematic diagram illustrating a touch panel and a touch integrated circuit shown in FIG. 1.
- The touch panel 110 may include a plurality of driving lines 112 and sensing lines 114 that cross the driving lines 112 perpendicularly.
- The sensing lines 114 may be disposed on the driving lines 112, and an insulating layer (not shown) may be disposed between the driving lines 112 and the sensing lines 114.
- The touch integrated circuit 130 may include a touch driver 132 for providing driving signals to the driving lines 112 of the touch panel 110, a signal processor 134 for receiving electrical signals generated from the sensing lines 114 of the touch panel 110, i.e., sensing signals, and converting the electrical signals into digital signals, and a controller 136 for transmitting a command to the application processor 140 according to the digital signals.
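- As a rough illustration of how these three blocks could cooperate, the Python sketch below wires a touch driver, a signal processor, and a controller together; the class and method names are hypothetical and are not taken from the publication.

```python
# Illustrative decomposition of the touch integrated circuit 130 into three blocks.
class TouchDriver:                          # stands in for the touch driver 132
    def apply(self, driving_line: int) -> None:
        # Placeholder for the analog front end that pulses one driving line.
        print(f"driving line D{driving_line} pulsed")

class SignalProcessor:                      # stands in for the signal processor 134
    def read(self, sensing_lines):
        # Placeholder ADC: one digital sample per requested sensing line.
        return {s: 0 for s in sensing_lines}

class Controller:                           # stands in for the controller 136
    def __init__(self, send_command):
        self.send_command = send_command    # callback toward the application processor 140

    def handle(self, digital_samples) -> None:
        # Gesture recognition would run here; the command below is a stand-in.
        self.send_command("EVENT_PLACEHOLDER")

# Wiring the blocks together:
driver, dsp = TouchDriver(), SignalProcessor()
controller = Controller(send_command=lambda cmd: print("to application processor:", cmd))
driver.apply(1)
controller.handle(dsp.read([1, 3, 5, 7]))
```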
- The smart device 100 may execute a low power mode to reduce power consumption, and the display panel 120 may be turned off in the low power mode.
- The low power mode may be executed through a power button (not shown) of the smart device 100.
- Alternatively, the low power mode may be executed automatically.
- For example, the low power mode may be executed while a sound source playback program such as a music player is running.
- When a gesture is input in the low power mode, the smart device 100 may execute an event corresponding to the input gesture.
- In this case, the touch panel 110 may be switched to the partial scan mode.
- The low power mode of the display panel 120 and the partial scan mode of the touch panel 110 may be continuously maintained in order to reduce power consumption.
- FIGS. 3 and 4 are schematic diagrams for describing the partial scan mode of the touch panel.
- FIG. 5 is a schematic diagram for describing the full scan mode of the touch panel.
- The partial scan mode of the touch panel 110 may be executed while the display panel 120 is operated in the low power mode. That is, the partial scan mode of the touch panel 110 may be used to detect the gesture while the display panel 120 is kept off.
- While the partial scan mode is executed, driving signals may be provided to the driving lines 112 to detect the gesture, and sensing signals corresponding to the driving signals may be received from the odd-numbered or even-numbered sensing lines 114A or 114B among the sensing lines 114.
- As shown in FIG. 3, first scan data constituting a first frame may include sensing signals received from the odd-numbered sensing lines 114A while driving signals are sequentially provided to the driving lines 112. That is, the first scan data may include the sensing signals S1, S3, S5, and S7 obtained from the odd-numbered sensing lines 114A after each of the first through eighth driving signals D1 to D8 is provided.
- As shown in FIG. 4, second scan data constituting a second frame may include sensing signals received from the even-numbered sensing lines 114B while driving signals are sequentially provided to the driving lines 112. That is, the second scan data may include the sensing signals S2, S4, S6, and S8 obtained from the even-numbered sensing lines 114B after each of the first through eighth driving signals D1 to D8 is provided.
- The controller 136 may repeatedly acquire the first scan data and the second scan data to detect the gesture.
- Since each frame is composed of sensing signals received from only the odd-numbered or even-numbered sensing lines 114A or 114B after the driving signals are provided to the driving lines 112, power consumption of the touch panel 110 can be greatly reduced. In particular, since the display panel 120 may remain off even while the event is executed, power consumption of the smart device 100 may be sufficiently reduced.
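- A hedged sketch of this two-frame schedule is shown below; the 8 x 8 line counts and the `sample` callback are illustrative assumptions, not values taken from the publication.

```python
# Sketch of the alternating partial-scan frames of FIGS. 3 and 4 (sizes are illustrative).
DRIVING_LINES = list(range(1, 9))            # D1..D8
SENSING_ODD   = [1, 3, 5, 7]                 # S1, S3, S5, S7
SENSING_EVEN  = [2, 4, 6, 8]                 # S2, S4, S6, S8

def scan_frame(sensing_lines, sample):
    """Drive every driving line in turn, but sample only the given sensing lines."""
    return [[sample(d, s) for s in sensing_lines] for d in DRIVING_LINES]

def partial_scan_cycle(sample):
    first_scan_data  = scan_frame(SENSING_ODD, sample)    # first frame (FIG. 3)
    second_scan_data = scan_frame(SENSING_EVEN, sample)   # second frame (FIG. 4)
    return first_scan_data, second_scan_data

# Each frame takes 8 x 4 = 32 samples instead of the 8 x 8 = 64 of a full scan,
# which is where the power saving of this partial scan mode comes from.
first, second = partial_scan_cycle(lambda d, s: 0.0)
```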
- Meanwhile, the operation mode of the touch panel 110 may be switched from the partial scan mode to the full scan mode.
- The full scan mode may be used to improve linearity and accuracy of the touch panel 110.
- In the full scan mode, driving signals may be sequentially provided to the driving lines 112, and scan data may be obtained from the electrical signals detected on the sensing lines 114. That is, the driving signals may be sequentially provided to all the driving lines 112 in every frame.
- When a touch signal is detected by the controller 136, the touch panel 110 may be switched from the partial scan mode to the full scan mode in order to accurately detect the gesture. However, even in this case, the display panel 120 may remain off, and when no touch signal is detected for a preset time, the touch panel 110 may be switched back to the partial scan mode.
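- One way to express this mode-switching policy is the small state machine below; the two-second idle timeout is an invented placeholder, since the publication only refers to "a preset time".

```python
# Sketch of switching between partial and full scan; the timeout value is illustrative.
import time

class ScanModeController:
    PARTIAL, FULL = "partial", "full"

    def __init__(self, idle_timeout_s: float = 2.0):
        self.mode = self.PARTIAL
        self.idle_timeout_s = idle_timeout_s
        self.last_touch = None

    def on_frame(self, touch_detected: bool, now=None) -> str:
        now = time.monotonic() if now is None else now
        if touch_detected:
            self.last_touch = now
            self.mode = self.FULL        # full scan for accurate gesture tracking
        elif (self.mode == self.FULL and self.last_touch is not None
              and now - self.last_touch > self.idle_timeout_s):
            self.mode = self.PARTIAL     # fall back after a quiet period
        return self.mode                 # the display panel stays off in both modes

controller = ScanModeController()
print(controller.on_frame(True))                               # -> full
print(controller.on_frame(False, now=time.monotonic() + 10))   # -> partial
```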
- In addition, a predetermined number of driving signals may be simultaneously provided to the driving lines 112.
- For example, a plurality of driving signals D1, D2, D3, and D4 may be simultaneously provided to some of the driving lines 112, and the plurality of driving signals D5, D6, D7, and D8 may then be simultaneously provided to the remaining driving lines 112.
- The gesture may be input in a state in which a conductor, for example, a user's hand, is in contact with the touch panel 110, or in a state in which the conductor is spaced apart from the touch panel 110 by a predetermined distance. That is, the gesture may be input by dragging the conductor across the touch panel 110 in a contact or non-contact state.
- In the non-contact state, the distance between the conductor and the touch panel 110 may be several millimeters to several centimeters, for example, about 5 mm to 5 cm.
- FIGS. 6 to 9 are schematic diagrams for describing another example of the partial scan mode.
- In this example, the touch driver 132 may selectively provide driving signals to the odd-numbered or even-numbered driving lines 112A or 112B while the touch panel 110 is operated in the partial scan mode.
- In addition, the signal processor 134 may selectively receive sensing signals from the odd-numbered or even-numbered sensing lines 114A or 114B.
- As shown in FIG. 6, first scan data constituting a first frame may include sensing signals received from the odd-numbered sensing lines 114A while driving signals are sequentially provided to the odd-numbered driving lines 112A. That is, the first scan data may include the sensing signals obtained from the odd-numbered sensing lines 114A after each of the first, third, fifth, and seventh driving signals D1, D3, D5, and D7 is provided.
- As shown in FIG. 7, second scan data constituting a second frame may include sensing signals received from the even-numbered sensing lines 114B while driving signals are sequentially provided to the odd-numbered driving lines 112A. That is, the second scan data may include the sensing signals obtained from the even-numbered sensing lines 114B after each of the driving signals D1, D3, D5, and D7 is provided.
- As shown in FIG. 8, third scan data constituting a third frame may include sensing signals received from the odd-numbered sensing lines 114A while driving signals are sequentially provided to the even-numbered driving lines 112B. That is, the third scan data may include the sensing signals obtained from the odd-numbered sensing lines 114A after each of the second, fourth, sixth, and eighth driving signals D2, D4, D6, and D8 is provided.
- As shown in FIG. 9, fourth scan data constituting a fourth frame may include sensing signals received from the even-numbered sensing lines 114B while driving signals are sequentially provided to the even-numbered driving lines 112B. That is, the fourth scan data may include the sensing signals obtained from the even-numbered sensing lines 114B after each of the driving signals D2, D4, D6, and D8 is provided.
- The controller 136 may repeatedly acquire the first to fourth scan data in order to recognize the gesture. The order in which the first to fourth scan data are acquired may be changed, and the scope of the present invention is not limited thereby.
- Since each frame is composed of sensing signals selectively received from the odd-numbered or even-numbered sensing lines 114A or 114B after driving signals are provided to the odd-numbered or even-numbered driving lines 112A or 112B, power consumption of the touch panel 110 can be greatly reduced.
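- The four frames can be expressed as a simple schedule, as in the sketch below; the line counts and names are again assumptions made for illustration.

```python
# Sketch of the four-frame cycle of FIGS. 6 to 9: odd/even driving lines are
# combined with odd/even sensing lines (line counts are illustrative).
DRIVING_ODD, DRIVING_EVEN = [1, 3, 5, 7], [2, 4, 6, 8]
SENSING_ODD, SENSING_EVEN = [1, 3, 5, 7], [2, 4, 6, 8]

FRAME_SCHEDULE = [
    (DRIVING_ODD,  SENSING_ODD),    # first scan data  (FIG. 6)
    (DRIVING_ODD,  SENSING_EVEN),   # second scan data (FIG. 7)
    (DRIVING_EVEN, SENSING_ODD),    # third scan data  (FIG. 8)
    (DRIVING_EVEN, SENSING_EVEN),   # fourth scan data (FIG. 9)
]

def scan_cycle(sample):
    """Yield one frame per schedule entry; each frame needs only 4 x 4 samples."""
    for driving_lines, sensing_lines in FRAME_SCHEDULE:
        yield [[sample(d, s) for s in sensing_lines] for d in driving_lines]

frames = list(scan_cycle(lambda d, s: 0.0))   # four frames per cycle
```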
- FIGS. 10 and 11 are schematic diagrams for describing another example of the partial scan mode.
- In this example, the controller 136 may divide the touch panel 110 into a plurality of areas and select one of the areas.
- The selected area may be used as a touch area TA, and the driving signals may be selectively provided to the touch area TA.
- Driving signals may be provided to the driving lines 112 disposed in the touch area TA, and scan data may be obtained from the odd-numbered and even-numbered sensing lines 114A and 114B passing through the touch area TA.
- First scan data may be obtained from the odd-numbered sensing lines 114A after the driving signals D1 and D3 are provided to the odd-numbered driving lines passing through the touch area TA, and second scan data may be obtained from the even-numbered sensing lines 114B after the driving signals D1 and D3 are provided to the odd-numbered driving lines.
- Third scan data may be obtained from the odd-numbered sensing lines 114A after the driving signals D2 and D4 are provided to the even-numbered driving lines, and fourth scan data may be obtained from the even-numbered sensing lines 114B after the driving signals D2 and D4 are provided to the even-numbered driving lines.
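- A possible way to restrict the scan to the selected touch area is sketched below; the region boundaries, the parity handling, and the helper names are assumptions made for illustration.

```python
# Sketch of scanning only the lines passing through a selected touch area TA.
def lines_in_region(first: int, last: int, parity: str = "all"):
    """Return the line indices inside a region, optionally only the odd or even ones."""
    lines = range(first, last + 1)
    if parity == "odd":
        return [n for n in lines if n % 2 == 1]
    if parity == "even":
        return [n for n in lines if n % 2 == 0]
    return list(lines)

def scan_touch_area(sample, drive_first, drive_last, sense_first, sense_last):
    """Drive and sense only the lines that pass through the touch area, in four parity frames."""
    frames = []
    for drive_parity, sense_parity in [("odd", "odd"), ("odd", "even"),
                                       ("even", "odd"), ("even", "even")]:
        drives = lines_in_region(drive_first, drive_last, drive_parity)
        senses = lines_in_region(sense_first, sense_last, sense_parity)
        frames.append([[sample(d, s) for s in senses] for d in drives])
    return frames

# Example: a touch area covering driving lines 1-4 and sensing lines 1-4.
frames = scan_touch_area(lambda d, s: 0.0, 1, 4, 1, 4)
```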
- FIGS. 12 and 13 are schematic diagrams for describing still other examples of the partial scan mode.
- In these examples, the controller 136 may divide the touch panel 110 into a plurality of areas and select one of the areas.
- The selected area may be used as the touch area TA, and the driving signals may be selectively provided to the touch area TA.
- In particular, driving signals may be sequentially provided to the driving lines 112 passing through the touch area TA, and the sensing signals may be obtained from the sensing lines 114 passing through the touch area TA.
- Since the number of driving signals and/or sensing signals used to obtain the scan data constituting each frame may be reduced, power consumption of the touch panel 110 may be greatly reduced. That is, the power consumption of the touch panel 110 can be greatly reduced by selectively using a partial region of the touch panel 110.
- The event may include a function of an application program operated while the display panel 120 is off or a function of an environment setting program of the smart device 100.
- For example, the application program may include a sound source playback program, a flash on/off program, a music search program, a weather search program, a recording program, a radio, a voice navigation, and the like, and the environment setting program may include a phone call, a WiFi mode, a Bluetooth mode, a near field communication mode, a ringing mode, a vibration mode, and the like.
- The sound source playback program may be executed while the display panel 120 is off, and the flash on/off program may be used to operate a flash when the display panel 120 is off in a dark environment.
- FIGS. 14 to 26 are schematic diagrams for describing events operable by a gesture input.
- The controller 136 may set various gestures and commands corresponding to the gestures, and the application processor 140 may execute events corresponding to the commands, as illustrated in the sketch following these examples.
- In the sound source playback program, play, stop, volume up/down, random play, and the like may be set.
- In the environment setting program, a vibration mode, a ringtone mode, and Bluetooth on/off may be set.
- A phone call function may be performed by the input of a gesture.
- In this case, the phone number corresponding to the dialing gesture may be set in advance.
- A music search program may be operated by the input of a gesture. Specifically, music received through the microphone of the smart device may be searched, and the search result may be output through the speaker.
- A weather search program may be operated by the input of a gesture.
- For example, weather information may be searched through the input of the gesture, and the search result may be output through the speaker.
- A recording program may be operated by the input of a gesture.
- A radio may be operated by the input of a gesture.
- A voice navigation may be operated by the input of a gesture.
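- A minimal sketch of such a gesture-to-command mapping is given below; every gesture name and command string in it is invented for illustration and is not taken from the publication.

```python
# Hypothetical gesture-to-command table such as the controller 136 might hold.
GESTURE_COMMANDS = {
    "swipe_right": "MUSIC_NEXT",         # sound source playback control
    "swipe_left":  "MUSIC_PREV",
    "swipe_up":    "VOLUME_UP",
    "swipe_down":  "VOLUME_DOWN",
    "circle":      "FLASH_TOGGLE",       # flash on/off program
    "letter_w":    "WEATHER_SEARCH",     # weather search program
    "letter_m":    "MUSIC_SEARCH",       # music search program
    "digits":      "PHONE_CALL_PRESET",  # dial a phone number set in advance
}

def command_for(gesture_name: str):
    """Return the command to send to the application processor, if the gesture is known."""
    return GESTURE_COMMANDS.get(gesture_name)

print(command_for("circle"))   # -> FLASH_TOGGLE
```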
- FIG. 27 is a schematic diagram for describing programs allocated to touch areas.
- The controller 136 may divide the touch panel 110 into a plurality of areas, and the areas may be used as touch areas TA1 and TA2, respectively.
- For example, the touch panel 110 may be divided into a first touch area TA1 and a second touch area TA2, and the application program and the environment setting program may be allocated to the first and second touch areas TA1 and TA2, respectively.
- The signal processor 134 may receive electrical signals generated by the input gesture, convert the electrical signals into digital signals, and transmit the digital signals to the controller 136.
- The controller 136 may transmit a command corresponding to the event to the application processor 140 according to the digital signals.
- Specifically, the controller 136 may compare the input gesture with preset gestures, select a gesture corresponding to the input gesture from among the preset gestures, and transmit the command corresponding to the selected gesture to the application processor 140 in order to execute the event.
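- The comparison and selection could be done, for example, by reducing each gesture to a direction sequence and scoring it against the presets, as in the hedged sketch below; this representation and the scoring rule are assumptions, not the publication's method.

```python
# Sketch of matching an input gesture against preset gestures and dispatching a command.
from typing import Callable, Dict, List, Optional

PRESET_GESTURES: Dict[str, List[str]] = {      # presets as direction sequences (illustrative)
    "swipe_right": ["right"],
    "zigzag":      ["right", "down", "right"],
    "circle":      ["right", "down", "left", "up"],
}

def select_gesture(input_dirs: List[str]) -> Optional[str]:
    """Pick the preset gesture that best matches the input direction sequence."""
    def score(preset: List[str]) -> int:
        matches = sum(a == b for a, b in zip(preset, input_dirs))
        return matches - abs(len(preset) - len(input_dirs))
    best = max(PRESET_GESTURES, key=lambda name: score(PRESET_GESTURES[name]))
    return best if score(PRESET_GESTURES[best]) > 0 else None

def dispatch(input_dirs: List[str], send_to_ap: Callable[[str], None]) -> None:
    """Transmit the command for the selected gesture to the application processor."""
    selected = select_gesture(input_dirs)
    if selected is not None:
        send_to_ap(f"EXECUTE_{selected.upper()}")

dispatch(["right", "down", "right"], send_to_ap=print)   # prints EXECUTE_ZIGZAG
```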
- FIG. 28 is a flowchart illustrating a control method of a smart device according to an embodiment of the present invention.
- First, a gesture may be input on the touch panel 110.
- At this time, the touch panel 110 may be operated in a partial scan mode, and the display panel 120 may be maintained in an off state.
- Next, the input gesture may be detected.
- For example, the touch driver 132 may provide driving signals to the driving lines 112, and the signal processor 134 may receive sensing signals from the odd-numbered or even-numbered sensing lines 114A or 114B.
- As shown in FIG. 3, the signal processor 134 may receive first sensing signals constituting a first frame from the odd-numbered sensing lines 114A after driving signals are applied to the driving lines 112, and, as shown in FIG. 4, may receive second sensing signals constituting a second frame from the even-numbered sensing lines 114B after driving signals are applied to the driving lines 112.
- The signal processor 134 may convert the first and second sensing signals into first and second digital signals and transmit the first and second digital signals to the controller 136.
- The controller 136 may repeatedly acquire the first and second digital signals and thereby detect the gesture.
- Alternatively, the touch driver 132 may provide driving signals to the odd-numbered or even-numbered driving lines 112A or 112B, and the signal processor 134 may receive sensing signals from the odd-numbered or even-numbered sensing lines 114A or 114B.
- As shown in FIG. 6, the signal processor 134 may receive first sensing signals constituting a first frame from the odd-numbered sensing lines 114A after driving signals are applied to the odd-numbered driving lines 112A, and, as shown in FIG. 7, may receive second sensing signals constituting a second frame from the even-numbered sensing lines 114B after driving signals are applied to the odd-numbered driving lines 112A.
- As shown in FIG. 8, the signal processor 134 may receive third sensing signals constituting a third frame from the odd-numbered sensing lines 114A after driving signals are applied to the even-numbered driving lines 112B, and, as shown in FIG. 9, may receive fourth sensing signals constituting a fourth frame from the even-numbered sensing lines 114B after driving signals are applied to the even-numbered driving lines 112B.
- The signal processor 134 may convert the first to fourth sensing signals into first to fourth digital signals and transmit the first to fourth digital signals to the controller 136.
- The controller 136 may repeatedly acquire the first to fourth digital signals and thereby detect the gesture.
- Alternatively, a partial area of the touch panel 110 may be used as the touch area TA, and a user may input the gesture in the touch area TA.
- In this case, the touch integrated circuit 130 may detect the gesture by providing driving signals to the touch area TA and receiving sensing signals from the touch area TA.
- Next, the input gesture may be compared with preset gestures.
- Then, a gesture corresponding to the input gesture may be selected from among the preset gestures.
- Subsequently, the command corresponding to the event may be transmitted to the application processor 140 in order to execute the event corresponding to the selected gesture.
- The steps S120, S130, and S140 may be performed by the controller 136.
- The application processor 140 may then execute the event corresponding to the transmitted command.
- In particular, the display panel 120 may be maintained in an off state while the steps S100 to S150 are performed.
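- The whole flow of steps S100 to S150 can be summarized as the loop below; the component objects, method names, and the exact step-to-call mapping are illustrative assumptions rather than details from the publication.

```python
# End-to-end sketch of the control method with the display panel kept off.
def control_loop(touch_panel, controller, application_processor, display):
    display.turn_off()                                 # low power mode
    touch_panel.set_mode("partial_scan")
    while display.is_off():
        raw = touch_panel.read_frames()                # gesture is input and frames are captured
        gesture = controller.detect_gesture(raw)       # detect the gesture
        if gesture is None:
            continue
        preset = controller.select_matching(gesture)   # compare with preset gestures and select
        if preset is None:
            continue
        command = controller.command_for(preset)       # command corresponding to the event
        application_processor.execute(command)         # the event runs; the display stays off
```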
- As described above, a gesture may be input to the touch panel 110 while the display panel 120 is off, and the touch integrated circuit 130 may detect the input gesture.
- The command corresponding to the event may then be transmitted to the application processor 140 in order to execute the event corresponding to the gesture.
- The display panel 120 may remain off while the event is executed by the application processor 140, so power consumption of the smart device 100 may be sufficiently reduced.
- In addition, the touch panel 110 may be operated in a partial scan mode while the display panel 120 is maintained in the off state.
- To obtain the scan data constituting one frame, driving signals may be selectively provided to the odd-numbered and/or even-numbered driving lines 112A and/or 112B, and sensing signals may be obtained from the odd-numbered or even-numbered sensing lines 114A or 114B.
- Alternatively, a partial area of the touch panel 110 may be used as the touch area TA, and thus the number of driving signals and/or the number of sensing signals used to obtain the scan data constituting each frame may be reduced.
- As a result, the power consumption of the touch panel 110 may be significantly reduced compared to the case in which the touch panel 110 is operated in the full scan mode. Accordingly, the smart device and the method of controlling the same according to the embodiments of the present invention can be advantageously used in the manufacture of smart devices.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Position Input By Displaying (AREA)
Abstract
The present invention relates to a smart device and a method of controlling the same. The smart device comprises: a touch panel including a plurality of driving lines and a plurality of sensing lines crossing the driving lines; a display panel coupled to the touch panel; a touch integrated circuit that provides driving signals to the driving lines and receives sensing signals from even-numbered or odd-numbered sensing lines among the sensing lines; and an application processor for operating an application program according to the sensing signals.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR10-2014-0097274 | 2014-07-30 | ||
| KR1020140097274A KR20160014983A (ko) | 2014-07-30 | 2014-07-30 | Smart device and method of controlling the same |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016017879A1 (fr) | 2016-02-04 |
Family
ID=55217749
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2015/000330 Ceased WO2016017879A1 (fr) | 2015-01-13 | Smart device and method of controlling the same |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR20160014983A (fr) |
| WO (1) | WO2016017879A1 (fr) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112181265B (zh) * | 2019-07-04 | 2022-04-15 | 北京小米移动软件有限公司 | Touch signal processing method, device, and medium |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20100099394A1 (en) * | 2008-10-17 | 2010-04-22 | Sony Ericsson Mobile Communications Ab | Method of unlocking a mobile electronic device |
| KR20130096919A (ko) * | 2012-02-23 | 2013-09-02 | 주식회사 팬택 | Portable terminal operated based on touch and operation method thereof |
| US20130298073A1 (en) * | 2007-01-19 | 2013-11-07 | Lg Electronics Inc. | Electronic device and control method thereof |
| WO2013183925A1 (fr) * | 2012-06-04 | 2013-12-12 | 크루셜텍 주식회사 | Touch detection method and apparatus |
| WO2014014218A1 (fr) * | 2012-07-20 | 2014-01-23 | (주)실리콘화일 | Subdivided scanning method for a touch panel |
-
2014
- 2014-07-30 KR KR1020140097274A patent/KR20160014983A/ko not_active Ceased
-
2015
- 2015-01-13 WO PCT/KR2015/000330 patent/WO2016017879A1/fr not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| KR20160014983A (ko) | 2016-02-12 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15828155; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15828155; Country of ref document: EP; Kind code of ref document: A1 |