
WO2022071686A1 - User interface display method and electronic device supporting the same - Google Patents

User interface display method and electronic device supporting the same

Info

Publication number
WO2022071686A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
display
application
input
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2021/012865
Other languages
English (en)
Korean (ko)
Inventor
진준호
강보순
이다현
이정원
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of WO2022071686A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545 Pens or stylus
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques using icons
    • G06F3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • G06F3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques for inputting data by handwriting, e.g. gesture or text
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • Various embodiments disclosed in this document relate to a method of displaying a user interface and an electronic device supporting the same.
  • The electronic device may display various types of user interfaces (UIs) on the display. For example, to provide various functions, the electronic device may display at least one icon for executing those functions in various forms on the display. As an example, the user may cause the electronic device to display a user interface with a specified structure and shape on the display by shaking the electronic device while holding it.
  • A foldable electronic device, which includes a flexible display at least part of which can be folded, may display a user interface with various structures and shapes based on the folding angle.
  • The electronic device may include an input device (e.g., a stylus pen, a digital pen, or a digitizer pen) that allows a user to perform more precise touch input.
  • The electronic device may include a housing into which a stylus pen can be detachably inserted. The stylus pen can provide the user with an intuitive user experience through touch input.
  • The electronic device may detect a gesture designated by the user (e.g., a shaking gesture) and provide user interfaces of various structures and forms on the display.
  • However, the user cannot see the display of the electronic device while performing the designated gesture.
  • In addition, the functions that the electronic device can provide based on a gesture made with an external input device are limited.
  • For example, a conventional electronic device could only change sound-output settings (e.g., adjust the volume and/or start or stop playback) or change the camera's angle of view in response to a gesture designated by the user.
  • An electronic device according to various embodiments may include a flexible display, a wireless communication circuit, a processor, a stylus pen that is detachably inserted into at least one region inside the housing and communicates with the wireless communication circuit, and a memory operatively coupled to the processor.
  • The memory may store one or more instructions that, when executed, cause the processor to: receive from the stylus pen a first signal including motion information of the stylus pen; based on the motion information, display a first user interface including a plurality of icons on the flexible display while the electronic device operates in a first form; when a touch input to at least one of the plurality of icons is sensed, identify at least one application corresponding to the at least one icon for which the touch input was sensed; and, in response to a second signal transmitted from the stylus pen, execute the identified at least one application and display an execution screen of the executed at least one application on the flexible display.
  • A method of displaying a user interface by an electronic device according to various embodiments may include: receiving, from a stylus pen, a first signal including motion information of the stylus pen; based on the motion information, displaying a first user interface including a plurality of icons on the flexible display of the electronic device operating in a first form; when a touch input to at least one of the plurality of icons is detected, identifying at least one application corresponding to the at least one icon for which the touch input was detected; and, in response to a second signal transmitted from the stylus pen, executing the identified at least one application and displaying an execution screen of the executed at least one application on the flexible display.
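The claimed sequence (first signal → first user interface → touch selection → second signal → execution screen) can be sketched as a small event handler. This is an illustrative sketch only, not the patented implementation; all names (`QuickLaunchUI`, `on_first_signal`, the `"shake"` gesture key) are hypothetical.

```python
# Illustrative sketch of the claimed signal flow; all names are hypothetical.
class QuickLaunchUI:
    def __init__(self, icons):
        self.icons = list(icons)   # icons shown in the first user interface
        self.selected = set()      # applications chosen via touch input
        self.visible = False

    def on_first_signal(self, motion):
        # First signal: stylus motion info -> display the first user interface.
        if motion.get("gesture") == "shake":
            self.visible = True

    def on_touch(self, icon):
        # Touch input on an icon -> identify the corresponding application.
        if self.visible and icon in self.icons:
            self.selected.add(icon)

    def on_second_signal(self):
        # Second signal from the stylus -> execute the identified application(s)
        # and return their execution screens for display; dismiss the UI.
        if self.visible and self.selected:
            launched = sorted(self.selected)
            self.visible = False
            return [f"screen:{app}" for app in launched]
        return []
```

A caller would feed stylus events into the handler in the order the claim describes; the second signal both launches the selected applications and dismisses the interface.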
  • According to various embodiments, the electronic device can operate based on motion information from an input device (e.g., a stylus pen, a digital pen, or a digitizer pen) and thereby immediately display a user interface while the device is held in a fixed state.
  • According to various embodiments, the input device may be detachable from at least a portion of a housing included in the electronic device.
  • According to various embodiments, the electronic device can sort the user interface based on specified weights (e.g., number of uses, recent execution information, association with a running application, and/or association with the input device) and display it accordingly, providing convenience to the user.
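The weight-based sorting described above can be illustrated with a simple scoring function. The weight values and field names below are hypothetical; the document only names the factors (number of uses, recent execution, association with a running application, association with the input device), not how they are combined.

```python
# Hypothetical scoring of candidate applications by the factors the text
# mentions. The numeric weights are illustrative, not from the document.
def rank_icons(apps):
    def score(app):
        return (2.0 * app.get("use_count", 0)            # number of uses
                + 5.0 * app.get("recently_used", False)  # recent execution
                + 3.0 * app.get("related_to_foreground", False)
                + 4.0 * app.get("pen_related", False))   # input-device link
    # Highest-scoring icons come first in the displayed user interface.
    return sorted(apps, key=score, reverse=True)
```

With this sketch, a rarely used but pen-related, recently run app outranks a more frequently used one, which matches the idea of weighting relevance to the stylus over raw use counts.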
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments of the present disclosure.
  • FIG. 2 is a block diagram of a display module according to various embodiments of the present disclosure.
  • FIG. 3 is a block diagram illustrating components of an electronic device according to various embodiments of the present disclosure.
  • FIG. 4 illustrates an electronic device that provides a user interface in various forms, according to various embodiments of the present disclosure.
  • FIG. 5 illustrates an electronic device that provides various user interfaces in a folded form, according to various embodiments of the present disclosure.
  • FIG. 6 illustrates an electronic device that provides various user interfaces in an unfolded form, according to various embodiments of the present disclosure.
  • FIG. 7 illustrates an electronic device that provides various user interfaces in an unfolded form, according to various embodiments of the present disclosure.
  • FIG. 8 illustrates an electronic device that provides various user interfaces in an unfolded form, according to various embodiments of the present disclosure.
  • FIG. 9 illustrates an electronic device terminating the provision of a user interface, according to various embodiments of the present disclosure.
  • FIG. 10 illustrates an electronic device that provides various user interfaces in a half-folded form, according to various embodiments of the present disclosure.
  • FIG. 11 illustrates an electronic device that provides various user interfaces in a half-folded form, according to various embodiments of the present disclosure.
  • FIG. 12 illustrates an electronic device that provides various user interfaces in a half-folded form, according to various embodiments of the present disclosure.
  • FIG. 13 illustrates an electronic device that provides various user interfaces in a half-folded form, according to various embodiments of the present disclosure.
  • FIG. 14 illustrates an electronic device that provides various user interfaces, according to various embodiments of the present disclosure.
  • FIG. 15 illustrates an electronic device that provides various user interfaces, according to various embodiments of the present disclosure.
  • FIG. 16 illustrates an electronic device that provides various user interfaces, according to various embodiments of the present disclosure.
  • FIG. 17 is a flowchart illustrating an operation of an electronic device according to various embodiments of the present disclosure.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • Referring to FIG. 1, the electronic device 101 may communicate with an electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with an electronic device 104 or a server 108 through a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • According to an embodiment, the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101. In some embodiments, some of these components may be integrated into one component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in a volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in a non-volatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or together with the main processor 121.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), either on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state. According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • According to an embodiment, the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • Artificial intelligence models may be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself, where the artificial intelligence is performed, or through a separate server (e.g., the server 108).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • The artificial intelligence model may include a plurality of artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of these, but is not limited to the above examples.
  • The artificial intelligence model may, additionally or alternatively, include a software structure in addition to the hardware structure.
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176 ).
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used in a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • The speaker may be used for general purposes, such as multimedia playback or recording playback.
  • The receiver may be used to receive an incoming call. According to an embodiment, the receiver may be implemented separately from, or as part of, the speaker.
  • the display module 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • The display module 160 may include, for example, a display, a hologram device, or a projector, as well as a control circuit for controlling the corresponding device.
  • the display module 160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • The audio module 170 may convert a sound into an electrical signal or, conversely, convert an electrical signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input module 150, or may output a sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly with the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • According to an embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) or wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and may support communication through the established channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • A corresponding one of these communication modules may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (Wi-Fi) Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network such as a LAN or WAN).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support a 5G network after a 4G network, and next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband, eMBB), minimization of terminal power and access by multiple terminals (massive machine-type communications, mMTC), or high reliability and low latency (ultra-reliable and low-latency communications, URLLC).
  • The wireless communication module 192 may support a high-frequency band (e.g., the mmWave band), for example, to achieve a high data rate.
  • The wireless communication module 192 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101 , an external electronic device (eg, the electronic device 104 ), or a network system (eg, the second network 199 ).
  • the wireless communication module 192 may support a peak data rate (eg, 20 Gbps or more) for realizing eMBB, loss coverage (eg, 164 dB or less) for realizing mMTC, or U-plane latency (eg, downlink (DL) and uplink (UL) each 0.5 ms or less, or round trip 1 ms or less) for realizing URLLC.
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • the antenna module 197 may include a plurality of antennas (eg, an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network such as the first network 198 or the second network 199 may be selected from the plurality of antennas by, for example, the communication module 190 . A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • according to some embodiments, other components (eg, a radio frequency integrated circuit (RFIC)) other than the radiator may be additionally formed as a part of the antenna module 197 .
  • the antenna module 197 may form a mmWave antenna module.
  • the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (eg, the bottom surface) of the printed circuit board and capable of supporting a designated high frequency band (eg, the mmWave band), and a plurality of antennas (eg, an array antenna) disposed on or adjacent to a second surface (eg, the top or side surface) of the printed circuit board and capable of transmitting or receiving signals of the designated high frequency band.
  • at least some of the above-described components may be connected to each other through a communication method between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or a part of operations executed in the electronic device 101 may be executed in one or more external electronic devices 102 , 104 , or 108 .
  • for example, when the electronic device 101 needs to perform a function or a service automatically or in response to a request from a user or another device, the electronic device 101, instead of or in addition to executing the function or the service itself, may request one or more external electronic devices to perform at least a part of the function or the service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • the electronic device 101 may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • to this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or the server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • the electronic device according to various embodiments may be a device of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • terms such as first, second, or first or second may be used simply to distinguish an element from other elements in question, and do not limit the elements in other aspects (eg, importance or order). When one (eg, first) component is referred to as being "coupled" or "connected" to another (eg, second) component, with or without the terms "functionally" or "communicatively", it means that the one component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • the term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as, for example, logic, logic block, component, or circuit.
  • a module may be an integrally formed part, or a minimum unit or a part thereof that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments of this document may be implemented as software (eg, the program 140) including one or more instructions stored in a storage medium readable by a device (eg, the electronic device 101). For example, a processor (eg, the processor 120) of the device (eg, the electronic device 101) may call at least one of the one or more instructions stored in the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • here, 'non-transitory' only means that the storage medium is a tangible device and does not include a signal (eg, an electromagnetic wave), and this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored.
  • the method according to various embodiments disclosed in this document may be provided as included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (eg, compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) online through an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • a part of the computer program product may be temporarily stored or temporarily created in a machine-readable storage medium such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
  • each component (eg, a module or a program) of the above-described components may include a singular entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • according to various embodiments, a plurality of components (eg, modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each component of the plurality of components identically or similarly to those performed by the corresponding component among the plurality of components prior to the integration.
  • operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • the display module 160 may include a display 210 and a display driver IC (DDI) 230 for controlling the display 210 .
  • the DDI 230 may include an interface module 231 , a memory 233 (eg, a buffer memory), an image processing module 235 , or a mapping module 237 .
  • the DDI 230 may receive, for example, image information including image data or an image control signal corresponding to a command for controlling the image data from another component of the electronic device 101 through the interface module 231 .
  • for example, the image information may be received from the processor 120 (eg, the main processor 121 (eg, an application processor)) or the auxiliary processor 123 (eg, a graphics processing device) operated independently of the function of the main processor 121 .
  • the DDI 230 may communicate with the touch circuit 250 or the sensor module 176 through the interface module 231.
  • the DDI 230 may store at least a portion of the received image information in the memory 233, for example, in units of frames. For example, the image processing module 235 may perform pre-processing or post-processing (eg, resolution, brightness, or size adjustment) on at least a portion of the image data based at least on the characteristics of the image data or the characteristics of the display 210 .
  • the mapping module 237 may generate a voltage value or a current value corresponding to the image data pre-processed or post-processed through the image processing module 235 . According to an embodiment, the generation of the voltage value or the current value may be performed based at least in part on, for example, properties of the pixels of the display 210 (eg, an arrangement of the pixels (RGB stripe or PenTile structure), or the size of each sub-pixel).
  • at least some pixels of the display 210 may be driven based at least in part on, for example, the voltage value or the current value, so that visual information (eg, text, an image, or an icon) corresponding to the image data may be displayed through the display 210 .
  • the display module 160 may further include a touch circuit 250 .
  • the touch circuit 250 may include a touch sensor 251 and a touch sensor IC 253 for controlling the touch sensor 251 .
  • the touch sensor IC 253 may control the touch sensor 251 to sense a touch input or a hovering input for a specific position of the display 210 , for example.
  • the touch sensor IC 253 may detect a touch input or a hovering input by measuring a change in a signal (eg, voltage, light amount, resistance, or electric charge amount) for a specific position of the display 210 .
  • the touch sensor IC 253 may provide information (eg, location, area, pressure, or time) regarding the sensed touch input or hovering input to the processor 120 .
  • at least a part of the touch circuit 250 (eg, the touch sensor IC 253) may be disposed as a part of the display driver IC 230 or the display 210, or may be included as a part of another component (eg, the auxiliary processor 123) disposed outside the display module 160 .
  • the display module 160 may further include at least one sensor (eg, a fingerprint sensor, an iris sensor, a pressure sensor, or an illuminance sensor) of the sensor module 176 , or a control circuit therefor.
  • the at least one sensor or a control circuit therefor may be embedded in a part of the display module 160 (eg, the display 210 or the DDI 230 ) or a part of the touch circuit 250 .
  • for example, when the sensor module 176 embedded in the display module 160 includes a biometric sensor (eg, a fingerprint sensor), the biometric sensor may acquire biometric information (eg, a fingerprint image) related to a touch input through a partial area of the display 210 .
  • the pressure sensor may acquire pressure information related to a touch input through a part or the entire area of the display 210 .
  • the touch sensor 251 or the sensor module 176 may be disposed between pixels of the pixel layer of the display 210 , or above or below the pixel layer.
  • FIG. 3 is a block diagram 300 illustrating components of an electronic device 301 according to various embodiments of the present disclosure.
  • the electronic device 301 disclosed in this document may include at least some of the components of the electronic device described above with reference to FIG. 1 (eg, the electronic device 101 of FIG. 1 ).
  • the electronic device 301 may include a processor 320 (eg, the processor 120 of FIG. 1 ), a memory 330 (eg, the memory 130 of FIG. 1 ), a pen input interface 350 (eg, the input module 150 of FIG. 1 ), a display 360 (eg, the display module 160 of FIG. 1 ), and/or a communication circuit 390 (eg, the communication module 190 of FIG. 1 ).
  • the components illustrated in FIG. 3 are exemplary, and the electronic device 301 may further include components not illustrated in FIG. 3 or may not include some of the illustrated components.
  • the processor 320 may be operatively coupled to the memory 330 , the pen input interface 350 , the display 360 , and/or the communication circuitry 390 , according to one embodiment.
  • the memory 330 may store one or more instructions that, when executed, cause the processor 320 to perform various operations of the electronic device 301 .
  • the pen input interface 350 may be configured to receive an input by the input device 302 (eg, a stylus pen, a digital pen, or a digitizer pen), according to an embodiment.
  • the pen input interface 350 may include a grid-shaped circuit pattern, and may be set to detect an input position of the input device 302 using the circuit pattern.
  • the pen input interface 350 may be embedded in the display 360 or located below the display 360 .
  • the display 360 may include a plurality of pixels.
  • the display 360 may further include a touch input detection circuit and/or a pen input detection circuit (eg, the pen input interface 350 ) for detecting a touch input (eg, a touch input by a user).
  • the processor 320 may distinguish and detect a touch input by an external object (eg, a user's hand) and an input by the input device 302 .
  • the processor 320 may detect a touch input using a touch input detection circuit, and may detect an input of the input device 302 using a pen input detection circuit. For example, the processor 320 may detect a hovering input from the input device 302 when the input device 302 is spaced apart within a specified distance from the display 360 .
  • the processor 320 may detect a pen input (eg, pen touch input or pen contact input) from the input device 302 when one end of the input device 302 is in contact with the display 360 .
  • the processor 320 may detect a button input from the input device 302 .
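The input classification described in the preceding bullets (hovering within a specified distance, pen contact, and button input) can be sketched as follows. This is a hypothetical Python illustration, not the patented implementation; the distance threshold and the returned labels are assumptions for the sketch.

```python
# Assumed maximum distance (in mm) at which a hovering stylus is still detected.
HOVER_DISTANCE_MM = 10.0

def classify_pen_event(distance_mm, button_pressed):
    """Classify a stylus event by its distance from the display and button state."""
    if button_pressed:
        return "button_input"      # a signal corresponding to the button input
    if distance_mm == 0.0:
        return "pen_touch_input"   # one end of the stylus contacts the display
    if distance_mm <= HOVER_DISTANCE_MM:
        return "hovering_input"    # stylus spaced apart within the specified distance
    return "out_of_range"
```

The ordering reflects one possible priority: a button signal is handled before contact or hover classification; a real driver may report these as independent channels.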
  • the display 360 may include various types of displays.
  • the display may include a flexible display.
  • the flexible display may refer to a display in which at least a part is flexible.
  • the flexible display may be folded or unfolded based on a hinge structure included in the electronic device 301 .
  • the display 360 may display different screens in each area of the display that is physically and/or logically separated under the control of the processor 320 .
  • the electronic device 301 may be used in various forms. The form of use of the electronic device 301 according to the folding angle of the display 360 may be referred to in more detail in the description of FIG. 4 to be described later.
  • the communication circuit 390 may be configured to support short-range wireless communication based on a Bluetooth protocol (eg, legacy Bluetooth and/or BLE) and/or a wireless LAN.
  • communication circuitry 390 may provide communication with input device 302 .
  • the electronic device 301 may further include components not shown in FIG. 3 .
  • the electronic device 301 may further include a housing.
  • the housing may include a magnetic pad for attachment of the input device 302 and/or a slot for insertion of the input device 302 .
  • the input device 302 may be referred to as a stylus pen, a digital pen, or a digitizer pen.
  • the input device 302 may receive an electromagnetic field signal (eg, a proximity signal) generated from a digitizer (eg, a pen input interface) of the electronic device 301 .
  • the input device 302 may receive an electromagnetic field signal using a resonant circuit.
  • the input device 302 may transmit an electromagnetic resonance (EMR) input signal to the electronic device 301 .
  • the input device 302 may use at least one of an active electrostatic (AES) method and an electrically coupled resonance (ECR) method.
  • the input device 302 may generate a signal using the electronic device 301 and capacitive coupling.
  • when the input device 302 transmits a signal by the ECR method, the input device 302 may generate a signal including a resonance frequency based on an electric field generated from a capacitive device of the electronic device 301 .
  • the input device 302 may include a communication circuit for communication with the electronic device 301 .
  • the input device 302 may communicate with the electronic device 301 using short-range wireless communication (eg, at least one of Bluetooth, Bluetooth low energy (BLE), or wireless LAN).
  • the input device 302 may include at least one button. When an input for at least one button is received, the input device 302 may transmit a signal corresponding to the button input to the electronic device 301 using a resonance circuit and/or a communication circuit.
  • the input device 302 may include at least one sensor (eg, an accelerator sensor, a piezo sensor, and/or a gyro sensor).
  • the input device 302 may obtain motion information related to the motion of the input device 302 using at least one sensor.
  • the input device 302 may obtain motion information related to a movement (eg, shaking) of the input device 302 by an external entity (eg, a user), and may transmit a control signal including the motion information to the electronic device 301 .
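The motion information described above (shaking count, shaking speed, and shaking direction) could, for illustration, be derived from raw accelerometer samples along the following lines. The threshold, the sample rate, and the single-axis simplification are assumptions for this sketch, not details taken from the disclosure.

```python
def summarize_shakes(samples, threshold=1.5, sample_rate_hz=50):
    """Summarize shaking from signed acceleration values (g) along one axis.

    A "shake" is counted each time the acceleration reverses sign past the
    threshold; speed is shakes per second over the sampled window.
    """
    count = 0
    direction = None
    prev_sign = 0
    for a in samples:
        sign = 1 if a > threshold else -1 if a < -threshold else 0
        if sign != 0 and prev_sign != 0 and sign != prev_sign:
            count += 1  # a reversal past the threshold counts as one shake
        if sign != 0:
            prev_sign = sign
            direction = "up" if sign > 0 else "down"
    duration_s = len(samples) / sample_rate_hz
    speed = count / duration_s if duration_s else 0.0
    return {"count": count, "speed_hz": speed, "direction": direction}
```

A stylus could bundle such a summary into the control signal (the "first signal") transmitted to the electronic device.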
  • FIG. 4 illustrates an electronic device 401 that provides a user interface in various forms, according to various embodiments of the present disclosure.
  • the electronic device 401 (eg, the electronic device 101 of FIG. 1 ) may include a flexible display 460 (eg, the display module 160 of FIG. 1 ), at least a part of which has flexibility.
  • the electronic device 401 may be used in various forms according to the folding angle of the flexible display 460 .
  • the electronic device 401 may receive a control signal from the input device 402 and provide various functions based on the control signal. For example, when an external input to the button 410 (eg, a pressure input by a user) is received, the input device 402 may transmit a first signal corresponding to the external input to the button 410 to the electronic device 401 .
  • the first signal may include motion information of the input device 402 .
  • the input device 402 may detect a shaking of the input device 402 by the user using at least one sensor, and may acquire motion information based on the sensed information.
  • the input device 402 may transmit a first signal including motion information to the electronic device 401 .
  • the electronic device 401 may operate in a folding form.
  • the electronic device 401 may display various user interfaces on one area of the flexible display 460 .
  • the shape of the electronic device 401 operating in a folding shape may be defined as a first shape.
  • the electronic device 401 may receive a specified signal (eg, a first signal) from the input device 402 and display various user interfaces based on the specified signal. For example, when an external input to the button 410 included in the input device 402 is continuously detected, the electronic device 401 may receive the first signal that the input device 402 generates and transmits in response to the external input.
  • the motion information included in the first signal may include information on the number of times of shaking, shaking speed, and/or shaking direction of the input device 402 .
  • the electronic device 401 may display various user interfaces on one area of the flexible display 460 based on the motion information.
  • the electronic device 401 operating in the first form may display a plurality of icons (a first icon 451 , a second icon 452 , and a third icon 453 ) on at least one area of the flexible display 460 .
  • Each of the plurality of icons may be referred to as a graphic user interface (GUI) for executing a designated application based on a touch input (eg, a touch input by the input device 402 ).
  • the electronic device 401 may operate in an unfolding form.
  • the electronic device 401 may display various user interfaces on one area of the flexible display 460 .
  • the shape of the electronic device 401 operating in an unfolded shape may be defined as a second shape.
  • the electronic device 401 may display various user interfaces on one area of the flexible display 460 based on motion information of the input device 402 included in the first signal received from the input device 402 .
  • the electronic device 401 operating in the second form may display a plurality of images (a first image 461 , a second image 462 , and a third image 463 ) on at least one area of the flexible display 460 .
  • Each of the plurality of images may be referred to as a GUI for executing a designated application based on a touch input.
  • the plurality of images may be referred to as thumbnail images displaying virtual execution screens of the specified applications.
  • the operation form of the electronic device 401 is divided into a first form and a second form, but this is exemplary and is not limited thereto.
  • the electronic device 401 may provide various user interfaces in the form of half-folding.
  • a shape of the electronic device 401 operating in a half-folding shape may be defined as a third shape.
  • the description of the electronic device 401 operating in the half-folding configuration may be referred to in more detail with reference to FIGS. 10 to 14 to be described later.
  • FIG. 5 illustrates an electronic device 501 that provides various user interfaces in a folding form, according to various embodiments of the present disclosure.
  • the operation form of the electronic device 501 may be defined as the first form (eg, the first form according to reference numeral 400a of FIG. 4 ).
  • the electronic device 501 may provide various user interfaces based on communication with the input device 502 (eg, the input device 402 of FIG. 4 ).
  • the electronic device 501 may receive a control signal (eg, a first signal) transmitted from the input device 502 and obtain motion information included in the first signal.
  • when the motion information satisfies a specified condition, the electronic device 501 may provide various user interfaces on the flexible display 560 (eg, the display module 160 of FIG. 1 ).
  • the specified condition may include the number of times of shaking, a shaking speed, and/or a shaking direction of the input device 502 .
  • the number of times of shaking may be defined as the number of times the user shakes the input device 502 for a specified time.
  • the shaking speed may be defined as a speed or intensity at which the user shakes the input device 502 based on a specified period.
  • the shaking direction may be defined as a direction in which the user shakes the input device 502 .
  • the control signal transmitted from the input device 502 may be transmitted to the electronic device 501 when an external input to a button 510 (eg, the button 410 of FIG. 4 ) included in the input device 502 is sensed.
  • the input device 502 may continuously transmit the first signal including the motion information of the input device 502 acquired using at least one sensor to the electronic device 501 .
  • the electronic device 501 may receive the first signal from the input device 502 using a communication circuit (eg, the communication circuit 390 of FIG. 3 ).
  • the electronic device 501 may receive the first signal, obtain the motion information included in the first signal, and display various user interfaces on one area of the flexible display 560 based on the motion information.
  • the user interface may include a plurality of icons (eg, a first icon 551 , a second icon 552 , a third icon 553 , a fourth icon 554 , a fifth icon 555 , and a sixth icon 556).
  • the plurality of icons may be referred to as a graphical user interface (GUI) for executing a designated application based on a touch input (eg, a touch input by a user or a touch input by the input device 502 ).
  • the electronic device 501 may obtain a first signal including information related to the movement 505 of the input device 502 from the input device 502 .
  • the information about the movement 505 of the input device 502 may include information about the number of times of shaking, the shaking speed, and/or the shaking direction of the input device 502 .
  • the electronic device 501 may change and display the user interface on the flexible display 560 .
  • for example, when the shaking speed of the input device 502 increases (eg, when a user performs a shaking gesture operation using the input device 502 at a faster cycle), the input device 502 may identify that the shaking speed has increased using at least one sensor, and may transmit a first signal including the changed motion information to the electronic device 501 .
  • the electronic device 501 may receive the first signal and display additional icons (eg, not shown) on the flexible display 560 .
  • the electronic device 501 may additionally display new icons in addition to the plurality of icons 551 to 556 that are previously displayed on the flexible display 560 .
  • when the shaking speed of the input device 502 decreases (eg, when a user performs a shaking gesture operation using the input device 502 at a slower cycle), the electronic device 501 may no longer display at least one of the plurality of icons 551 to 556 being displayed.
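The speed-dependent behavior described in the bullets above (more icons shown as the shaking speed rises, fewer as it falls) can be sketched as a simple mapping. The thresholds and icon names below are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical icon identifiers, mirroring the first through sixth icons above.
APP_ICONS = ["icon_551", "icon_552", "icon_553", "icon_554", "icon_555", "icon_556"]

def icons_for_speed(shake_speed_hz):
    """Map a shaking speed (shakes per second) to the subset of icons to display."""
    if shake_speed_hz < 1.0:
        n = 2                # slow shaking: show only a few icons
    elif shake_speed_hz < 2.0:
        n = 4                # moderate shaking: show more icons
    else:
        n = len(APP_ICONS)   # fast shaking: reveal every icon
    return APP_ICONS[:n]
```

Re-evaluating this mapping each time a first signal arrives yields the add/remove behavior described above: an increase in speed adds icons, a decrease drops some of those currently displayed.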
  • the electronic device 501 may display a plurality of icons included in the user interface differently based on weights stored in a memory (eg, the memory 330 of FIG. 3 ).
  • the weight may include the number of times and/or usage history of each of the applications provided by the electronic device 501 .
  • for example, the electronic device 501 may display, on the flexible display 560, a user interface in which an icon (eg, the first icon 551) corresponding to an application having a high weight has a relatively wider area than the other icons (eg, the second icon 552 to the sixth icon 556).
  • the electronic device 501 may display a user interface on the flexible display 560 in which an icon corresponding to the most recently used application has a relatively wider area than other icons.
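The weight-based sizing described above, where an icon's area reflects its application's usage count and recency, might be computed as follows. The proportional-area rule, the 24-hour recency window, and the bonus factor are assumptions made for this sketch.

```python
import time

def icon_areas(usage, total_area=600.0, recency_bonus=0.5, now=None):
    """Allocate display area to icons in proportion to a usage-based weight.

    usage: {app: (launch_count, last_used_epoch_seconds)}
    Returns {app: area}; areas sum to total_area.
    """
    now = now if now is not None else time.time()
    weights = {}
    for app, (count, last_used) in usage.items():
        # Apps used within the last 24 hours get a recency bonus on their weight.
        recent = 1.0 if (now - last_used) < 24 * 3600 else 0.0
        weights[app] = count + recency_bonus * recent * count
    total_w = sum(weights.values()) or 1.0
    return {app: total_area * w / total_w for app, w in weights.items()}
```

With such a rule, a frequently and recently used application naturally receives the relatively wider icon area described in the bullets above.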
  • the electronic device 501 may execute a specified function based on a touch input to at least one area of the user interface. For example, when the electronic device 501 detects a touch input (eg, a touch input by a user and/or a pen touch input by the input device 502) for the first icon 551 included in the user interface, the electronic device 501 may execute an application corresponding to the first icon 551 and display it on the flexible display 560 .
  • as another example, when a pen touch input by the input device 502 is sensed on at least one of the plurality of icons while an external input is detected on the button 510 of the input device 502, the electronic device 501 may execute applications corresponding to the at least one icon and display them on the flexible display 560 . For example, when the electronic device 501 detects a pen touch input for at least one of the plurality of icons while receiving the first signal from the input device 502, the electronic device 501 may execute applications corresponding to the at least one icon on which the touch input is sensed and display them on the flexible display 560 . In other words, the electronic device 501 may display a user interface in which the plurality of icons have different areas based on a specified weight.
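The button-held multi-launch behavior described above can be sketched with an assumed event model: while the stylus button is held, each pen touch marks an icon, and releasing the button executes every marked application. Event names and the launch callback are hypothetical.

```python
def collect_and_launch(events, launch):
    """Process stylus events; launch all icons pen-touched while the button is held.

    events: sequence of "button_down", "button_up", or ("pen_touch", icon_id).
    launch: callback invoked once per selected icon on button release.
    """
    selected, button_held = [], False
    for ev in events:
        if ev == "button_down":
            button_held = True
        elif ev == "button_up":
            button_held = False
            for icon in selected:
                launch(icon)   # execute every application selected during the hold
            selected = []
        elif isinstance(ev, tuple) and ev[0] == "pen_touch" and button_held:
            selected.append(ev[1])
```

Pen touches arriving without the button held fall through unhandled here; in the described device they would instead trigger an ordinary single-application launch.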
  • the user interface displayed by the electronic device 501 is illustrated as including six icons 551 to 556 , but is not limited thereto. Meanwhile, a description of an execution screen in which the electronic device 501 executes at least one application based on a touch input to at least one of the plurality of icons may be described with reference to FIG. 7 to be described later.
  • FIG. 6 illustrates an electronic device 601 that provides various user interfaces in an unfolded form according to various embodiments of the present disclosure.
  • the operation form of the electronic device 601 (eg, the electronic device 101 of FIG. 1 ) illustrated by reference numeral 600 may be defined as the second form (eg, the second form according to reference numeral 400b of FIG. 4 ). The electronic device 601 may provide various user interfaces based on communication with the input device 602 (eg, the input device 402 of FIG. 4 ).
  • the electronic device 601 may receive a first signal including motion information related to a movement 605 of the input device 602 , and may display a user interface on one area of the flexible display 660 based on the motion information.
  • the user interface may include a plurality of images (eg, a first image 661 , a second image 662 , a third image 663 , a fourth image 664 , a fifth image 665 , a sixth image 666 , and a seventh image 667 ).
  • the plurality of images may be referred to as a graphical user interface (GUI) for executing a designated application based on a touch input (eg, a touch input by a user or a touch input by the input device 602 ).
  • the electronic device 601 may receive a first signal including information related to the movement 605 of the input device 602 .
  • the information about the movement 605 of the input device 602 may include information about the number of times of shaking, the shaking speed, and/or the shaking direction of the input device 602 .
  • the electronic device 601 may change and display the user interface on the flexible display 660 .
  • for example, when the shaking speed of the input device 602 increases (eg, when a user performs a shaking gesture operation by using the input device 602 at a faster cycle), the input device 602 may identify the increase in shaking speed using at least one sensor, and may transmit a first signal including the changed motion information to the electronic device 601 .
  • the electronic device 601 may display additional thumbnails (eg, not shown) on the flexible display 660 based on the changed motion information.
  • for example, when the shaking speed of the input device 602 increases, the electronic device 601 may additionally display new images in addition to the plurality of images 661 to 667 previously displayed on the flexible display 660 .
  • when the shaking speed of the input device 602 decreases, the electronic device 601 may no longer display at least one of the plurality of images 661 to 667 being displayed.
  • the electronic device 601 may display a plurality of images included in the user interface differently based on weights stored in a memory (eg, the memory 330 of FIG. 3 ).
• the weight may include the number of uses and/or the usage history of each of the applications provided by the electronic device 601 .
• for example, an image corresponding to an application with a higher weight (eg, the first image 661 ) may be displayed with a relatively larger area than an image corresponding to an application with a lower weight (eg, the second image 662 ).
  • the electronic device 601 may display on the flexible display 660 a user interface in which an image corresponding to the most recently used application has a relatively wider area than other images.
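• The weight-based sizing described above could be sketched as a proportional allocation; this is a hypothetical illustration (the function and weighting scheme are not specified in this document):

```python
def allocate_areas(usage_counts, total_width):
    """Divide total_width among per-application images in proportion to use counts,
    so the most-used application's image gets the widest area."""
    total = sum(usage_counts.values())
    return {app: total_width * count / total for app, count in usage_counts.items()}
```

• For instance, with use counts of 3 and 1 over a 400-pixel strip, the first image would receive three quarters of the width.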
• the electronic device 601 may execute a specified function based on a touch input to at least one region of the user interface. For example, when the electronic device 601 detects a touch input (eg, a touch input by a user and/or a pen touch input by the input device 602 ) with respect to the first image 661 included in the user interface, it may execute an application corresponding to the first image 661 and display it on the flexible display 660 .
• when a pen touch input by the input device 602 is sensed on at least one of the plurality of images while an external input is detected on the button 610 of the input device 602 , the electronic device 601 may execute applications corresponding to the at least one image and display them on the flexible display 660 . For example, when the electronic device 601 detects a pen touch input for at least one of the plurality of images while receiving the first signal from the input device 602 , it may execute applications corresponding to the at least one image in which the touch input is sensed and display them on the flexible display 660 . In other words, the electronic device 601 may display a user interface in which the plurality of images have different regions based on a specified weight.
• the user interface displayed by the electronic device 601 is illustrated as including seven images 661 to 667 , but is not limited thereto. Meanwhile, an execution screen in which the electronic device 601 executes at least one application based on a touch input to at least one of the plurality of images is described with reference to FIG. 7 below.
• FIG. 7 illustrates an electronic device 701 that provides various user interfaces in an unfolded form according to various embodiments of the present disclosure.
• the electronic device 701 may display application execution screens in different areas of the flexible display (eg, the display 460 of FIG. 4 ).
  • the electronic device 701 may display an application execution screen having different regions on the flexible display based on a weight stored in a memory (eg, the memory 330 of FIG. 3 ).
• the division of the area of the flexible display described in FIG. 7 may be a division of a physical area based on a folding structure (eg, a hinge structure) of the flexible display, or a division of a logical area based on an execution screen.
• FIG. 7 may be referred to as a diagram illustrating an electronic device 701 that receives a touch input (eg, a touch input by a user or a pen touch input by the input device 402 of FIG. 4 ) for at least one of a plurality of icons (eg, reference numerals 551 to 556 of FIG. 5 ) while displaying a user interface in the first form (eg, the first form according to reference numeral 400a of FIG. 4 , a folding form), or receives a touch input for at least one of a plurality of images while displaying the user interface in the second form (eg, the second form according to reference numeral 400b of FIG. 4 , an unfolding form), and executes at least one application determined based on the touch input.
• the electronic device 701 may display an execution screen on the flexible display by executing at least one application determined based on the touch input in response to the second signal transmitted from the input device.
  • the electronic device 701 may execute two applications determined based on a touch input.
  • the electronic device 701 may display execution screens of the first application and the second application in the first area 761 and the second area 762 of the flexible display, respectively.
• the electronic device 701 may receive a first signal while displaying a plurality of icons in the first form, and may detect a touch input to a first icon (eg, the first icon 551 of FIG. 5 ) and a second icon (eg, the second icon 552 of FIG. 5 ).
  • the electronic device 701 may receive a second signal from the input device.
  • the electronic device 701 may execute the applications in different areas 761 and 762 in response to the received second signal.
• the electronic device 701 may receive the first signal while displaying the plurality of images in the second form, and may detect a touch input to a first image (eg, the first image 661 of FIG. 6 ) and a second image (eg, the second image 662 of FIG. 6 ).
  • the electronic device 701 may receive a second signal from the input device.
  • the electronic device 701 may execute the applications in different areas 761 and 762 in response to the received second signal.
• the electronic device 701 may execute three applications determined based on a touch input. For example, the electronic device 701 may display execution screens of the first application, the second application, and the third application in the first area 761 , the second area 762 , and the third area 763 of the flexible display, respectively.
• the electronic device 701 may execute four applications determined based on a touch input. For example, the electronic device 701 may display execution screens of the first application, the second application, the third application, and the fourth application in the first area 761 , the second area 762 , the third area 763 , and the fourth area 764 of the flexible display, respectively.
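• Splitting the display into as many regions as there are selected applications could be sketched as follows; this is an illustrative, equal-width layout only (the actual region shapes in reference numerals 700a to 700c may differ):

```python
def split_display(width, height, n_apps):
    """Return n_apps equal vertical regions as (x, y, w, h) tuples,
    one region per application execution screen."""
    region_width = width // n_apps
    return [(i * region_width, 0, region_width, height) for i in range(n_apps)]
```

• For a 1200 x 800 display and three applications, this yields three 400-pixel-wide strips side by side.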
• the first region 761 and the second region 762 of reference numeral 700a, the second region 762 and the third region 763 of reference numeral 700b, and all regions of reference numeral 700c are illustrated as having the same size, but this is exemplary.
  • the electronic device 701 may display an application execution screen in areas having different sizes.
• although each area is illustrated as having a rectangular shape, this is also exemplary, and the electronic device 701 may display an application execution screen in an area of a changed shape based on a user's setting or a specified input.
• FIG. 8 illustrates an electronic device 801 that provides various user interfaces in an unfolded form according to various embodiments of the present disclosure.
• the electronic device 801 may display a user interface having various structures based on motion information of the input device 802 (eg, the input device 302 of FIG. 3 ).
• the electronic device 801 may receive a first signal including motion information of the input device 802 from the input device 802 , and may change the structure of the user interface based on the motion information and display it.
  • the electronic device 801 may change and display the structure of the user interface based on shaking direction information included in the motion information of the input device 802 .
• the electronic device 801 may receive a first signal including motion information of the input device 802 that is repeatedly shaken in the first direction 811 and the second direction 812 , obtained using at least one sensor.
  • the user may repeatedly perform a shaking gesture in the first direction 811 and the second direction 812 while holding the input device 802 .
  • the input device 802 obtains first motion information of the input device 802 that is repeatedly shaken in the first direction 811 and the second direction 812 using at least one sensor, and the first motion information A first signal including , may be transmitted to the electronic device 801 .
  • the electronic device 801 may display a user interface based on first motion information including the first direction 811 and the second direction 812 .
  • the electronic device 801 arranges the first area 861 on the right side of the flexible display (eg, the display 460 of FIG. 4 ) based on the first motion information, and the second area 862 and An application execution screen may be displayed by arranging the third area 863 on the left side.
• the electronic device 801 may receive a first signal including motion information of the input device 802 that is repeatedly shaken in the third direction 813 and the fourth direction 814 .
  • the user may repeatedly shake the input device 802 in the third direction 813 and the fourth direction 814 while gripping the input device 802 .
  • the input device 802 transmits a first signal including second motion information of the input device 802 that is repeatedly shaken in the third direction 813 and the fourth direction 814 using at least one sensor to the electronic device. It can be sent to (801).
  • the electronic device 801 may display the user interface based on the second motion information including the third direction 813 and the fourth direction 814 .
• the electronic device 801 may display an application execution screen by arranging the first area 861 on the left side of the flexible display and the second area 862 and the third area 863 on the right side, based on the second motion information.
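• The mapping from shaking direction to arrangement described above could be sketched as a simple lookup; the axis labels here are hypothetical stand-ins for the direction pairs 811/812 and 813/814:

```python
def arrange_areas(shake_axis):
    """Return which side of the display each area is placed on.

    'first_second' stands for shaking along directions 811/812,
    'third_fourth' for shaking along directions 813/814."""
    if shake_axis == "first_second":
        # first area on the right, second and third areas on the left
        return {"first_area": "right", "second_area": "left", "third_area": "left"}
    if shake_axis == "third_fourth":
        # mirrored arrangement
        return {"first_area": "left", "second_area": "right", "third_area": "right"}
    raise ValueError(f"unknown shake axis: {shake_axis}")
```
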
  • the division of the area of the application execution screen shown in FIG. 8 is exemplary, and various embodiments of the present document are not limited thereto.
  • the electronic device 801 may display an application execution screen by dividing the second area 862 and the third area 863 to have different sizes.
• FIG. 9 illustrates an electronic device 901 terminating providing a user interface, according to various embodiments of the present disclosure.
• the electronic device 901 may terminate the provision of the user interface based on a change of the operation form according to a change of the folding angle.
  • the electronic device 901 may operate in a second shape (eg, a second shape according to reference numeral 400b of FIG. 4 , an unfolding shape).
• the electronic device 901 may receive a first signal transmitted from the input device 902 (eg, the input device 302 of FIG. 3 ), and may obtain motion information of the input device 902 included in the first signal. For example, when it is determined that the motion information satisfies a specified condition, the electronic device 901 may display a plurality of images (eg, reference numerals 661 to 667 of FIG. 6 ) on the flexible display 960 (eg, the flexible display 360 of FIG. 3 ).
  • the electronic device 901 may operate in a third mode.
  • the third form may be referred to as a half-folding form in which the flexible display is folded at a specified angle.
• the electronic device 901 may change the plurality of images displayed on the flexible display 960 into icons of a different form (eg, a water droplet icon) and display them.
  • the electronic device 901 may remove the icons displayed on the flexible display 960 .
  • the electronic device 901 may remove the icon in which the designated input is sensed based on a designated input for the icons (eg, a drag input by a user or a pen drag input).
  • the electronic device 901 may sequentially remove the icons according to a specified weight. For example, when it is determined that the display time of the icons exceeds a specified time, the electronic device 901 may sequentially remove at least one icon from an icon corresponding to an application having the lowest number of uses.
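• The sequential removal by weight described above could be sketched as follows; this is an illustrative implementation only, using use counts as the weight:

```python
def icons_to_remove(display_time, timeout, use_counts, k=1):
    """After the display time exceeds the timeout, choose the k icons whose
    applications have the lowest number of uses, lowest first."""
    if display_time <= timeout:
        return []
    # Sort icon identifiers by ascending use count and take the first k
    return sorted(use_counts, key=use_counts.get)[:k]
```

• Icons survive as long as the timeout has not elapsed; afterwards the least-used ones are removed first.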
  • FIG. 10 illustrates an electronic device 1001 that provides various user interfaces in a half-folding form according to various embodiments of the present disclosure.
  • an operation form of the electronic device 1001 (eg, the electronic device 101 of FIG. 1 ) illustrated by reference numeral 1000 may be defined as a third form.
  • the third form refers to a form in which at least a portion of a flexible display (eg, the display 360 of FIG. 3 ) included in the electronic device 1001 is folded at a specified angle (eg, 90 degrees to 150 degrees).
  • the electronic device 1001 may provide various user interfaces based on communication with the input device 1002 (eg, the input device 402 of FIG. 4 ).
  • the electronic device 1001 may receive a control signal transmitted from the input device 1002 .
• the input device 1002 may obtain motion information related to the motion 1005 of the input device 1002 using at least one sensor (eg, an acceleration sensor, a piezo sensor, and/or a gyro sensor), and may transmit a control signal including the motion information to the electronic device 1001 .
• the electronic device 1001 may provide various user interfaces on at least a portion of the flexible display (eg, the first display area 1060a and/or the second display area 1060b).
• the specified condition may include the number of shakes of the input device 1002 by the user, a shaking speed, and/or a shaking direction.
• the control signal transmitted from the input device 1002 may be referred to as a first signal that the input device 1002 transmits when an external input is sensed on a button 1010 (eg, the button 410 of FIG. 4 ) included in the input device 1002 .
  • the input device 1002 may continuously transmit the first signal to the electronic device 1001 .
  • the electronic device 1001 may receive the first signal from the input device 1002 using a communication circuit (eg, the communication circuit 390 of FIG. 3 ).
• the electronic device 1001 may receive a first signal including motion information related to a motion 1005 of the input device 1002 , and may display various user interfaces on an area of the flexible display 1060 (eg, the first display area 1060a and/or the second display area 1060b) based on the motion information.
  • the user interface may include a plurality of icons (eg, a first icon 1081 , a second icon 1082 , a third icon 1083 , a fourth icon 1084 , a fifth icon 1085 , and a sixth icon 1086).
  • the user interface may include a plurality of images (eg, a first image 1071 , a second image 1072 , and a third image 1073 ).
• the plurality of icons and the plurality of images may be referred to as graphical user interfaces (GUIs) for executing a designated application based on a touch input (eg, a touch input by a user or a touch input by the input device 1002 ).
  • the electronic device 1001 may divide regions of the flexible display and include different user interfaces in the divided regions. For example, the electronic device 1001 may divide the flexible display into a first display area 1060a and a second display area 1060b based on a folding part of the flexible display. For example, the electronic device 1001 displays a plurality of images 1071 to 1073 on the first display area 1060a and displays a plurality of icons 1081 to 1086 on the second display area 1060b. can do.
  • the electronic device 1001 may display various user interfaces based on motion information of the input device 1002 and/or weights stored in a memory (eg, the memory 330 of FIG. 3 ).
  • the description of the electronic device 1001 that displays various user interfaces based on the motion information and/or weights may be replaced with the description disclosed in FIGS. 4 and 5 .
• the electronic device 1001 may execute a specified function based on a touch input to at least one area of the user interface. For example, when the electronic device 1001 detects a touch input (eg, a touch input by a user and/or a pen touch input by the input device 1002 ) for the first image 1071 displayed on the first display area 1060a, it may execute an application corresponding to the first image 1071 and display it on the first display area 1060a.
  • the electronic device 1001 may execute applications corresponding to at least one of the plurality of images and/or icons and display them on the first display area 1060a and/or the second display area 1060b.
• when the electronic device 1001 detects a pen touch input for at least one of the plurality of images and/or icons while receiving the first signal from the input device 1002 , it may execute applications corresponding to the at least one image and/or icon in which the touch input is detected and display them on the first display area 1060a and/or the second display area 1060b.
  • FIG. 11 illustrates an electronic device 1101 that provides various user interfaces in a half-folding form according to various embodiments of the present disclosure.
  • an operation form of the electronic device 1101 (eg, the electronic device 101 of FIG. 1 ) illustrated by reference numeral 1100 may be defined as a third form (eg, the third form of FIG. 10 ).
  • the flexible display of the electronic device 1101 operating in the third shape may be divided into a first area 1160a and a second area 1160b.
  • the above division of the flexible display may be a division of a physical area based on a folding structure (eg, a hinge structure) or a division of a logical area based on an execution screen.
  • the electronic device 1101 may sequentially perform the operations illustrated by reference numerals 1100a, 1100b, and 1100c based on a designated input.
• the electronic device 1101 may display various user interfaces on the flexible display. For example, the electronic device 1101 may obtain motion information of the input device 1102 based on a first signal transmitted from the input device 1102 (eg, the input device 402 of FIG. 4 ), and when it is determined that the motion information satisfies a specified condition, may display various user interfaces on the flexible display. For example, the electronic device 1101 may display a plurality of images (eg, a first image 1171 , a second image 1172 , and a third image 1173 ) on the first display area 1160a, and may display a plurality of icons (eg, the first icon 1181 ) on the second display area 1160b.
  • the electronic device 1101 may store information about a plurality of applications (eg, the number of times of use, a history of use, and/or a correlation with the input device 1102 ) in a memory.
  • the electronic device 1101 may determine images and/or icons displayed on the first display area 1160a and the second display area 1160b based on the information on the plurality of applications. For example, the electronic device 1101 may display a plurality of images corresponding to applications having a high frequency of use on the first display area 1160a.
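• The frequency-based assignment described above could be sketched as a ranking; this is a hypothetical illustration (the ranking criterion and counts are illustrative):

```python
def assign_areas(use_counts, n_images):
    """Show the n most-used applications as images in the first display area
    and the remainder as icons in the second display area."""
    ranked = sorted(use_counts, key=use_counts.get, reverse=True)
    return {"first_area_images": ranked[:n_images],
            "second_area_icons": ranked[n_images:]}
```
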
• the electronic device 1101 may receive an input for the flexible display and determine an application corresponding to the input. For example, the electronic device 1101 may receive a first signal from the input device 1102 , and may detect pen touch inputs 1111 , 1112 , and 1113 for the first image 1171 and the second image 1172 displayed on the first display area 1160a and the first icon 1181 displayed on the second display area 1160b. For example, the electronic device 1101 may detect the pen touch inputs 1111 , 1112 , and 1113 and receive a second signal.
• the electronic device 1101 may determine applications corresponding to the images and the icon in which the pen touch inputs 1111 , 1112 , and 1113 are detected as applications to be executed.
• the first signal may be referred to as a control signal that the input device 1102 transmits to the electronic device 1101 based on receiving an external input (eg, a pressure input by a user) on a button (eg, the button 410 of FIG. 4 ).
  • the second signal may be referred to as a control signal transmitted by the input device 1102 to the electronic device 1101 in response to termination of an external input for a button.
• while the user presses the button of the input device 1102 , the input device 1102 may generate a first signal and transmit it to the electronic device 1101 . Thereafter, when the user no longer presses the button, the input device 1102 may generate a second signal and transmit it to the electronic device 1101 .
  • the electronic device 1101 may display execution screens of applications determined by reference number 1100b in the first display area 1160a and the second display area 1160b.
• the electronic device 1101 may display execution screens in various arrangements based on information about a plurality of applications stored in the memory (eg, the number of uses, usage history, identification information, and/or correlation with the input device 1102 ).
• the electronic device 1101 may display, on the first display area 1160a, an execution screen of an application that displays an image (eg, a drawing application).
  • the electronic device 1101 may display an execution screen of an application having a high correlation with the input device 1102 among the determined applications on the second display area 1160b.
  • the electronic device 1101 may display an execution screen of an application (eg, an Internet application) in which a touch operation is frequently used on the second display area 1160b based on the identification information of the application.
  • FIG. 12 illustrates an electronic device 1201 that provides various user interfaces in a half-folding form, according to various embodiments of the present disclosure.
• the electronic device 1201 may rearrange images and/or icons included in a user interface based on a designated touch input.
  • the electronic device 1201 displays the first image 1271 , the second image 1272 , and the third image 1273 on the first display area 1260a, and the second display area ( A plurality of icons including the first icon 1281 and the second icon 1282 may be displayed in 1260b).
  • the first image 1271 , the second image 1272 , and the third image 1273 may be GUIs corresponding to the first application, the second application, and the third application, respectively.
  • the first icon 1281 and the second icon 1282 may be GUIs corresponding to the fourth application and the fifth application, respectively.
• the electronic device 1201 may change the arrangement structure of the user interface in response to a drag input by the input device 1202 (eg, the input device 402 of FIG. 4 ). For example, when a drag input from the area corresponding to the first icon 1281 to at least a part of the first display area 1260a is sensed, the electronic device 1201 may delete the first icon 1281 from the second display area 1260b and display a new icon. For example, in response to a drag input directed from the first icon 1281 to at least a portion of the first display area 1260a, the electronic device 1201 may change the display areas of the first icon 1281 and the third image 1273 .
  • the electronic device 1201 may determine an image whose display area is to be changed in the first display area 1260a based on a specified weight.
• the electronic device 1201 may change the first icon 1281 , whose display area is changed to the first display area 1260a, into an image and display it, and may change the third image 1273 , whose display area is changed to the second display area 1260b, into an icon and display it.
• the electronic device 1201 may delete one of the plurality of images displayed on the first display area 1260a (eg, the third image 1273 ) and display a new image.
• the electronic device 1201 may determine at least one image to be deleted from the first display area 1260a based on weights stored in the memory (eg, the number of uses, recent execution information, association with a running application, and/or association with the input device).
  • the new icon may be referred to as an icon corresponding to the third application.
  • the new image may be an image corresponding to the fourth application.
• the electronic device 1201 may change the arrangement structure of the user interface in response to a drag input by the input device 1202 (eg, the input device 402 of FIG. 4 ). For example, when a pen touch input is detected in areas corresponding to the second image 1272 , the third image 1273 , the first icon 1281 , and the second icon 1282 , and a drag input from at least a portion of the second display area 1260b to at least a portion of the first display area 1260a is sensed, the electronic device 1201 may delete the selected images and icons from each display area.
• the electronic device 1201 may display icons corresponding to the second application and the third application on the second display area 1260b. Also, the electronic device 1201 may display images corresponding to the fourth application and the fifth application on the first display area 1260a. In other words, in response to a drag input to at least a portion of the first display area 1260a and the second display area 1260b, the electronic device 1201 may change (eg, switch) the display areas of at least some of the images or icons that are being displayed.
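• The area switch described above can be sketched as moving selected items between two lists; this is an illustrative model only (item names are hypothetical, and the image/icon rendering change is omitted):

```python
def swap_selected(first_area, second_area, selected_images, selected_icons):
    """Move selected images to the second area (as icons) and selected icons
    to the first area (as images), keeping unselected items in place."""
    new_first = [x for x in first_area if x not in selected_images] + list(selected_icons)
    new_second = [x for x in second_area if x not in selected_icons] + list(selected_images)
    return new_first, new_second
```

• Selecting the second and third images plus both icons, as in the example above, leaves the first image in place and exchanges everything else.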
  • FIG. 13 illustrates an electronic device 1301 that provides various user interfaces in a half-folding form according to various embodiments of the present disclosure.
• the electronic device 1301 may receive a specified input, execute an application determined based on the specified input, and display an execution screen of the application on one area of a flexible display (eg, the display 460 of FIG. 4 ). For example, the electronic device 1301 may display a plurality of images 1371 , 1372 , and 1373 on the first display area 1360a, and may display a plurality of icons on the second display area 1360b.
  • the electronic device 1301 may receive a touch input 1311 by the input device 1302 in one area of the flexible display.
  • the touch input 1311 may be at least one of a single tap, a double tap, and a long press input.
• the electronic device 1301 may receive a touch input 1311 for one area of the first image 1371 displayed on the first display area 1360a.
  • the electronic device 1301 may execute an application determined based on the touch input 1311 and display an execution screen on the first area 1321 in the first display area 1360a.
  • the application may be referred to as a first application corresponding to the first image 1371 in which the touch input 1311 is sensed.
• the electronic device 1301 may receive a first signal transmitted from the input device 1302 and/or touch inputs 1311 and 1313 by the input device 1302 on one region of the flexible display.
• the first signal may be referred to as a control signal that is generated by the input device 1302 and transmitted to the electronic device 1301 when an external input is detected on the button 1310 (eg, the button 410 of FIG. 4 ) included in the input device 1302 .
  • the touch input may be at least one of a single tap, a double tap, or a long press input.
• the electronic device 1301 may receive a first signal transmitted from the input device 1302 , and may receive touch inputs 1311 and 1313 for one region of each of the first image 1371 and the third image 1373 displayed on the first display area 1360a.
• the electronic device 1301 may execute applications determined based on the touch inputs 1311 and 1313 , and may display respective execution screens on the first area 1321 and the second area 1323 in the first display area 1360a.
• the applications may be referred to as a first application and a third application corresponding to the first image 1371 and the third image 1373 in which the touch inputs are sensed.
• the electronic device may display an execution screen of the first application in the first area 1321 and may display an execution screen of the third application in the second area 1323 .
  • the electronic device 1301 may delete the second image 1372 in which the touch inputs 1311 and 1313 are not sensed from the first display area 1360a among the plurality of images.
  • the second image 1372 may be referred to as a GUI corresponding to the second application.
  • the electronic device 1301 may display the second icon 1382 on the second display area 1360b.
  • the second icon 1382 may be referred to as a GUI corresponding to the second application.
  • FIG. 14 illustrates an electronic device 1401 that provides various user interfaces according to various embodiments of the present disclosure.
• the electronic device 1401 may display a user interface providing various functions based on communication with the input device 1402 (eg, the input device 302 of FIG. 3 ). For example, the electronic device 1401 may provide a drawing function that generates a drawing image based on a touch input by the input device 1402 . The electronic device 1401 may receive a touch input from the input device 1402 and display various types of drawing images generated based on the touch input on the flexible display 1460 (eg, the display 360 of FIG. 3 ).
  • the electronic device 1401 may display a drawing image based on a property set for the touch input based on communication with the input device 1402 .
• the electronic device 1401 may receive a first signal transmitted from the input device 1402 and obtain motion information of the input device 1402 included in the first signal.
• the first signal may be referred to as a control signal that the input device 1402 generates to include motion information related to the motion of the input device 1402 , obtained using at least one sensor when an external input is sensed on the button 1410 (eg, the button 410 of FIG. 4 ) included in the input device 1402 , and transmits to the electronic device 1401 .
  • the electronic device 1401 may change a property set for the touch input.
  • the attribute may include, for example, color, thickness, pattern, size, and/or shape.
• the electronic device 1401 may receive a first signal from the input device, obtain motion information of the input device 1402 , and, when it is determined that the shaking speed of the input device 1402 included in the motion information satisfies a specified condition, set the property for the touch input as a first property and display a drawing image generated based on the touch input.
  • the first property may be referred to as a property for displaying the drawing image with a relatively larger thickness compared to a preset property.
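• Switching to the first (thicker) property when the shaking speed crosses a threshold could be sketched as below; the threshold and scale factor are hypothetical, not taken from this document:

```python
def stroke_thickness(base_thickness, shake_hz, threshold_hz=3.0, scale=2.0):
    """Return the stroke thickness for drawing input: when the shaking speed
    exceeds the threshold, switch to the 'first property' (a relatively larger
    thickness); otherwise keep the preset property."""
    return base_thickness * scale if shake_hz >= threshold_hz else base_thickness
```
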
  • FIG. 15 illustrates an electronic device 1501 that provides various user interfaces, according to various embodiments of the present disclosure.
  • the electronic device 1501 (eg, the electronic device 101 of FIG. 1 ) may display a user interface providing various functions based on location information (eg, GPS information) or communication information (eg, Bluetooth communication information).
  • the electronic device 1501 may perform Bluetooth communication with an external device (eg, a car).
  • the electronic device 1501 may receive a first signal from the input device 1502 (eg, the input device 302 of FIG. 3 ) while performing Bluetooth communication with the external device.
  • the electronic device 1501 may receive a first signal including motion information of the input device 1502 .
  • the electronic device 1501 may display at least one application (eg, a navigation application 1561 , a music application 1562 , and/or an SNS application 1563 ) on the flexible display 1560 (eg, the display 360 of FIG. 3 ).
  • the at least one application may be referred to as an application determined by the electronic device 1501 based on location information and/or communication information of the electronic device 1501 .
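The application determination described above (eg, surfacing a navigation application while Bluetooth-connected to a car) could be sketched as follows; this is an illustrative sketch only, and the rule table, device category, and application names are assumptions, not part of the disclosure:

```python
# Illustrative sketch (not the disclosed implementation): determine which
# applications to display from communication and location information.
# The rule table, device category, and application names are assumptions.

def applications_to_display(connected_bt_device=None, location_category=None):
    """Return application names ordered for display on the flexible display."""
    apps = []
    if connected_bt_device == "car":
        # While paired with a car over Bluetooth, surface driving-related apps.
        apps += ["navigation", "music"]
    if location_category == "outdoors":
        apps.append("sns")
    return apps or ["home"]  # fall back to a default screen
```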
  • FIG. 16 illustrates an electronic device 1601 that provides various user interfaces according to various embodiments of the present disclosure.
  • the electronic device 1601 may provide various functions based on communication with the input device 1602 (eg, the input device 302 of FIG. 3 ) while executing various applications.
  • the electronic device 1601 may display a map application execution screen on a flexible display (eg, the display 360 of FIG. 3 ).
  • an operation of the electronic device 1601 providing various functions based on communication with the input device 1602 may be sequentially described.
  • the electronic device 1601 may receive the first signal transmitted from the input device 1602 , and may receive a touch input of the input device 1602 in one area of the first user interface 1621 displayed on the flexible display.
  • the first user interface 1621 may include a plurality of GUIs corresponding to a search function and/or a theme selection function.
  • the electronic device 1601 may receive a touch input from the input device 1602 for at least one theme among the theme selection functions included in the first user interface 1621 .
  • the electronic device 1601 may identify one theme among a plurality of themes based on the touch input.
  • the electronic device 1601 may obtain motion information related to the motion 1605 of the input device 1602 based on a first signal transmitted from the input device 1602 . For example, when it is determined that the shaking speed of the input device 1602 included in the motion information satisfies a specified condition, the electronic device 1601 may display the second user interface 1622 .
  • the second user interface 1622 may include place information corresponding to the identified theme.
  • the electronic device 1601 may display a place corresponding to the place information included in the second user interface 1622 on the map screen.
  • the electronic device 1601 may display a place corresponding to the place information on a map screen displayed on the flexible display 1660 as a plurality of GUIs 1623 (eg, icons).
  • the electronic device 1601 may display at least some of the plurality of GUIs 1623 by changing their size based on a specified weight (eg, the number of searches and/or visits).
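The weight-based sizing described above could be sketched as follows; this is an illustrative sketch only, and the base size and the linear scaling rule are assumptions:

```python
# Illustrative sketch (not the disclosed implementation): scale place-marker
# GUIs by a weight such as the number of searches and/or visits.
# The base size and linear scaling rule are assumptions.

BASE_SIZE = 24  # assumed base icon size in pixels

def gui_sizes(weights):
    """Map each place to an icon size that grows with its weight."""
    if not weights:
        return {}
    top = max(weights.values())
    return {place: int(BASE_SIZE * (1 + w / top)) for place, w in weights.items()}
```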
  • FIG. 17 illustrates an operation flowchart 1700 of an electronic device according to various embodiments of the present disclosure.
  • the electronic device may perform the operations of FIG. 17 .
  • the processor of the electronic device (eg, the processor 120 of FIG. 1 ) may be set to perform the operations of FIG. 17 when instructions stored in the memory (eg, the memory 130 of FIG. 1 ) are executed.
  • the electronic device may acquire motion information of the stylus pen in response to a first signal transmitted from the stylus pen (eg, the input device 302 of FIG. 3 ).
  • the first signal may be referred to as a control signal that the stylus pen generates to include motion information of the stylus pen, acquired using at least one sensor when an external input is detected on a button (eg, the button 410 of FIG. 4 ) included in the stylus pen.
  • the movement information of the stylus pen may include information related to the number of shakes, a shaking speed, and/or a shaking direction of the stylus pen.
  • the at least one sensor may include an acceleration sensor, a piezo sensor, and/or a gyro sensor.
  • the electronic device may display a user interface on the flexible display based on the motion information. For example, when the electronic device operates in the first form, the electronic device may display a first user interface including a plurality of icons (eg, reference numerals 551 to 556 of FIG. 5 ) on one area of the flexible display. As another example, when the electronic device operates in the second form, the electronic device may display a second user interface including a plurality of thumbnails (eg, reference numerals 661 to 667 of FIG. 6 ) on one area of the flexible display.
  • the electronic device may display a first user interface including a plurality of icons on a first display area of the flexible display and a second user interface including a plurality of thumbnails on a second display area.
  • the electronic device may store data related to execution information of a plurality of applications in a memory (eg, the memory 130 of FIG. 1 ).
  • the electronic device may generate a weight for each application based on the data, and display a user interface determined based on the weight.
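The per-application weight generation described in the two operations above could be sketched as follows; this is an illustrative sketch only, and the log format and slot count are assumptions:

```python
# Illustrative sketch (not the disclosed implementation): derive a weight for
# each application from stored execution records, then pick the highest-weight
# applications for the displayed user interface. The log format is an assumption.
from collections import Counter

def app_weights(execution_log):
    """execution_log: list of application names, one entry per recorded launch."""
    return Counter(execution_log)

def apps_for_user_interface(execution_log, slots=3):
    """Return up to `slots` applications ordered by descending weight."""
    return [app for app, _ in app_weights(execution_log).most_common(slots)]
```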
  • the electronic device may receive a touch input for an area of the user interface, and identify at least one application based on the touch input. For example, when a touch input for at least one of the plurality of icons and/or the plurality of thumbnails is detected, the electronic device may identify at least one application corresponding to the at least one icon and/or at least one thumbnail on which the touch input is detected.
  • the electronic device may receive a second signal from the stylus pen.
  • the second signal may be referred to as a control signal transmitted from the stylus pen when an external input to a button included in the stylus pen is no longer detected.
  • when the electronic device receives the second signal from the stylus pen (eg, operation 1720 - Yes), the electronic device may perform operation 1725 .
  • otherwise (eg, operation 1720 - No), the electronic device may continue to perform operation 1710 .
  • the electronic device may display an execution screen of at least one identified application on the flexible display.
  • the electronic device may display execution screens having various structures on the flexible display.
  • the electronic device may store a correlation between a plurality of applications and the stylus pen in the memory.
  • the electronic device may determine an area in which to display an execution screen of at least one application based on the correlation.
  • the electronic device may display an application having a relatively high correlation with the stylus pen on the first display area, and display an application having a relatively low correlation with the stylus pen on the second display area.
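The flow of FIG. 17 described above (a first signal opening the user interface, a touch input identifying an application, and a second signal triggering execution) could be sketched as follows; this is an illustrative sketch only, and the signal representation and return values are assumptions:

```python
# Illustrative sketch (not the disclosed implementation) of the FIG. 17 flow:
# a first signal (button pressed, carrying motion information) causes the user
# interface to be displayed; a touch input identifies an application; a second
# signal (button released) triggers execution. Signal fields are assumptions.

def handle_stylus_session(signals, touch_target=None):
    """signals: iterable of dicts such as {"type": "first"} or {"type": "second"}."""
    ui_shown = False
    identified = None
    for signal in signals:
        if signal["type"] == "first":
            ui_shown = True                # display the user interface (operation 1710)
            if touch_target is not None:
                identified = touch_target  # identify the application from the touch input
        elif signal["type"] == "second" and identified is not None:
            # second signal received (operation 1720 - Yes): execute (operation 1725)
            return f"executing {identified}"
    return "waiting" if ui_shown else "idle"
```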
  • an electronic device may include a flexible display, a wireless communication circuit, a processor, a stylus pen that is detachably inserted into at least one region inside a housing and communicates with the wireless communication circuit, and a memory operatively connected to the processor.
  • the memory may store one or more instructions that, when executed, cause the processor to receive a first signal including motion information of the stylus pen from the stylus pen, display a first user interface including a plurality of icons on the flexible display of the electronic device operating in a first form based on the motion information, identify, when a touch input to at least one of the plurality of icons is detected, at least one application corresponding to the at least one icon on which the touch input is detected, execute the identified at least one application in response to a second signal transmitted from the stylus pen, and display an execution screen of the executed at least one application on the flexible display.
  • when the one or more instructions are executed, the processor may display a second user interface including a plurality of thumbnails on the flexible display of the electronic device operating in a second form based on the motion information, identify, when a touch input to at least one of the plurality of thumbnails is detected, at least one application corresponding to the at least one thumbnail on which the touch input is detected, execute the identified at least one application in response to the second signal transmitted from the stylus pen, and display an execution screen of the executed at least one application on the flexible display.
  • when the one or more instructions are executed, the processor may display the first user interface including the plurality of icons in a first display area of the flexible display, display a second user interface including a plurality of thumbnails in a second display area, identify at least one application based on a touch input to at least one of the plurality of icons or the plurality of thumbnails, execute the identified at least one application in response to the second signal transmitted from the stylus pen, and display an execution screen of the executed at least one application on the flexible display.
  • the memory may further store correlations of a plurality of applications with the stylus pen, and the one or more instructions, when executed, may cause the processor to determine an area in which the execution screen of the at least one application is to be displayed as at least one of the first display area and the second display area based on the correlation, and to display the execution screen.
  • when the one or more instructions are executed and the processor detects a drag input to the execution screen of the at least one application displayed on the first display area, the processor may be set to move the execution screen to at least a part of the second display area and display it.
  • the memory may further store data related to execution information of a plurality of applications, the plurality of icons may include a first icon and a second icon, and the one or more instructions, when executed, may cause the processor to generate a weight based on the data related to execution information of a first application corresponding to the first icon and a second application corresponding to the second icon, and to display the first user interface determined based on the weight on the flexible display.
  • when the one or more instructions are executed and a touch input to the first icon and the second icon is sensed, the processor may execute the first application and the second application, and display the first application execution screen and the second application execution screen determined based on the weight on the flexible display.
  • the motion information may include at least one of information related to the number of shakes of the stylus pen, a shaking speed, and a shaking direction.
  • when the one or more instructions are executed, the processor may be set to determine at least one of an area size, an area arrangement, or whether to display the plurality of icons and the execution screen based on the motion information, and to display them on the flexible display.
  • the first signal may correspond to a control signal transmitted from the stylus pen when an external input to a button included in the stylus pen is sensed, and the second signal may correspond to a control signal transmitted from the stylus pen when the external input to the button is no longer sensed.
  • a method for an electronic device to display a user interface may include receiving a first signal including motion information of a stylus pen from the stylus pen, displaying a first user interface including a plurality of icons on a flexible display of the electronic device operating in a first form based on the motion information, identifying, when a touch input to at least one of the plurality of icons is detected, at least one application corresponding to the at least one icon on which the touch input is detected, executing the identified at least one application in response to a second signal transmitted from the stylus pen, and displaying an execution screen of the executed at least one application on the flexible display.
  • the method may further include executing the at least one identified application and displaying an execution screen of the at least one executed application on the flexible display.
  • the method may further include displaying the first user interface including the plurality of icons on a first display area of the flexible display of the electronic device operating in a third form, displaying a second user interface including a plurality of thumbnails on a second display area, identifying at least one application based on a touch input to at least one of the plurality of icons or the plurality of thumbnails, executing the identified at least one application in response to the second signal transmitted from the stylus pen, and displaying an execution screen of the executed at least one application on the flexible display.
  • the operation of displaying the execution screen of the at least one executed application on the flexible display may include determining an area in which to display the execution screen of the at least one application identified based on the touch input as at least one of the first display area and the second display area based on a correlation stored in the memory, and displaying the execution screen.
  • when a drag input to the execution screen displayed on the first display area is detected, the method may further include moving the execution screen to at least a part of the second display area and displaying it.
  • the plurality of icons may include a first icon and a second icon, and the operation of displaying the first user interface including the plurality of icons on the flexible display may further include generating a weight based on data related to execution information of a first application corresponding to the first icon and a second application corresponding to the second icon, and displaying the first user interface determined based on the weight on the flexible display.
  • the method may further include executing the first application and the second application, and displaying, on the flexible display, the first application execution screen and the second application execution screen determined based on the weight.
  • the method may further include determining at least one of an area size, an area arrangement, or whether to display the plurality of icons and the execution screen based on the motion information, and displaying them on the flexible display.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An electronic device including a flexible display, a wireless communication circuit, a processor, and a memory is disclosed. The processor may receive a first signal including motion information about a stylus pen from the stylus pen, display a first user interface, including a plurality of icons, on the flexible display of the electronic device operating in a first form on the basis of the motion information, identify, when a touch input for at least one of the plurality of icons is detected, an application corresponding to the icon on which the touch input is detected, execute the identified application in response to a second signal transmitted by the stylus pen, and display an execution screen of the executed application on the flexible display. Various other embodiments identified in the description are also disclosed.
PCT/KR2021/012865 2020-09-29 2021-09-17 Method for displaying user interface and electronic device supporting same Ceased WO2022071686A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2020-0127193 2020-09-29
KR1020200127193A KR20220043600A (ko) Method for displaying user interface and electronic device supporting same

Publications (1)

Publication Number Publication Date
WO2022071686A1 true WO2022071686A1 (fr) 2022-04-07

Family

ID=80950754

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/012865 Ceased WO2022071686A1 (fr) Method for displaying user interface and electronic device supporting same

Country Status (2)

Country Link
KR (1) KR20220043600A (fr)
WO (1) WO2022071686A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20230022740A (ko) 2021-08-09 2023-02-16 삼성전자주식회사 전자 장치 및 객체의 움직임 동작을 개선하는 방법

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140026711A (ko) * 2012-08-23 LG Electronics Inc. Mobile terminal and control method thereof
KR20140096752A (ko) * 2013-01-29 LG Electronics Inc. Mobile terminal and mobile terminal control method
KR20140141089A (ko) * 2013-05-31 Samsung Electronics Co., Ltd. Electronic device for executing an application in response to user input
KR20160148959A (ko) * 2015-06-17 LG Electronics Inc. Mobile terminal and control method thereof
KR102083918B1 (ko) * 2012-10-10 Samsung Electronics Co., Ltd. Multi-display apparatus and control method thereof


Also Published As

Publication number Publication date
KR20220043600A (ko) 2022-04-05

Similar Documents

Publication Publication Date Title
WO2022131549A1 (fr) Dispositif électronique et procédé de fonctionnement d'un dispositif électronique
WO2022154423A1 (fr) Appareil électronique et procédé de traitement de saisie à partir d'un stylet dans un appareil électronique
WO2022164200A1 (fr) Dispositif électronique, et procédé de reconnaissance de chiffres basé sur une analyse de traits dans un dispositif électronique
WO2022085940A1 (fr) Procédé et appareil de commande d'affichage d'une pluralité d'objets sur un dispositif électronique
WO2024063380A1 (fr) Dispositif électronique et procédé de commande d'écran affiché sur un dispositif d'affichage souple
WO2023054948A1 (fr) Dispositif électronique comprenant un écran variable et son procédé de fonctionnement
WO2023063681A1 (fr) Dispositif électronique doté d'un dispositif d'affichage flexible, et procédé de commande de région d'entrée pour étendre et rétrécir un affichage dans un dispositif électronique doté d'un dispositif d'affichage flexible
WO2023018173A1 (fr) Dispositif électronique et procédé permettant de faire fonctionner un stylo électronique
WO2022149954A1 (fr) Dispositif électronique ayant un écran souple et procédé de fourniture d'un panneau de commande en fonction d'un changement de mode de celui-ci
WO2022114885A1 (fr) Procédé de commande d'un dispositif électronique à l'aide d'un stylet, et dispositif électronique destiné à recevoir une entrée à partir d'un stylet en utilisant le procédé
WO2022119276A1 (fr) Dispositif électronique d'affichage souple et procédé de fonctionnement associé
WO2022071686A1 (fr) Procédé d'affichage d'interface utilisateur et dispositif électronique le prenant en charge
WO2022031048A1 (fr) Dispositif électronique et procédé d'affichage d'un pointeur de stylo électronique associé
WO2022119055A1 (fr) Dispositif électronique doté d'un dispositif d'affichage pliable et procédé de commande associé
WO2023149782A1 (fr) Dispositif électronique et procédé de fourniture d'une fonction haptique
WO2023013904A1 (fr) Procédé et dispositif de commande d'écran
WO2023008854A1 (fr) Dispositif électronique comprenant un capteur optique intégré dans une unité d'affichage
WO2022086071A1 (fr) Dispositif électronique pour commander le fonctionnement d'un dispositif de type stylo électronique, procédé de fonctionnement dans un dispositif électronique, et support de stockage non transitoire
WO2022149695A1 (fr) Dispositif électronique comprenant un double dispositif et dispositif de connexion reliant ce double dispositif
WO2022097996A1 (fr) Appareil électronique comprenant un écran souple
WO2022080825A1 (fr) Procédé d'affichage d'écran de dispositif pliable, et dispositif associé
WO2021246638A1 (fr) Procédé de capture d'image et dispositif électronique associé
WO2024181683A1 (fr) Dispositif électronique comportant un dispositif d'affichage souple et procédé de commande de dispositif d'affichage
WO2024014686A1 (fr) Dispositif électronique comprenant un écran comprenant un circuit tactile qui traite un contact d'un objet externe
WO2022124675A1 (fr) Procédé, dispositif électronique et support de stockage permettant de commander un capteur optique sur la base d'informations de flexion à l'aide d'un dispositif d'affichage étirable

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21875961

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21875961

Country of ref document: EP

Kind code of ref document: A1