WO2023182654A1 - Electronic device and method for identifying user input - Google Patents

Electronic device and method for identifying user input

Info

Publication number
WO2023182654A1
Authority
WO
WIPO (PCT)
Prior art keywords
user input
processor
housing
touch
display
Prior art date
Application number
PCT/KR2023/002036
Other languages
English (en)
Korean (ko)
Inventor
최양수
김숙동
박지혜
이중협
Original Assignee
삼성전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from Korean application KR1020220071778A (published as KR20230137204A)
Application filed by 삼성전자 주식회사
Publication of WO2023182654A1

Classifications

    • G06F1/16 Constructional details or arrangements
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 Digitisers characterised by capacitive transducing means
    • G06F3/046 Digitisers characterised by electromagnetic transducing means
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F9/451 Execution arrangements for user interfaces

Definitions

  • The flexible display included in the electronic device may be inserted into or extracted from the housing of the electronic device in order to change the size of the display area of the flexible display.
  • Accordingly, a method for changing the size of the display area of a flexible display may be required.
  • An electronic device may include a physical button for changing the size of the display area of a flexible display.
  • A method may also be required for changing the display area of the flexible display based on a user input received through the display.
  • According to an embodiment, an electronic device may include a first housing, a second housing slidably coupled to the first housing, a flexible display disposed on a surface formed by the first housing and the second housing and insertable into or extractable from the housing, a driving device for driving the second housing to slide in or out, a memory, and at least one processor operatively connected to the flexible display, the driving device, and the memory. The at least one processor may receive a user input through the flexible display and determine whether the user input corresponds to a reference user input.
  • According to an embodiment, an electronic device may include a first housing, a second housing slidably coupled to the first housing, a flexible display disposed on a surface formed by the first housing and the second housing and insertable into or extractable from the housing, a memory, and at least one processor operably connected to the flexible display and the memory.
  • The at least one processor may be configured to display, using the flexible display, a screen for setting a reference user input for changing the state of the electronic device, receive at least one first touch input through a touch input field in the screen, set the at least one first touch input as the reference user input, identify, in response to receiving at least one second touch input after the at least one first touch input is set as the reference user input, whether the at least one second touch input corresponds to the reference user input, and change the state of the electronic device in response to identifying that the at least one second touch input corresponds to the reference user input.
  • the electronic device may set a reference user input.
  • the electronic device may change the state of the electronic device based on identifying a user input that corresponds to a set reference user input.
  • The electronic device may change its state based on a reference user input set individually for each user.
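  • The following is a minimal, illustrative sketch (not the patented implementation) of how a reference user input could be stored and later compared with a newly received touch input to decide whether to change the device state; all names (TouchSample, ReferenceInputStore, similarity, extend_display) and the threshold value are assumptions introduced only for illustration.

```python
# Illustrative sketch only: store a user-defined reference input and change the
# device state when a later input is sufficiently similar to it.
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float    # touch coordinate on the flexible display
    y: float
    t_ms: int   # timestamp of the sample in milliseconds

class ReferenceInputStore:
    """Holds the user-defined reference user input (trigger pattern)."""
    def __init__(self) -> None:
        self.reference: list[TouchSample] | None = None

    def set_reference(self, samples: list[TouchSample]) -> None:
        # Called after the user draws a pattern in the touch input field of the
        # setting screen.
        self.reference = samples

def similarity(a: list[TouchSample], b: list[TouchSample]) -> float:
    """Coarse similarity: inverse of the mean point-to-point distance (illustrative)."""
    n = min(len(a), len(b))
    if n == 0:
        return 0.0
    mean_dist = sum(
        ((a[i].x - b[i].x) ** 2 + (a[i].y - b[i].y) ** 2) ** 0.5 for i in range(n)
    ) / n
    return 1.0 / (1.0 + mean_dist)

def extend_display() -> None:
    # Placeholder for commanding the driving device to slide the second housing out.
    print("driving device: slide second housing out")

def on_user_input(samples: list[TouchSample], store: ReferenceInputStore,
                  threshold: float = 0.8) -> None:
    # Change the device state only when the received input corresponds to the
    # reference user input.
    if store.reference is not None and similarity(samples, store.reference) >= threshold:
        extend_display()
```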
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to one embodiment.
  • FIG. 2A shows a front view of a first state of an electronic device, according to one embodiment.
  • FIG. 2B shows a rear view of a first state of an electronic device, according to one embodiment.
  • FIG. 3A shows a front view of a second state of an electronic device, according to one embodiment.
  • FIG. 3B shows a rear view of a second state of an electronic device, according to one embodiment.
  • Figure 4 shows an exploded perspective view of an electronic device according to one embodiment.
  • Figure 5 shows a simplified block diagram of an electronic device, according to one embodiment.
  • FIG. 6 illustrates an example in which a processor of an electronic device performs an operation by referring to a database in a memory, according to an embodiment.
  • Figure 7 shows an example of capacitance values identified through a touch IC according to one embodiment.
  • Figure 8 shows a flowchart regarding the operation of an electronic device according to an embodiment.
  • Figure 9 shows an example of an operation of an electronic device according to an embodiment.
  • Figure 10 shows an example of an operation of an electronic device according to an embodiment.
  • FIGS. 11A and 11B show examples of operations of an electronic device according to an embodiment.
  • Figure 12 shows a flowchart regarding the operation of an electronic device according to an embodiment.
  • Figure 13 shows a flowchart regarding the operation of an electronic device according to an embodiment.
  • Figure 14 shows an example of an operation of an electronic device according to an embodiment.
  • FIGS. 15A and 15B show examples of operations of an electronic device according to an embodiment.
  • FIGS. 16A and 16B show examples of operations of an electronic device according to an embodiment.
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to one embodiment.
  • Referring to FIG. 1, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with at least one of the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to one embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101.
  • In some embodiments, some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into one component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • The processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) and an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor 121.
  • When the electronic device 101 includes both the main processor 121 and the auxiliary processor 123, the auxiliary processor 123 may be set to use lower power than the main processor 121 or to be specialized for a designated function.
  • The auxiliary processor 123 may be implemented separately from the main processor 121 or as part of it.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190) on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state.
  • According to one embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing artificial intelligence models.
  • Artificial intelligence models may be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself on which the artificial intelligence model runs, or through a separate server (e.g., the server 108).
  • Learning algorithms may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but are not limited thereto.
  • An artificial intelligence model may include multiple artificial neural network layers.
  • An artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the above, but is not limited to the above examples.
  • artificial intelligence models may additionally or alternatively include software structures.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176) of the electronic device 101. Data may include, for example, input data or output data for software (e.g., program 140) and instructions related thereto.
  • Memory 130 may include volatile memory 132 or non-volatile memory 134.
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142, middleware 144, or application 146.
  • the input module 150 may receive commands or data to be used in a component of the electronic device 101 (e.g., the processor 120) from outside the electronic device 101 (e.g., a user).
  • the input module 150 may include, for example, a microphone, mouse, keyboard, keys (eg, buttons), or digital pen (eg, stylus pen).
  • the sound output module 155 may output sound signals to the outside of the electronic device 101.
  • the sound output module 155 may include, for example, a speaker or a receiver. Speakers can be used for general purposes such as multimedia playback or recording playback.
  • the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 160 can visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the device.
  • the display module 160 may include a touch sensor configured to detect a touch, or a pressure sensor configured to measure the intensity of force generated by the touch.
  • The audio module 170 may convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to one embodiment, the audio module 170 may acquire sound through the input module 150, or may output sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and generate an electrical signal or data value corresponding to the detected state.
  • According to one embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or a light sensor.
  • the interface 177 may support one or more designated protocols that can be used to connect the electronic device 101 directly or wirelessly with an external electronic device (eg, the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 can convert electrical signals into mechanical stimulation (e.g., vibration or movement) or electrical stimulation that the user can perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 can capture still images and moving images.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 can manage power supplied to the electronic device 101.
  • the power management module 188 may be implemented as at least a part of, for example, a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel. The communication module 190 may operate independently of the processor 120 (e.g., an application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication.
  • According to one embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • Among these communication modules, the corresponding communication module may communicate with the external electronic device 104 through a first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or a second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or WAN)).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support 5G networks beyond 4G networks and next-generation communication technologies, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • The wireless communication module 192 may support a high frequency band (e.g., an mmWave band), for example, to achieve a high data rate.
  • The wireless communication module 192 may support various technologies for securing performance in a high frequency band, for example, beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large scale antenna.
  • The wireless communication module 192 may support various requirements specified for the electronic device 101, an external electronic device (e.g., the electronic device 104), or a network system (e.g., the second network 199).
  • According to one embodiment, the wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or more) for realizing eMBB, loss coverage (e.g., 164 dB or less) for realizing mMTC, or U-plane latency for realizing URLLC.
  • the antenna module 197 may transmit or receive signals or power to or from the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (eg, PCB).
  • The antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. Signals or power may be transmitted or received between the communication module 190 and an external electronic device through the at least one selected antenna.
  • According to some embodiments, other components (e.g., a radio frequency integrated circuit (RFIC)) may be additionally formed as part of the antenna module 197.
  • According to one embodiment, a mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first side (e.g., a bottom side) of the printed circuit board and capable of supporting a designated high frequency band (e.g., an mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second side (e.g., a top or side surface) of the printed circuit board and capable of transmitting or receiving signals in the designated high frequency band.
  • At least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the external electronic devices 102 or 104 may be of the same or different type as the electronic device 101.
  • all or part of the operations performed in the electronic device 101 may be executed in one or more of the external electronic devices 102, 104, or 108.
  • For example, when the electronic device 101 needs to perform a function or service, the electronic device 101 may, instead of or in addition to executing the function or service on its own, request one or more external electronic devices to perform at least part of the function or service.
  • One or more external electronic devices that have received the request may execute at least part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device 101.
  • the electronic device 101 may process the result as is or additionally and provide it as at least part of a response to the request.
  • For this purpose, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of Things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or server 108 may be included in the second network 199.
  • the electronic device 101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • FIG. 2A shows a front view of a first state of an electronic device, according to one embodiment.
  • FIG. 2B shows a rear view of a first state of an electronic device, according to one embodiment.
  • FIG. 3A shows a front view of a second state of an electronic device, according to one embodiment.
  • FIG. 3B shows a rear view of a second state of an electronic device, according to one embodiment.
  • The electronic device 200 (e.g., the electronic device 101 of FIG. 1) according to an embodiment may include a first housing 210 and a second housing 220. According to one embodiment, the second housing 220 may move from the first housing 210 in a designated direction, for example, a first direction (+y direction). For example, the second housing 220 may slide a specified distance from the first housing 210 in the first direction (+y direction). According to one embodiment, the second housing 220 may reciprocate within a specified distance from a portion of the first housing 210 in the first direction (+y direction).
  • A state in which the second housing 220 has slid toward the first housing 210, that is, in a second direction (-y direction) opposite to the first direction (+y direction), may be defined as a first state (e.g., a contracted state or slide-in state) of the electronic device 200. In one embodiment, the first state of the electronic device 200 may be defined as a state in which the second portion 230b of the display 230 is not visually exposed to the outside. The first state of the electronic device 200 may mean a state in which the second portion 230b of the display 230 is located inside the second housing 220.
  • A state in which the second housing 220 has slid from the first housing 210 in the first direction (+y direction) may be defined as a second state (e.g., an extended state or slide-out state) of the electronic device 200.
  • the second state of the electronic device 200 may be defined as a state in which the second portion 230b of the display 230 is visually exposed to the outside.
  • the second state of the electronic device 200 may mean a state in which the second part 230b of the display 230 is located outside the second housing 220.
  • As the second housing 220 moves from the first housing 210 in the first direction (+y direction), at least a portion of the second housing 220 and/or the second portion 230b of the display 230 may be drawn out, forming a drawn-out length d1 corresponding to the moving distance.
  • the second housing 220 may reciprocate within a specified distance d2.
  • the drawn-out length d1 may have a size ranging from about 0 to the specified distance d2.
  • a first state may be referred to as a first shape and a second state may be referred to as a second shape.
  • the first shape may include a normal state, a collapsed state, or a closed state
  • the second shape may include an open state.
  • the electronic device 400 may form a third state (eg, an intermediate state) that is a state between the first state and the second state.
  • the third state may be referred to as the third shape
  • the third shape may include a free stop state.
  • the ratio of height (h) and width (w) may be 4.5 to 3.
  • the ratio of the height (h) and the width (w) may be 21 to 9.
  • the ratio of height (h) and width (w) may be 16 to 9.
  • When switching between the first state and the second state, the electronic device 200 according to one embodiment may be switched manually by a user's operation, or automatically through a driving module (not shown) disposed inside the first housing 210 and the second housing 220.
  • the driving module may trigger an operation based on a user input.
  • user input for triggering the operation of the driving module may include touch input, force touch input, and/or gesture input through the display 230.
  • the user input for triggering the operation of the driving module may include a voice input, or an input on a physical button exposed to the outside of the first housing 210 or the second housing 220.
  • the driving module may be driven in a semi-automatic manner in which an operation is triggered when a manual operation by an external force of the user is detected.
  • The electronic device 200 may be referred to as a "slidable electronic device" since the second housing 220 is designed to slide, or as a "rollable electronic device" since at least a portion of the display 230 is designed to be rolled up inside the second housing 220 (or the first housing 210) based on the slide movement of the second housing 220.
  • the second housing 220 may be coupled to the first housing 210 so that it can at least partially slide.
  • the combined form of the first housing 210 and the second housing 220 is not limited to the form and combination shown in FIGS. 2A, 2B, 3A, and 3B, and may also be implemented by other shapes and/or combinations of parts.
  • The first housing 210 of the electronic device 200 may include a book cover 216 surrounding the inner space of the first housing 210 and a rear plate 211 surrounding the rear of the book cover 216.
  • the second housing 220 of the electronic device 200 may include a slide cover 221 surrounding the internal space of the second housing 220.
  • The second housing 220 (or the slide cover 221) may include a first cover area 220a that is not inserted into the first housing 210 and is always visually exposed to the outside in both the first state and the second state of the electronic device 200, and a second cover area 220b that is inserted into or extracted from the internal space of the first housing 210. According to one embodiment, the second cover area 220b of the second housing 220 may not be visually exposed to the outside in the first state, but may be visually exposed to the outside in the second state.
  • the display 230 may be arranged to be visually exposed from the outside through the front direction (eg, +z direction) of each of the first housing 210 and the second housing 220.
  • the display area of the display 230 may include a first part 230a and a second part 230b.
  • The first portion 230a of the display 230 may be a display area that is always visually exposed to the outside, regardless of whether the electronic device 200 is in the first state or the second state.
  • The first portion 230a of the display 230 may be fixed without movement regardless of the slide movement of the second housing 220.
  • The second portion 230b of the display 230 is a display area extending from one end of the first portion 230a, and, in conjunction with the slide movement of the second housing 220, may be introduced into the internal space of the second housing 220 or drawn out from the internal space of the second housing 220.
  • a hole (not shown) through which the second part 230b of the display 230 is extracted or retracted may be arranged adjacent to the side of the second housing 220 in the +y direction.
  • the second portion 230b of the display 230 may be pulled out or retracted from a boundary portion of the second housing 220 in the +y direction.
  • the second portion 230b of the display 230 may be pulled out from the inner space of the second housing 220 and visually exposed from the outside.
  • the second portion 230b of the display 230 in the first state, may be inserted into the internal space of the second housing 220 and not visually exposed from the outside.
  • the display 230 may include a flexible display.
  • the second part 230b of the display 230 may be rolled into the inner space of the second housing 220 in the first state and may be introduced in a bent state.
  • the display area of the display 230 may be such that the first part 230a and the second part 230b of the display 230 are visually exposed from the outside.
  • the electronic device 200 may include a camera module 261 and/or a flash 262.
  • the camera module 261 and/or the flash 262 may be obscured by the rear plate 211 of the first housing 210.
  • the camera module 261 and/or the flash 262 may be exposed to the outside through an opening formed in the slide cover 221.
  • The electronic device 200 may provide a structure in which the camera module 261 and/or the flash 262 are exposed to the outside in the first state.
  • Although the camera module 261 is shown as including one camera, the camera module 261 may include a plurality of cameras.
  • the camera module 261 may include a wide-angle camera, an ultra-wide-angle camera, a telephoto camera, a proximity camera, and/or a depth camera.
  • the camera module 261 may include one or more lenses, an image sensor, and/or an image signal processor.
  • the flash 262 may include, for example, a light emitting diode or a xenon lamp.
  • The electronic device 200 may include a sensor module (not shown) and/or a camera module (not shown) disposed below the display 230 (e.g., in the -z direction from the display 230).
  • the sensor module may detect the external environment based on information (eg, light) received through the display 230.
  • the sensor module may include at least one of a receiver, a proximity sensor, an ultrasonic sensor, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, a motor encoder, or an indicator.
  • The electronic device 200 may detect the draw-out length (e.g., length A) using the sensor module.
  • The electronic device 200 may generate draw-out information about the detected degree of draw-out.
  • The electronic device 200 may detect and/or confirm the extent to which the second housing 220 has been drawn out using the draw-out information.
  • The draw-out information may include information about the draw-out length of the second housing 220, as in the illustrative sketch below.
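  • As a non-limiting illustration, the draw-out length could be derived from a motor-encoder reading; the encoder resolution, gear ratio, and maximum distance below are assumed values, not taken from the disclosure.

```python
# Illustrative only: derive the draw-out length d1 of the second housing from a
# motor-encoder reading and clamp it to the specified distance d2.
MAX_DRAW_OUT_MM = 60.0       # stands in for the specified distance d2 (assumed value)
MM_PER_ENCODER_TICK = 0.05   # assumed gear/encoder conversion factor

def draw_out_length_mm(encoder_ticks: int) -> float:
    """Return the current draw-out length, clamped to the range [0, d2]."""
    length = encoder_ticks * MM_PER_ENCODER_TICK
    return max(0.0, min(length, MAX_DRAW_OUT_MM))
```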
  • According to various embodiments, the electronic device 200 may include housings (e.g., the first housing 210 and the second housing 220) and a display 230 that is supported by the housings 210 and 220 and whose display area is adjusted in response to movement of at least a portion of the housings 210 and 220 in the first direction (+y direction).
  • The display area of the display 230 may include a first portion 230a that is fixedly exposed to the outside regardless of whether at least a portion of the second housing 220 moves in the first direction (+y direction), and a second portion 230b that extends from one end of the first portion 230a and is exposed to the outside by being drawn out from the inner space of the first housing 210 in response to movement of at least a portion of the second housing 220 in the first direction (+y direction).
  • FIG. 2A to 3B illustrate an example in which the second housing 220 of the electronic device 200 moves in the first direction (+y direction), but the present invention is not limited thereto.
  • the second housing 220 of the electronic device 200 may be moved in a different direction (+x direction) distinct from the first direction.
  • the height (h) may be fixed and the width (w) may be changed.
  • Figure 4 shows an exploded perspective view of an electronic device according to one embodiment.
  • the electronic device 200 may include a first housing 210, a second housing 220, and a display 230.
  • the first housing 210 may include a rear plate 211, a book cover 216, and/or a first support member 410.
  • the book cover 216 may form part of the exterior of the electronic device 200, and the rear plate 211 may form part of the rear of the electronic device 200.
  • the rear plate 211 may be disposed in a direction facing one side of the book cover 216 (eg, -z direction).
  • the book cover 216 may form the side of the electronic device 200 and provide a surface on which the rear plate 211 will be seated.
  • the first support member 410 may be disposed inside the book cover 216.
  • the first support member 410 may be surrounded by the book cover 216.
  • the first support member 410 may be disposed between the display 230 and the book cover 216.
  • the book cover 216 may surround the space between the rear plate 211 and the first portion 230a of the display 230.
  • the first support member 410 may extend from the book cover 216 within the space.
  • the first support member 410 may support or accommodate other components included in the electronic device 200.
  • The first portion 230a of the display 230 may be disposed on one side of the first support member 410 facing one direction (e.g., the +z direction), and the first portion 230a of the display 230 may be supported by the first support member 410.
  • The first printed circuit board 431 and the driver assembly 440 may be disposed on the other side of the first support member 410 facing the direction opposite to the one direction (e.g., the -z direction).
  • The first printed circuit board 431 and the driver assembly 440 may each be seated in a recess defined by the first support member 410 and the book cover 216, or in a hole defined by the rear plate 211.
  • the first printed circuit board 431 may be disposed in the space formed between the book cover 216 and the first support member 410.
  • the wireless charging module 492 may be placed on the outer surface of the book cover 216.
  • the wireless charging module 492 may be placed between the book cover 216 and the back plate 211.
  • the wireless charging module 492 may be connected to the first printed circuit board 431.
  • the first printed circuit board 431 may electrically connect the wireless charging module 492 and the battery 490.
  • the first printed circuit board 431 may further include a power management module that distributes power between the wireless charging module 492 and the battery 490.
  • the first printed circuit board 431 has been described as a single printed circuit board, but it is not limited to this and may be composed of a plurality of printed circuit boards.
  • the first printed circuit board 431 and the driver assembly 440 may be respectively coupled to the first support member 410.
  • the printed circuit board 432 may be fixed to the first support member 410 through a coupling member such as a screw.
  • the driving unit (eg, motor) of the driving unit assembly 440 may be fixed to the first support member 410 through a coupling member.
  • the fixing method or fastening method is not limited to the above-described method.
  • the book cover 216 may be disposed between the first support member 410 and the rear plate 211. According to one embodiment, the book cover 216 may be disposed on the first support member 410. For example, the book cover 216 may be disposed on a side of the first support member 410 facing the -z direction.
  • the book cover 416 may at least partially overlap the first printed circuit board 431 when viewed in the z-axis direction.
  • the book cover 416 may cover at least a portion of the first printed circuit board 431 .
  • the book cover 416 can protect the printed circuit board 431 from physical shock.
  • the book cover 416 may be coupled to the first support member 410 through a coupling member (eg, screw).
  • the first support member 410 may include a side wall capable of accommodating the battery 490.
  • the first support member 410 includes a seating portion formed by a side wall, and the battery 490 can be accommodated in the seating portion.
  • the battery cover 491 may surround the battery 490 and, together with the side wall of the first support member 410, may form a space in which the battery 490 will be seated.
  • the second housing 220 may include a slide cover 221, a slide plate 222, and/or a second support member 420.
  • the slide cover 221 may form another part of the exterior of the electronic device 200.
  • the book cover 216 may form part of the exterior of the electronic device 200
  • the slide cover 221 may form part of another exterior of the electronic device 200.
  • the second housing 220 may be slidably coupled to the first housing 210.
  • the second cover area 220b of the second housing 220 may be within the first housing 210 .
  • the first cover area 210b of the second housing 220 may move along the first direction (+y direction) and be exposed to the outside of the first housing 210.
  • the electronic device may include guide members 481 and 482. .
  • the guide members 481 and 482 may guide the movement of the second housing 220 and support the second housing 220 .
  • the guide members 481 and 482 may prevent the second housing 220 from being tilted in the -z-axis direction with respect to the first housing 210 when the second housing 220 is pulled out.
  • the slide plate 222 may be disposed between the slide cover 221 and the second support member 420.
  • the second support member 420 may include an electrical object 430 and a second printed circuit board 432 .
  • the electrical object 430 may include a camera module, etc.
  • the slide plate 222 and the slide cover 221 may provide a structure (eg, an opening) for exposing the camera module or sensor to the outside.
  • a support bar may support the second portion 230b of the display 230 when the display 230 is expanded to the second state.
  • the support bar may be formed by combining a plurality of bars to have a shape corresponding to the shape of the second part 230b of the display 230.
  • In the first state in which the second portion 230b of the display 230 is wound within the second housing 220, the support bar may be wound within the second housing 220 together with the second portion 230b of the display 230.
  • the support bar can move along a guide rail formed on the inner surface of the slide cover 221. As the support bar 226 moves inside the slide cover 221, the second part 230b of the display 230 can be wound inside the slide cover 221.
  • the second part 230b of the display 230 may be pulled out from the inner space of the second housing 220 in the second state.
  • the second portion 230b of the display 230 may be drawn out from the inner space of the slide cover 221.
  • the second support member 420 may extend inward from the slide cover 221.
  • the second support member 420 may be surrounded by the slide cover 221 and the second portion 230b of the display 230.
  • the second support member 420 may move in the first direction (+y direction) when the display 230 is expanded.
  • In the first state, the second support member 420 may be positioned below the first portion 230a of the display 230 or the first support member 410 (e.g., in the -z direction with respect to the display 230).
  • the second support member 420 may move in the first direction (+y direction) and be disposed below the second portion 230b of the display 230.
  • the second support member 420 may surround the space between the slide cover 221 and the display 230.
  • the second support member 420 may extend from the slide cover 221 within the space.
  • the second support member 420 may support or accommodate other components included in the electronic device 200.
  • the second support member 420 may support the support bar 226.
  • the support bar 226 moves along the inner surface of the slide cover 221 and is supported by the second support member 420 to maintain the shape of the second portion 230b of the display 230.
  • a camera module 261 and a flash 262 may be disposed on the second support member 420.
  • the camera module 261 and the flash 262 may each be seated in a recess defined by the second support member 420 or a hole defined by the slide cover 221.
  • the second support member 420 may be coupled to the driving unit of the driving unit assembly 440 or the case 470 surrounding the driving unit.
  • the case 470 or the driving unit may be fixed to the second support member 420 through a coupling member such as a screw.
  • the fixing method or fastening method is not limited to the above-described method.
  • the electronic device may change the size of the display based on the movement of the second housing.
  • the electronic device may further include an external physical button. Based on input to an external physical button, the second housing of the electronic device may be moved. In this case, additional mounting space for the external physical button may be required.
  • the electronic device may display a software button for moving the second housing. Software buttons may always be displayed on some portion of the screen of an electronic device.
  • the electronic device may move the second housing based on simultaneously identifying a plurality of touch inputs. In this case, it may be difficult for the user of the electronic device to move the second housing with one hand.
  • the electronic device may receive a user input for moving the second housing from the user.
  • the electronic device may set the received user input as a reference user input.
  • the electronic device may move the second housing based on identifying a user input corresponding to the reference user input.
  • the electronic device can change the size of the display area of the display based on identifying another user input that corresponds to the reference user input.
  • Figure 5 is a simplified block diagram of an electronic device, according to an embodiment.
  • the electronic device 500 may include some or all of the components of the electronic device 101 shown in FIG. 1 or the electronic device 200 shown in FIGS. 2A to 4 .
  • the electronic device 500 may correspond to the electronic device 101 shown in FIG. 1 or the electronic device 200 shown in FIGS. 2A to 4.
  • the electronic device 500 may include a processor 510, a display 520, a sensor 530, a driving device 540, and/or a memory 550.
  • the electronic device 500 may include at least one of a processor 510, a display 520, a sensor 530, a driving device 540, and a memory 550.
  • Depending on the embodiment, at least one of the processor 510, the display 520, the sensor 530, the driving device 540, and the memory 550 may be omitted.
  • The processor 510 may be operatively or operably coupled with, or connected to, the display 520, the sensor 530, the driving device 540, and the memory 550.
  • the processor 510 may control the display 520, sensor 530, driving device 540, and memory 550.
  • the display 520, sensor 530, driving device 540, and memory 550 may be controlled by the processor 510.
  • the processor 510 may be comprised of at least one processor.
  • Processor 510 may include at least one processor.
  • processor 510 may correspond to processor 120 of FIG. 1 .
  • the processor 510 may include hardware components for processing data based on one or more instructions.
  • Hardware components for processing data may include, for example, an Arithmetic and Logic Unit (ALU), a Field Programmable Gate Array (FPGA), and/or a Central Processing Unit (CPU).
  • ALU Arithmetic and Logic Unit
  • FPGA Field Programmable Gate Array
  • CPU Central Processing Unit
  • the electronic device 500 may include a display 520.
  • the display 520 may be a rollable display or a flexible display.
  • The display 520 may be configured so that at least a portion of the display 520 can be retracted into the housing of the electronic device 500 (e.g., the second housing 220) or withdrawn from the housing of the electronic device 500.
  • display 520 may correspond to display module 160 of FIG. 1 .
  • display 520 may correspond to display 230 in FIGS. 2A-4.
  • the display 520 may include a touch panel (or touch screen) and a touch integrated circuit (IC).
  • the touch IC can identify capacitance values that change according to touch input to the touch panel. An example of capacitance values identified through the touch IC will be described later in FIG. 7.
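  • The following is a rough, illustrative sketch (not the disclosed implementation) of how per-node capacitance values reported by a touch IC could be reduced to a touch position; the grid representation and the threshold value are assumptions.

```python
# Illustrative sketch: locate a touch from a grid of capacitance deltas reported by
# the touch IC. The threshold and grid layout are assumed values.
THRESHOLD = 30  # capacitance delta above which a node is treated as touched (assumed)

def touch_centroid(cap_grid: list[list[int]]) -> tuple[float, float] | None:
    """Return the weighted centre (col, row) of all nodes above THRESHOLD, or None."""
    total = sx = sy = 0
    for row, line in enumerate(cap_grid):
        for col, value in enumerate(line):
            if value >= THRESHOLD:
                total += value
                sx += col * value
                sy += row * value
    if total == 0:
        return None  # no touch detected in this frame
    return (sx / total, sy / total)
```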
  • the display 520 may be deformable.
  • the display 520 may provide different display areas depending on the state of the electronic device 500.
  • In the first state, a display area in which the display 520 is exposed at its minimum size may be provided.
  • In the first state, this display area may be referred to as the first display area.
  • a display area where the display 520 is exposed at its maximum size may be provided. In the second state, the display area may be referred to as a second display area.
  • the display area of the display 520 may be set to the smallest.
  • the first state may be a state that provides a viewable area with a minimum size.
  • the first state may be a state in which the display 520 provides a display area exposed at the minimum size.
  • the first state may be referred to as a reduced state of the display 520.
  • the display area of the display 520 may be set to the largest.
  • the second state may be a state that provides a viewable area with the maximum size.
  • the second state may be a state in which the display 520 provides a display area exposed at its maximum size.
  • the second state may be referred to as an extended state of the display 520.
  • the electronic device 500 may be set to a third state that is an intermediate state between the first state and the second state. Based on the state of the electronic device 500 being in the third state, the display 520 may provide a third display area. The size of the third display area may be set to be larger than the size of the first display area and smaller than the size of the second display area. In the third state, the display area of the display 520 may be set to be larger than the size of the first display area and smaller than the size of the second display area.
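  • As a purely illustrative sketch of the state model described above, the device state and the ordering of the corresponding display areas could be represented as follows; the concrete area values are assumptions, and only the ordering (first smaller than third, third smaller than second) reflects the description.

```python
# Illustrative state model for the deformable display; area values are assumed.
from enum import Enum

class DeviceState(Enum):
    FIRST = "reduced"        # display area exposed at its minimum size
    THIRD = "intermediate"   # between the first and second states
    SECOND = "extended"      # display area exposed at its maximum size

DISPLAY_AREA_MM2 = {  # example values only
    DeviceState.FIRST: 9000,
    DeviceState.THIRD: 11000,
    DeviceState.SECOND: 13000,
}

assert (DISPLAY_AREA_MM2[DeviceState.FIRST]
        < DISPLAY_AREA_MM2[DeviceState.THIRD]
        < DISPLAY_AREA_MM2[DeviceState.SECOND])
```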
  • the above-described first to third states are for convenience of explanation, and the above-described first to third states may be changed depending on the embodiment.
  • the above-described first to third display areas are for convenience of explanation, and the above-described first to third display areas may be changed depending on the embodiment.
  • the electronic device 500 may include a sensor 530.
  • the sensor 530 can be used to obtain various information.
  • the sensor 530 may be used to obtain information about the electronic device 500.
  • sensor 530 may include a sliding sensor.
  • The sliding sensor may obtain information about a state in which the second housing (e.g., the second housing 220 in FIG. 2A) has been moved relative to the first housing (e.g., the first housing 210 in FIG. 2A) of the electronic device 500.
  • sensor 530 may be comprised of at least one sensor.
  • Sensor 530 may include at least one sensor.
  • sensor 530 may correspond to sensor module 176 in FIG. 1 .
  • the electronic device 500 may include a driving device 540.
  • The driving device 540 (e.g., the driving assembly 440 of FIG. 4) may be used to drive the second housing of the electronic device 500 (e.g., the second housing 220 of FIG. 2A) to slide in or out.
  • drive device 540 may include a motor, rack gear, and pinion gear.
  • the shaft of the motor may be coupled with a pinion gear (eg, pinion gear 442 in FIG. 4).
  • the processor 510 may move a rack gear (eg, rack gear 443 in FIG. 4) coupled to the first housing 210 based on rotating the pinion gear using a motor.
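  • A minimal sketch of this drive path follows, assuming a hypothetical motor driver API; the Motor class and the pinion circumference below are illustrative, not part of the disclosure.

```python
# Illustrative sketch: slide the second housing by rotating the pinion gear, which
# moves the rack gear coupled to the first housing.
class Motor:
    """Placeholder for the actual motor driver of the driving device 540."""
    def rotate(self, degrees: float) -> None:
        print(f"rotating pinion by {degrees:.1f} degrees")

PINION_CIRCUMFERENCE_MM = 31.4  # assumed pinion geometry

def slide_housing(motor: Motor, distance_mm: float) -> None:
    """Positive distance slides the second housing out; negative slides it in."""
    degrees = 360.0 * distance_mm / PINION_CIRCUMFERENCE_MM
    motor.rotate(degrees)
```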
  • the electronic device 500 may include a memory 550.
  • memory 550 may correspond to memory 130 of FIG. 1 .
  • memory 550 may be a volatile memory unit or units.
  • memory 550 may be a non-volatile memory unit or units.
  • memory 550 may be another form of computer-readable medium, such as a magnetic or optical disk.
  • the memory 550 may store data obtained based on an operation performed by the processor 510 (eg, an algorithm execution operation).
  • the memory 550 may be used to store information about the user's touch input of the electronic device 500.
  • memory 550 may include a touch model database and/or a trigger pattern database.
  • the touch model database may include data about touch input input from the electronic device 500.
  • the touch model database can be used to store information about multiple user inputs.
  • a trigger pattern database can be used to store information about user input (or patterns) defined by the user.
  • the electronic device 500 may further include various components in addition to those shown in FIG. 5 .
  • the electronic device 500 may further include a rechargeable battery and/or a communication circuit for performing wired communication and/or wireless communication with an external device.
  • FIG. 6 illustrates an example in which a processor of an electronic device performs an operation by referring to a database in a memory, according to an embodiment.
• the processor 510 may include a touch operation determiner 511, a touch recognition model modifier 512, a trigger pattern storage 513, and/or a trigger pattern similarity determiner 514.
  • the memory 550 may include a touch model database 551 and/or a trigger pattern database 552.
  • the memory 550 may further include a temporary database for temporarily storing data about at least one touch input.
  • a temporary database can be used to temporarily store data generated as touch input moves.
  • the temporary database may be used to store capacitance values obtained according to a predefined period while a user input consisting of at least one touch input is maintained.
• the touch operation determiner 511 may identify information about at least one touch input. For example, based on the at least one touch input, the touch operation determiner 511 may identify at least one of information about the area at which the at least one touch input is performed, information about the speed at which the at least one touch input is performed, and information about the direction in which the at least one touch input is performed. For another example, the touch operation determiner 511 may identify information about a path drawn by at least one touch input.
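• As a non-authoritative illustration of the kind of feature identification described above, the sketch below derives the contact area, speed, and direction of a gesture from touch samples collected at a fixed period; the sample layout is a hypothetical assumption.

```python
# Minimal sketch (hypothetical sample layout): deriving area, speed, and direction
# of a gesture from touch samples collected at a fixed period.
import math
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float             # estimated touch coordinate
    y: float
    contact_area: float  # e.g., number of areas with a large capacitance change
    t: float             # timestamp in seconds

def summarize_gesture(samples: list) -> dict:
    """Summarize a gesture as average contact area, overall speed, and direction."""
    dx = samples[-1].x - samples[0].x
    dy = samples[-1].y - samples[0].y
    duration = max(samples[-1].t - samples[0].t, 1e-6)
    return {
        "area": sum(s.contact_area for s in samples) / len(samples),
        "speed": math.hypot(dx, dy) / duration,         # distance per second
        "direction": math.degrees(math.atan2(dy, dx)),  # 0 degrees = +x axis
    }
```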
  • the touch operation determiner 511 may identify information about at least one touch input based on information stored in the touch model database 551.
  • the touch model database 551 may be used to store information about a plurality of user inputs.
  • the touch model database 551 can be used to store information about operations performed according to various touch inputs. For example, information about the operation of the electronic device 500 (eg, screen switching operation) according to a horizontal swipe input may be stored in the touch model database 551. Information about the operation (eg, scroll operation) of the electronic device 500 according to a vertical swipe input may be stored in the touch model database 551.
  • the touch operation determiner 511 may identify information about at least one touch input based on an artificial intelligence model. For example, the touch operation determiner 511 may identify information about at least one touch input by inputting data about at least one touch input into an artificial intelligence model indicated by a plurality of parameters.
  • the touch recognition model corrector 512 may update the model (or artificial intelligence model) of the touch operation determiner 511.
• the touch recognition model modifier 512 may update the model of the touch operation determiner 511 based on whether the performance of the touch operation determiner 511 is degraded or whether an operation of the electronic device 500 according to a touch input is newly set.
  • the trigger pattern storage 513 may be used to store an operation of the electronic device 500 performed according to at least one touch input set by the user.
  • the trigger pattern storage 513 may set a reference user input for moving the second housing.
  • the trigger pattern storage 513 may set at least one touch input received from the user as a reference user input.
  • the trigger pattern storage 513 may store data about at least one received touch input in the trigger pattern database 552.
  • the trigger pattern similarity determiner 514 may receive (or identify) at least one touch input after a reference user input is set. The trigger pattern similarity determiner 514 may identify whether at least one received touch input corresponds to a reference user input based on the trigger pattern database 552.
  • Figure 7 shows an example of capacitance values identified through a touch IC according to one embodiment.
  • the touch panel 710 may be composed of a plurality of layers.
  • the first layer of the touch panel 710 may include a drive electrode 711.
  • the second layer of the touch panel 710 may include a dielectric 712.
  • the third layer of the touch panel 710 may include a sense electrode 713.
  • the touch IC may identify capacitance values that change in a plurality of areas (eg, first area 714) where the driving electrode 711 and the sensing electrode 713 intersect.
  • the touch IC can identify the capacitance value between the driving electrode 711 and the sensing electrode 713.
  • the touch IC can identify the capacitance value between the driving electrode 711 and the sensing electrode 713 that changes based on the touch input.
• the touch IC may generate data 720 based on capacitance values identified in a plurality of areas. The capacitance values included in the data 720 may represent the amount of change in the capacitance values identified in each of the plurality of areas.
  • the touch IC may generate data 720 by removing (or noise filtering) noise associated with the capacitance values identified in each of the plurality of areas.
  • data 720 may be referred to as raw data.
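• The sketch below illustrates, under assumed array shapes and values, how such raw data could be produced as per-area capacitance change amounts with simple noise filtering; it is not the touch IC's actual implementation.

```python
# Minimal sketch (assumed shapes and values): producing "raw data" as per-area
# capacitance change amounts with a crude noise filter.
import numpy as np

def make_raw_data(measured: np.ndarray, baseline: np.ndarray,
                  noise_floor: float = 10.0) -> np.ndarray:
    """Return the change amount per drive/sense intersection, suppressing small noise."""
    delta = measured - baseline
    delta[np.abs(delta) < noise_floor] = 0.0
    return delta

baseline = np.full((8, 8), 100.0)   # untouched panel, hypothetical capacitances
measured = baseline.copy()
measured[3, 4] += 90.0              # a touch changes the capacitance in one area
print(make_raw_data(measured, baseline))
```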
  • the touch IC may transmit data 720 to the processor 510.
  • the processor 510 may receive data 720 from the touch IC.
  • the processor 510 may identify information about the user input (or pattern) based on the data 720.
  • the processor 510 may receive raw data 720 from the touch IC.
  • the processor 510 may identify the location where the touch input occurred based on the data 720.
  • the touch IC may identify the coordinate value where the touch input occurred based on the data 720.
  • the touch IC can identify the coordinate value where the touch input occurred by setting the data 720 as an input value of a predefined algorithm.
  • the touch IC may transmit information about the identified coordinate value to the processor 510.
• the processor 510 may identify the location where the touch input occurred based on the information about the identified coordinate value received from the touch IC.
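• One common way to turn such raw data into a coordinate value is a weighted centroid over the capacitance-change map; the sketch below shows that approach as an assumption, since the disclosure only refers to a predefined algorithm.

```python
# Minimal sketch (assumed algorithm): estimating the touch coordinate as the
# centroid of the capacitance-change map, weighted by the change in each area.
import numpy as np

def estimate_touch_coordinate(raw):
    """Return the (x, y) centroid of positive capacitance changes, or None if no touch."""
    weights = np.clip(raw, 0, None)
    total = weights.sum()
    if total == 0:
        return None
    rows, cols = np.indices(raw.shape)
    x = float((cols * weights).sum() / total)   # column index ~ x coordinate
    y = float((rows * weights).sum() / total)   # row index ~ y coordinate
    return x, y
```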
  • Figure 8 shows a flowchart regarding the operation of an electronic device according to an embodiment.
  • the processor 510 may receive a user input.
  • processor 510 may receive user input through display 520 (eg, flexible display).
  • user input may consist of at least one touch input.
  • the processor 510 may identify touch inputs identified from the time the user input starts to the time the user input is released as at least one touch input.
  • the processor 510 may identify touch inputs according to a predefined period from the time the user input starts to the time the user input is released.
  • the processor 510 may identify the identified touch inputs as at least one touch input.
  • the processor 510 may identify at least one touch input by identifying user input according to a predefined cycle.
  • the processor 510 may identify user input according to a predefined period using a touch IC.
  • the processor 510 may receive data about at least one touch input (eg, data 720 in FIG. 7) from the touch IC.
  • the processor 510 may identify first data about capacitance values obtained through the display 520.
  • the processor 510 may identify first data about capacitance values obtained through the display 520 according to user input.
  • the capacitance values obtained through the display 520 may include capacitance values obtained from each of a plurality of areas divided within the display 520.
  • the processor 510 may identify capacitance values obtained according to a predefined period as first data while the user input is maintained.
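• The sketch below shows, with a hypothetical touch-IC interface (read_frame and is_touch_down are assumed callables), how capacitance frames could be collected at a predefined period while the user input is maintained to form the first data.

```python
# Minimal sketch (hypothetical interface): polling capacitance frames at a
# predefined period while the touch is maintained; the collected frames form
# the first data for the user input.
import time

def collect_first_data(read_frame, is_touch_down, period_s: float = 0.01) -> list:
    """Collect one capacitance-change frame every period_s seconds during a touch."""
    frames = []
    while is_touch_down():          # assumed callable: True while the input is held
        frames.append(read_frame()) # assumed callable: returns one 2-D change map
        time.sleep(period_s)
    return frames
```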
  • the processor 510 may display a visual object representing a path drawn by the user input in response to a user input received through the display 520.
  • the processor 510 may also display a visual object representing a path drawn by user input. For example, a visual object representing a path drawn along at least one touch input constituting the user input may be displayed.
  • user input may be received within a predefined area within display 520.
  • the processor 510 may display a predefined area through the display 520.
• the processor 510 may display an element representing the predefined area overlaid on the screen.
  • the predefined area can be set in various ways. As an example, the predefined area may change based on the state of the electronic device 500. As another example, the predefined area may be changed based on the size of the display area of the display 520. Depending on the embodiment, the location or size of the predefined area may be changed based on the state (or size of the display area) of the electronic device 500.
  • processor 510 may identify whether the user input corresponds to a reference user input.
  • the processor 510 may identify first data about capacitance values obtained through the display 520.
• the processor 510 may identify whether the user input corresponds to the reference user input based on the first data. For example, the processor 510 may identify whether the user input corresponds to a reference user input by setting the first data as input data of a model (or artificial intelligence model) indicated by a plurality of parameters.
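• As a sketch only, the decision could look like the following, assuming a model object with a predict method that returns a correspondence score in [0, 1]; the actual model interface is not specified in the disclosure.

```python
# Minimal sketch (assumed model interface): deciding whether a user input
# corresponds to the reference user input from the model's output score.
def corresponds_to_reference(model, first_data, threshold: float = 0.8) -> bool:
    """Feed the first data to the model and compare its score with a threshold."""
    score = model.predict(first_data)   # assumed API returning a score in [0, 1]
    return score >= threshold
```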
• based on the first data, the processor 510 may identify at least one of information about the area at which the at least one touch input is performed, information about the speed at which the at least one touch input is performed, and information about the direction in which the at least one touch input is performed.
  • the reference user input may be set to be distinguished from a plurality of user inputs generated by use of the electronic device 500.
  • the reference user input may be set to be distinguished from the user input (eg, scroll input) for changing the screen of the electronic device 500.
  • the processor 510 may set the reference user input to be distinguished from a plurality of user inputs generated by use of the electronic device 500.
  • the processor 510 may identify whether a user input corresponding to a reference user input is received.
  • the processor 510 may identify a plurality of user inputs based on a touch pattern generated by use of the electronic device 500.
  • the processor 510 may identify a plurality of user inputs by learning touch patterns generated by use of the electronic device 500.
  • the processor 510 may identify the amount of change in capacitance values based on the touch pattern generated by use of the electronic device 500.
  • the processor 510 may identify a plurality of user inputs based on the amount of change in the identified capacitance values.
  • the processor 510 may identify whether the user input corresponds to a reference user input based on the first sensitivity.
  • Processor 510 may set a sensitivity to identify whether a user input corresponds to a reference user input. For example, the user input may not be the same as the reference user input.
  • Processor 510 may set sensitivity to compensate for differences between user input and reference user input. When the sensitivity is set low, the processor 510 may identify the user input as corresponding to the reference user input even if the similarity between the user input and the reference user input is small. When the sensitivity is set high, the processor 510 can identify the user input as corresponding to the reference user input only when the similarity between the user input and the reference user input is high.
• the processor 510 may identify a request to change the sensitivity for identifying whether a user input corresponds to a reference user input. Based on the identified request, the processor 510 may change the sensitivity for identifying whether the user input corresponds to the reference user input from a first sensitivity to a second sensitivity. Depending on the embodiment, the processor 510 may set the sensitivity differently depending on the type of application or on the various reference user inputs.
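• A hedged illustration of the sensitivity setting: the sketch below maps a sensitivity level to the similarity threshold used when comparing the user input with the reference user input; the level names and values are assumptions.

```python
# Minimal sketch (assumed levels and values): sensitivity mapped to the similarity
# threshold used when comparing a user input with the reference user input.
SENSITIVITY_TO_THRESHOLD = {
    "low": 0.6,   # loose: a small similarity already counts as corresponding
    "high": 0.9,  # strict: only a high similarity counts as corresponding
}

def threshold_for(sensitivity: str) -> float:
    return SENSITIVITY_TO_THRESHOLD[sensitivity]

# Changing from a first sensitivity to a second sensitivity swaps the threshold.
first_threshold = threshold_for("low")
second_threshold = threshold_for("high")
```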
  • the processor 510 may change the size of the display area when the user input corresponds to the reference user input. For example, the processor 510 may change the size of the display area of the display 520 using the driving device 540 based on identifying that the user input corresponds to a reference user input.
  • the processor 510 may change the size of the display area of the display 520 to the first size using the driving device 540, based on the user input corresponding to the reference user input. .
  • the processor 510 may move the second housing (e.g., the second housing 220) using the driving device 540, based on the user input corresponding to the reference user input. there is.
  • the processor 510 can change the size of the display area of the display 520 to the first size by moving the second housing.
• the processor 510 may perform various operations based on whether the user input corresponds to the reference user input. For example, the processor 510 may set the operation of the electronic device 500 according to a reference user input. The processor 510 may set a designated application to be executed according to a reference user input. The processor 510 may execute the designated application based on whether the user input corresponds to the reference user input.
  • the processor 510 may bypass (or refrain from) changing the size of the display area when the user input is different from the reference user input.
• the processor 510 may bypass (or refrain from) changing the size of the display area of the display 520 using the driving device 540, based on identifying that the user input is different from the reference user input.
  • the reference user input can be set in various ways. For example, based on a reference user input (or a first reference user input), the size of the display area may be set to the first size. Based on another reference user input (or a second reference user input), the size of the display area may be set to the second size.
• the processor 510 may identify whether the user input corresponds to another reference user input that is distinct from the reference user input. Based on identifying that the user input corresponds to the other reference user input, the processor 510 may use the driving device 540 to change the size of the display area of the display 520 to a second size that is distinct from the first size.
  • the processor 510 may bypass (or refrain from) changing the size of the display area using the driving device 540 based on identifying that the user input is different from other reference user inputs.
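• Purely as an illustration, different reference user inputs mapping to different target display-area sizes could be represented as a small lookup; the names and sizes below are hypothetical.

```python
# Minimal sketch (hypothetical names and sizes): each reference user input is
# mapped to a target size; an unmatched input bypasses the size change.
REFERENCE_ACTIONS = {
    "first_reference": 160,   # e.g., expand the display area (first size, in mm)
    "second_reference": 120,  # e.g., reduce the display area (second size, in mm)
}

def target_size_for(matched_reference):
    """Return the target display-area size, or None to bypass the change."""
    return REFERENCE_ACTIONS.get(matched_reference)
```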
  • Figure 9 shows an example of an operation of an electronic device according to an embodiment.
  • processor 510 may identify that a touch has occurred on display 520 at location 911 .
  • the processor 510 may identify that the amount of change in capacitance significantly increases in area 912.
  • the processor 510 may identify that a touch has occurred on the display 520 based on identifying that the amount of change in capacitance greatly increases in the area 912 among the plurality of areas.
• the amount of capacitance change in an area where no touch input occurs may maintain a value close to 0 (e.g., less than 10).
  • the amount of capacitance change in the area where the touch occurred may increase. The harder the user presses the screen, the more the capacitance change in the area where the touch occurred may increase.
  • the touch input generated in state 910 may be referred to as the first touch input.
  • the processor 510 may identify data about capacitance values obtained through the display 520 based on the first touch input.
  • processor 510 may identify that a touch on display 520 moves from location 911 to location 921 .
  • the processor 510 may identify that the area in which the amount of change in capacitance increases changes from area 912 to area 922.
• the processor 510 may identify that the touch on the display 520 moves from location 911 to location 921 based on identifying that the area where the amount of change in capacitance increases changes from area 912 to area 922.
  • the processor 510 may identify that the touch on the display 520 has moved from location 911 to location 921 while the touch on the display 520 is maintained.
  • a touch input generated in state 920 may be referred to as a second touch input.
  • the processor 510 may identify data on capacitance values obtained through the display 520 based on the second touch input.
• the processor 510 may identify that a touch on the display 520 moves from location 921 to location 931. For example, the processor 510 may identify that the area in which the amount of change in capacitance increases changes from area 922 to area 932. The processor 510 may identify that the touch on the display 520 moves from location 921 to location 931 based on identifying that the area where the amount of change in capacitance increases changes from area 922 to area 932. The processor 510 may identify that the touch on the display 520 has moved from the location 921 to the location 931 while the touch on the display 520 is maintained. The processor 510 may identify that the touch on the display 520 is released at location 931.
  • the touch input generated in state 930 may be referred to as the third touch input.
  • the processor 510 may identify data on capacitance values obtained through the display 520 based on the third touch input.
  • processor 510 may identify at least one touch input 941.
  • the processor 510 may identify at least one touch input 941 as a user input.
• at least one touch input 941 may include the first to third touch inputs.
• the first to third touch inputs are exemplary, and the processor 510 may identify at least one touch input according to a predefined period while the user input is maintained from state 910 to state 930.
  • the processor 510 may identify first data 942 about capacitance values for a plurality of areas according to a predefined period.
• the processor 510 may identify the operation of the electronic device 500 corresponding to the user input by setting the first data 942 as an input value of a model (e.g., a first model) indicated by a plurality of parameters.
  • the processor 510 may perform the operation of the electronic device 500 based on user input. As an example, the processor 510 may change the size of the display area based on identifying that the user input corresponds to a reference user input.
• the processor 510 may use the data 942 to learn a model (e.g., a second model) indicated by a plurality of parameters.
• the processor 510 may identify a plurality of user inputs generated by use of the electronic device 500.
• the processor 510 may learn a model (e.g., a second model) indicated by a plurality of parameters based on a plurality of third touch inputs corresponding to the plurality of user inputs. By learning the model, the processor 510 can identify information about the individual touch patterns of the user of the electronic device 500.
  • Figure 10 shows an example of an operation of an electronic device according to an embodiment.
• the processor 510 may identify whether a user input corresponds to a reference user input based on the first data. For example, the processor 510 may identify first data about capacitance values obtained through the display 520 according to the user input. As an example, the processor 510 may identify the amount of change in capacitance values for a plurality of areas based on the user input.
  • the processor 510 may set the first data as an input value of the model 1010.
  • Model 1010 may be indicated by a plurality of parameters.
  • Model 1010 may be used to identify information about whether a user input corresponds to a reference user input.
• the model 1010 may be composed of multiple neural network layers. Each of the plurality of neural network layers has a plurality of weight values, and a neural network computation can be performed by combining the computation result of the previous layer with the plurality of weights.
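• The sketch below is a generic forward pass through such stacked layers, combining the previous layer's result with each layer's weights; the layer sizes and activation are illustrative assumptions, not the model 1010 itself.

```python
# Minimal sketch (illustrative sizes): a forward pass where each layer combines
# the previous layer's result with its own weights and bias.
import numpy as np

def forward(x: np.ndarray, layers) -> np.ndarray:
    """Apply each (weights, bias) pair to the previous layer's output."""
    for weights, bias in layers:
        x = np.maximum(weights @ x + bias, 0.0)  # linear combination + ReLU
    return x

rng = np.random.default_rng(0)
layers = [(rng.normal(size=(16, 64)), np.zeros(16)),
          (rng.normal(size=(1, 16)), np.zeros(1))]
score = forward(rng.normal(size=64), layers)     # e.g., a correspondence score
```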
  • the processor 510 may process the first data and then set the processed data as an input value of the model 1010. For example, the processor 510 may identify information about the touch location, information about the touch area, information about the touch direction, and/or information about the touch speed, based on the first data. The processor 510 may set at least some of the identified information as input values of the model 1010.
  • the processor 510 may identify information about whether the user input corresponds to a reference user input based on the result value output through the model 1010.
• the processor 510 may identify (or determine) whether to perform an operation related to the reference user input by identifying information about whether the user input corresponds to the reference user input. For example, the processor 510 may identify whether to change the size of the display area by identifying information about whether the user input corresponds to a reference user input.
  • 11A and 11B show examples of operations of an electronic device according to an embodiment.
  • the processor 510 may display an area 1112 for receiving a user input for changing the size of the display area within the display area 1111 of the display 520.
  • Area 1112 may be predefined.
  • the area 1112 may be set (or changed) based on the size of the display area 1111.
  • the area 1112 may be set by the user of the electronic device 500.
  • the processor 510 may receive a user input to change the size of the display area 1111.
  • the processor 510 may display a visual object 1113 representing a path drawn by user input.
  • the user of the electronic device 500 can check the path on which at least one touch input corresponding to the user input was received through the visual object 1113 displayed through the display 520.
  • the processor 510 may identify whether the user input corresponds to the first reference user input. According to the first reference user input, the size of the display area 1111 of the electronic device 500 may be changed. For example, the size of the display area 1111 of the electronic device 500 according to the first reference user input may be set to the first size. The processor 510 may change the state of the electronic device 500 from state 1110 to state 1120 based on the user input corresponding to the first reference user input. The processor 510 can change the size of the display area 1111 of the display 520 by changing the state of the electronic device 500 from state 1110 to state 1120.
• the processor 510 may change the state of the electronic device 500 by moving the second housing (e.g., the second housing 220) in the direction 1122.
• the processor 510 may change the size of the display area 1111 of the display 520 by moving the second housing (e.g., the second housing 220) in the direction 1122.
  • the processor 510 may increase the size of the display area 1111 by moving the second housing in the direction 1122.
  • the processor 510 may display an area 1132 for receiving a user input for changing the size of the display area within the display area 1131 of the display 520.
  • Area 1132 may be predefined.
  • the area 1132 may be set (or changed) based on the size of the display area 1131.
  • the area 1132 may be set by the user of the electronic device 500.
  • the area 1132 may be located below the display area 1131.
  • the location or size of the area 1132 is exemplary and may be set to various locations or sizes.
  • the processor 510 may receive a user input to change the size of the display area 1131.
  • the processor 510 may display a visual object 1133 representing a path drawn by user input.
  • the user of the electronic device 500 can check the path on which at least one touch input corresponding to the user input was received through the visual object 1133 displayed through the display 520.
  • the processor 510 may identify whether a user input corresponds to a reference user input. Depending on the reference user input, the size of the display area 1131 of the electronic device 500 may be changed. For example, the size of the display area 1131 of the electronic device 500 according to the reference user input may be set to the second size.
  • the processor 510 may change the state of the electronic device 500 from state 1130 to state 1140 based on the user input corresponding to the second reference user input.
  • the processor 510 can change the size of the display area 1131 of the display 520 by changing the state of the electronic device 500 from state 1130 to state 1140.
• the processor 510 may change the state of the electronic device 500 by moving the second housing (e.g., the second housing 220) in the direction 1142.
• the processor 510 may change the size of the display area 1131 of the display 520 by moving the second housing (e.g., the second housing 220) in the direction 1142.
• the processor 510 may reduce the size of the display area 1131 by moving the second housing in the direction 1142.
  • the first reference user input of FIG. 11A and the second reference user input of FIG. 11B may be set differently.
  • the first reference user input may be set to change the size of the display areas 1111 and 1131 to the first size.
• the second reference user input may be set to change the size of the display areas 1111 and 1131 to the second size.
  • the first reference user input may be set to expand the display area of the display 520.
• the second reference user input may be set to reduce the display area of the display 520.
  • Figure 12 shows a flowchart regarding the operation of an electronic device according to an embodiment.
• the processor 510 may display a screen for setting a reference user input.
  • the processor 510 may display a screen for setting a reference user input for changing the state of the electronic device 500 through the display 520.
  • the processor 510 may set a reference user input.
  • the reference user input may be set to change the state of the electronic device 500. For example, based on identifying that a user input corresponding to a reference user input has been received, the processor 510 may change the state of the electronic device 500. As an example, based on identifying that a user input corresponding to a reference user input is received, the processor 510 may change the size of the display area of the display 520.
  • the processor 510 may receive at least one first touch input.
  • the processor 510 may receive at least one first touch input through a touch input field in the screen.
  • the processor 510 may display a touch input field on the screen.
  • the location or size of the touch input field may be changeable within the screen.
  • the processor 510 may change the location or size of the touch input field within the screen based on the user's request.
  • a touch input field may be an area where a reference user input is received.
  • the processor 510 may receive at least one first touch input within an area corresponding to the touch input field.
  • the processor 510 may identify whether at least one first touch input corresponds to a plurality of user inputs.
  • a plurality of user inputs may be received for use of the electronic device 500.
  • multiple user inputs can be distinguished from a reference user input.
  • the plurality of user inputs may include a scroll input for screen switching, a tap input for selecting an object, and a long touch input for changing the state of an object.
  • a plurality of user inputs may be set based on an application or the operating system of the electronic device 500.
  • the processor 510 may store information about a plurality of user inputs in the touch model database 551 within the memory 550.
  • the processor 510 may identify a plurality of touch inputs (or a plurality of third touch inputs) corresponding to a plurality of user inputs.
  • the processor 510 may learn a model (or a second model) indicated by a plurality of parameters based on a plurality of touch inputs.
• the processor 510 may further display, along with the touch input field, a visual object for guiding at least one first touch input that is distinguished from the plurality of user inputs.
  • the processor 510 may set the at least one first touch input as the reference user input. For example, the processor 510 may set at least one first touch input as a reference user input based on identifying that the at least one first touch input is different from a plurality of user inputs.
  • the processor 510 may set the reference user input for changing the state of the electronic device to at least one first touch input. For example, the processor 510 may change the state of the electronic device based on receiving at least one second touch input corresponding to the reference user input.
• the processor 510 may display information for guiding the user to set at least one other touch input, distinct from the at least one first touch input, as the reference user input. For example, based on identifying that the at least one first touch input corresponds to at least some of the plurality of user inputs, the processor 510 may display information for guiding the user to set at least one other touch input, distinct from the at least one first touch input, as the reference user input.
• if the at least one first touch input corresponding to one of the plurality of user inputs were set as the reference user input, the processor 510 might identify that a reference user input has been received even when the user did not intend it. Accordingly, based on identifying that the at least one first touch input corresponds to the plurality of user inputs, the processor 510 may display information for guiding the user to set at least one other touch input, distinct from the at least one first touch input, as the reference user input.
  • the processor 510 may identify whether at least one first touch input is distinguished from a plurality of user inputs (or existing touch operations).
  • the processor 510 may display a screen to request re-input based on the fact that at least one first touch input corresponds to at least one of a plurality of user inputs.
• the processor 510 may set the at least one first touch input as the reference user input based on identifying that the at least one first touch input is distinguished from the plurality of user inputs, and may store it in the trigger pattern database 552 within the memory 550.
  • Figure 13 shows a flowchart regarding the operation of an electronic device according to an embodiment.
  • operations 1310 to 1330 may be performed after operation 1240 of FIG. 12 is performed.
  • the processor 510 may receive at least one second touch input.
  • the processor 510 may receive at least one second touch input after at least one first touch input is set as a reference user input.
  • the processor 510 may identify that at least one second touch input occurs within an area for receiving a user input for changing the state of the electronic device 500.
  • the processor 510 may receive at least one second touch input generated within the area.
  • the processor 510 may identify that at least one second touch input corresponds to a reference user input.
  • the processor 510 may identify whether at least one second touch input corresponds to a reference user input.
  • the processor 510 may identify the similarity between at least one first touch input and at least one second touch input.
  • the processor 510 may identify a reference value based on a predefined sensitivity.
  • the processor 510 may identify that the similarity between at least one first touch input and at least one second touch input is greater than or equal to the reference value.
  • the processor 510 may identify that the at least one second touch input corresponds to the reference user input based on identifying that the similarity of the at least one first touch input and the at least one second touch input is greater than or equal to the reference value. You can.
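• One possible similarity measure (an assumption, since the disclosure does not fix one) is sketched below: both touch paths are resampled to the same length and the averaged point-wise distance is turned into a similarity that can be compared with the reference value.

```python
# Minimal sketch (one possible measure): similarity between the stored reference
# path and a newly received path, compared against a reference value.
import numpy as np

def resample(path: np.ndarray, n: int = 32) -> np.ndarray:
    """Resample an (m, 2) path to n points spaced evenly along its index."""
    idx = np.linspace(0, len(path) - 1, n)
    base = np.arange(len(path))
    return np.stack([np.interp(idx, base, path[:, 0]),
                     np.interp(idx, base, path[:, 1])], axis=1)

def path_similarity(reference: np.ndarray, candidate: np.ndarray) -> float:
    """Return a similarity in [0, 1]; 1.0 means identical paths."""
    a, b = resample(reference), resample(candidate)
    scale = max(np.ptp(a), np.ptp(b), 1e-6)            # normalize by the path extent
    mean_dist = np.linalg.norm(a - b, axis=1).mean() / scale
    return float(max(0.0, 1.0 - mean_dist))

reference_path = np.array([[0, 0], [1, 1], [2, 2]], dtype=float)
candidate_path = np.array([[0, 0], [1, 1.1], [2, 2.1]], dtype=float)
corresponds = path_similarity(reference_path, candidate_path) >= 0.8  # reference value
```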
• the processor 510 may change the state of the electronic device 500. For example, the processor 510 may change the state of the electronic device 500 in response to identifying that at least one second touch input corresponds to the reference user input. In one example, in response to identifying that the at least one second touch input corresponds to the reference user input, the processor 510 may move the second housing of the electronic device 500 to increase the size of the display area of the display 520.
  • Figure 14 shows an example of an operation of an electronic device according to an embodiment.
  • the processor 510 may identify (or receive) a plurality of third touch inputs. For example, the processor 510 may identify a plurality of third touch inputs corresponding to a plurality of user inputs. For example, a plurality of user inputs may be distinguished from a reference user input for changing the state of the electronic device 500. For example, the processor 510 may identify a plurality of third touch inputs generated by use of the electronic device 500.
  • the processor 510 may store information about a plurality of third touch inputs in the touch model database 551 in the memory 550.
  • the processor 510 may learn the model 1410 indicated by a plurality of parameters based on information about the plurality of third touch inputs stored in the touch model database 551.
  • the processor 510 may collect information about previously defined touch inputs such as a long touch input, a drag input, or a scroll input (or information about a plurality of third touch inputs). The processor 510 may learn the model 1410 based on information about previously defined touch inputs.
  • the processor 510 may identify that a scroll input occurs based on capacitance values obtained in a plurality of areas.
  • the processor 510 may identify that a scroll input is generated based on identifying that the movement path of the touch input is short and the touch area becomes narrower toward the end.
  • the processor 510 may identify information about characteristics of the user's individual scroll input of the electronic device 500 (for example, a movement path or touch area of the touch input).
  • the processor 510 may learn the model 1410 based on the identified information.
• the processor 510 may identify that a long touch input occurs based on capacitance values obtained in a plurality of areas.
  • the processor 510 may identify that a long touch input has occurred based on identifying that the touch is maintained for a predefined period of time.
  • the processor 510 may identify information about characteristics of the user's individual long touch input of the electronic device 500 (for example, the maximum amount of change in capacitance value according to the touch input).
  • the processor 510 may train the model 1410 based on the identified information.
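• As a hedged illustration of this kind of learning, the sketch below fits a tiny centroid-based classifier on per-gesture features such as path length, duration, and peak capacitance change; the features, labels, and classifier are assumptions, not the model 1410.

```python
# Minimal sketch (assumed features/labels): learning a user's individual touch
# characteristics with a tiny centroid-based classifier.
import numpy as np

class CentroidClassifier:
    """Stand-in for the learned model: one centroid per touch-input class."""
    def fit(self, features: np.ndarray, labels: np.ndarray):
        self.classes_ = np.unique(labels)
        self.centroids_ = np.stack([features[labels == c].mean(axis=0)
                                    for c in self.classes_])
        return self

    def predict(self, features: np.ndarray) -> np.ndarray:
        dists = np.linalg.norm(features[:, None, :] - self.centroids_[None], axis=2)
        return self.classes_[dists.argmin(axis=1)]

# Features: [path length, duration (s), peak capacitance change]; labels: input type.
X = np.array([[120.0, 0.15, 40.0],   # scroll: longer path, short duration
              [  5.0, 1.20, 90.0]])  # long touch: hardly moves, long duration
y = np.array(["scroll", "long_touch"])
model = CentroidClassifier().fit(X, y)
print(model.predict(np.array([[100.0, 0.2, 45.0]])))  # -> ['scroll']
```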
  • 15A and 15B show examples of operations of an electronic device according to an embodiment.
• the processor 510 may display a screen 1501 for setting a reference user input for changing the state of the electronic device 500 using the display 520.
  • the processor 510 may identify information about a plurality of user inputs stored in the touch model database 551.
  • the processor 510 may use the display 520 to display a screen 1501 for setting a reference user input that is distinguished from a plurality of user inputs.
  • the processor 510 may display a visual object 1502 for guiding at least one first touch input that is differentiated from a plurality of user inputs.
  • the processor 510 may use a model indicated by a plurality of parameters (eg, model 1410 in FIG. 14) to identify a user input that is distinct from a plurality of user inputs.
  • the processor 510 may display a visual object 1502 for guiding at least one touch input corresponding to the identified user input.
  • the processor 510 may receive at least one first touch input along the visual object 1502.
  • the processor 510 may set at least one received first touch input as a reference user input.
• the processor 510 may display a screen 1503 for setting a reference user input for changing the state of the electronic device 500 using the display 520.
  • the processor 510 may display a screen 1503 for setting a reference user input that is differentiated from a plurality of user inputs using the display 520.
  • the processor 510 may display a visual object 1504 to guide at least one first touch input that is differentiated from a plurality of user inputs.
  • the processor 510 may use a model indicated by a plurality of parameters (eg, model 1410 in FIG. 14) to identify a user input that is distinct from a plurality of user inputs.
  • the processor 510 may display a visual object 1504 for guiding at least one touch input corresponding to the identified user input.
  • the processor 510 may receive at least one first touch input along the visual object 1504.
  • the processor 510 may set at least one received first touch input as a reference user input.
  • the processor 510 may display a list of recommended user inputs that are distinct from a plurality of user inputs.
  • the processor 510 may identify an input that selects one of the recommended user inputs displayed in the list, thereby setting the selected user input as a reference user input.
  • the processor 510 may set a plurality of reference user inputs.
  • the processor 510 may set a first reference user input for changing the state of the electronic device 500 from the first state to the second state.
  • the processor 510 may display a screen for setting the first reference user input while the display area of the display 520 is expanded.
  • the screen for setting the first reference user input may be set to screen 1501 in FIG. 15A.
  • the processor 510 may set a second reference user input for changing the state of the electronic device 500 from the second state to the first state.
  • the processor 510 may display a screen for setting the second reference user input while the display area of the display 520 is reduced.
  • the screen for setting the second reference user input may be set to screen 1503 in FIG. 15B.
  • 16A and 16B show examples of operations of an electronic device according to an embodiment.
  • the processor 510 may display touch input fields 1612 and 1622 within the screens 1611 and 1621 for setting a reference user input based on the state of the electronic device 500.
  • the processor 510 may display a touch input field 1612 in the screen 1611 for setting a reference user input.
  • the processor 510 may display a touch input field 1622 in the screen 1621 for setting a reference user input.
  • the processor 510 may set the location or size of the touch input fields 1612 and 1622 differently depending on the states 1610 and 1620 of the electronic device 500.
  • the processor 510 may set the position or size of the touch input fields 1612 and 1622 to be changeable.
  • the processor 510 may receive an input to change the position or size of the touch input fields 1612 and 1622.
  • the processor 510 may change the position or size of the touch input fields 1612 and 1622 based on the input.
  • the processor 510 may receive at least one first touch input to be set as a reference user input from the user in the touch input fields 1612 and 1622.
  • the processor 510 may set at least one first touch input as a reference user input based on identifying that the at least one first touch input is different from a plurality of user inputs.
  • the processor 510 may display information for guiding setting at least one other touch input as a reference user input based on identifying that at least one first touch input corresponds to at least some of the plurality of user inputs. You can.
  • the processor 510 may receive at least one second touch input after at least one first touch input is set as the reference user input. Processor 510 may change the state of the electronic device in response to identifying that at least one second touch input corresponds to the reference user input.
  • the processor 510 may set an area for receiving at least one second touch input differently depending on the state of the electronic device 500.
• when the state of the electronic device 500 is state 1610, the processor 510 may set the area corresponding to the touch input field 1612 set in the screen 1611 for setting the reference user input as an area to receive the at least one second touch input. For example, when the display area of the display 520 is reduced, the processor 510 may set the area corresponding to the touch input field 1612 set in the screen 1611 for setting the reference user input as an area to receive the at least one second touch input.
• when the state of the electronic device 500 is state 1620, the processor 510 may set the area corresponding to the touch input field 1622 set in the screen 1621 for setting the reference user input as an area to receive the at least one second touch input. For example, when the display area of the display 520 is expanded, the processor 510 may set the area corresponding to the touch input field 1622 set in the screen 1621 for setting the reference user input as an area to receive the at least one second touch input.
  • the processor 510 may individually set a reference user input for changing the state of the electronic device 500.
  • the processor 510 can set a reference user input that does not conflict with the plurality of user inputs by learning a model for a plurality of user inputs that are existing touch inputs.
  • an example in which the display area of the display 520 is changed based on a reference user input has been described, but the present invention is not limited thereto.
  • the processor 510 may perform various operations, such as displaying a multi-window, executing a predefined application, or switching screens, based on receiving at least one touch input corresponding to a reference user input.
• an electronic device (e.g., the electronic device 500) according to an embodiment may include a first housing, a second housing slidably coupled to the first housing, a flexible display disposed on a surface formed by the first housing and the second housing and insertable into or extractable from the first housing, a driving device for driving the second housing to slide in or out, a memory, and at least one processor operatively connected to the flexible display, the driving device, and the memory.
  • the at least one processor may be set to identify user input through the flexible display.
  • the at least one processor may be configured to identify whether the user input corresponds to a reference user input.
  • the at least one processor may be set to change the size of the display area of the flexible display using the driving device based on identifying that the user input corresponds to the reference user input.
  • the at least one processor may be configured to bypass (or refrain from) changing the size of the display area using the driving device based on identifying that the user input is distinct from the reference user input.
  • the at least one processor may be set to identify first data about capacitance values obtained through the flexible display according to the user input.
  • the at least one processor may be configured to identify whether the user input corresponds to the reference user input based on the first data.
  • the user input may include at least one touch input.
• the at least one processor may be set to identify, based on the first data, at least one of information about an area where the at least one touch input is performed, information about the speed at which the at least one touch input is performed, and information about the direction in which the at least one touch input is performed.
  • the at least one processor may be set to identify touch inputs identified from the time the user input starts to the time the user input is released as the at least one touch input.
  • the at least one processor may be set to identify capacitance values obtained according to a predefined period as the first data while the user input is maintained.
  • the capacitance values obtained through the flexible display may include capacitance values obtained from each of a plurality of areas included in the flexible display.
• the at least one processor may be set to identify whether the user input corresponds to the reference user input by setting the first data as input data of a model indicated by a plurality of parameters.
  • the at least one processor may be set to display a visual object representing a path drawn by the user input in response to a user input received through the flexible display.
  • the at least one processor may be configured to receive a request to change the sensitivity for identifying whether the user input corresponds to the reference user input.
  • the at least one processor may be further configured to change a sensitivity for identifying whether the user input corresponds to the reference user input from a first sensitivity to a second sensitivity based on the request.
• the at least one processor may be set to change the size of the display area of the flexible display to a first size using the driving device, based on identifying that the user input corresponds to the reference user input.
• the at least one processor may be set to identify whether the user input corresponds to another reference user input that is distinct from the reference user input, based on identifying that the user input is different from the reference user input. The at least one processor may be set to change, based on identifying that the user input corresponds to the other reference user input, the size of the display area of the flexible display to a second size that is distinct from the first size using the driving device.
  • the at least one processor may be configured to bypass changing the size of the display area using the driving device based on identifying that the user input is different from the other reference user input.
  • the user input may be received within a predefined area within the flexible display.
  • the predefined area may be changed based on the size of the display area.
  • the driving device may include a motor, a rack gear coupled to the second housing, and a pinion gear coupled to the motor through a shaft.
• an electronic device (e.g., the electronic device 500) according to an embodiment may include a first housing, a second housing slidably coupled to the first housing, a flexible display disposed on a surface formed by the first housing and the second housing and insertable into or extractable from the first housing, a memory, and at least one processor operatively connected to the flexible display and the memory.
  • the at least one processor may be set to output a screen for setting a reference user input for changing the state of the electronic device using the flexible display.
  • the at least one processor may be set to receive at least one first touch input through a touch input field in the screen.
  • the at least one processor may be configured to identify whether the at least one first touch input corresponds to a plurality of user inputs.
• the at least one processor may be configured to set the at least one first touch input as the reference user input based on identifying that the at least one first touch input is different from a plurality of user inputs.
• the at least one processor may be set to identify, in response to receiving at least one second touch input after the at least one first touch input is set as the reference user input, whether the at least one second touch input corresponds to the reference user input.
  • the at least one processor may be configured to switch the state of the electronic device in response to identifying that the at least one second touch input corresponds to the reference user input.
• the at least one processor may be set to output, on the screen for setting the reference user input, a visual object for guiding the at least one first touch input, which is different from the plurality of user inputs.
• the at least one processor may be set to output, based on identifying that the at least one first touch input corresponds to at least some of the plurality of user inputs, information for guiding the user to set at least one other touch input, distinct from the at least one first touch input, as the reference user input.
  • the electronic device may include a driving device for changing the size of the flexible display.
• the at least one processor may be set to switch, through the driving device, the state of the electronic device from a first state in which the size of the display is reduced to a second state in which the display is expanded, in response to identifying that the at least one second touch input corresponds to the reference user input.
  • the position or size of the touch input field may be set to be changeable within the screen.
  • the at least one processor may be configured to identify a plurality of third touch inputs corresponding to the plurality of user inputs.
  • the at least one processor may be further configured to learn a model indicated by a plurality of parameters included in the memory, based on the plurality of third touch inputs.
  • a method of an electronic device may include receiving a user input through a flexible display of the electronic device.
  • the method may include identifying whether the user input corresponds to a reference user input.
  • the method may include changing the size of the display area of the flexible display using a driving device of the electronic device, based on identifying that the user input corresponds to the reference user input.
  • the method may include bypassing changing the size of the display area using the driving device based on identifying that the user input is different from the reference user input.
• the operation of identifying whether the user input corresponds to the reference user input may include identifying first data about capacitance values obtained through the flexible display according to the user input, and identifying, based on the first data, whether the user input corresponds to the reference user input.
  • the user input may consist of at least one touch input.
• the method may include an operation of identifying, based on the first data, at least one of information about an area where the at least one touch input is performed, information about the speed at which the at least one touch input is performed, and information about the direction in which the at least one touch input is performed.
  • the method may further include identifying touch inputs identified from the time the user input starts to the time the user input is released as the at least one touch input.
  • the operation of identifying the first data may include identifying capacitance values obtained according to a predefined period as the first data while the user input is maintained.
  • the capacitance values obtained through the flexible display may include capacitance values obtained from each of a plurality of areas included in the flexible display.
• the operation of identifying whether the user input corresponds to the reference user input may include an operation of identifying whether the user input corresponds to the reference user input by setting the first data as input data of a model indicated by a plurality of parameters.
  • the method may include displaying a visual object representing a path drawn by the user input in response to a user input received through the flexible display.
  • the method may include identifying a request to change sensitivity to identify whether the user input corresponds to the reference user input.
  • the method may include changing a sensitivity for identifying whether the user input corresponds to the reference user input from a first sensitivity to a second sensitivity based on the request.
• the operation of changing the size of the display area of the flexible display may include an operation of changing the size of the display area of the flexible display to a first size using the driving device, based on identifying that the user input corresponds to the reference user input.
• the method may include an operation of identifying whether the user input corresponds to another reference user input that is distinct from the reference user input, based on identifying that the user input is different from the reference user input.
• the method may include changing the size of the display area of the flexible display to a second size distinct from the first size using the driving device, based on identifying that the user input corresponds to the other reference user input.
  • the method may include bypassing changing the size of the display area using the driving device based on identifying that the user input is different from the other reference user input.
  • the user input may be received within a predefined area within the flexible display.
  • the predefined area may be changed based on the size of the display area.
  • the driving device may include a motor, a rack gear coupled to the second housing, and a pinion gear coupled to the motor through a shaft.
  • the at least one processor may be set to identify a touch pattern based on the user input.
  • the at least one processor may be configured to identify that the user input corresponds to the reference user input based on identifying that the touch pattern corresponds to a trigger pattern.
  • the at least one processor may be configured to change the size of the display area based on identifying that the user input corresponds to the reference user input.
  • the at least one processor may be set to identify a touch pattern based on another user input that is distinct from the user input.
  • the at least one processor may be set to match the touch pattern to one of at least one operation for changing the size of the display area.
  • the at least one processor may be set to store the touch pattern and an operation matched to the touch pattern in the memory.
• the electronic device according to an embodiment may include a first housing, a second housing slidably coupled to the first housing, a flexible display insertable into or extractable from the first housing, a driving device for driving the second housing to slide in or out, a memory, and at least one processor operatively connected to the flexible display, the driving device, and the memory.
  • the at least one processor may be set to identify a touch pattern based on user input.
  • the at least one processor may be set to match the touch pattern to one of at least one operation for changing the size of the display area.
  • the at least one processor may be set to store the touch pattern and an operation matched to the touch pattern in the memory.
  • a method of an electronic device may include displaying a screen for setting a reference user input for changing the state of the electronic device using a flexible display of the electronic device.
  • the method may include receiving at least one first touch input through a touch input field in the screen.
  • the method may include identifying whether the at least one first touch input corresponds to a plurality of user inputs.
  • the method may include setting the at least one first touch input as the reference user input based on identifying that the at least one first touch input is different from a plurality of user inputs.
  • the method may include, after the at least one first touch input is set as the reference user input, identifying, in response to receiving at least one second touch input, whether the at least one second touch input corresponds to the reference user input.
  • the method may include changing a state of the electronic device in response to identifying that the at least one second touch input corresponds to the reference user input.
  • the method may include displaying, on the screen for setting the reference user input, a visual object for guiding the at least one first touch input, which is distinct from the plurality of user inputs.
  • the method may include displaying information for guiding that at least one other touch input, distinct from the at least one first touch input, be set as the reference user input.
  • the operation of changing the state of the electronic device may include changing, through a driving device of the electronic device, the state of the electronic device from a first state in which the size of the display is reduced to a second state in which the display is expanded, in response to identifying that the at least one second touch input corresponds to the reference user input.
  • the position or size of the touch input field may be set to be changeable within the screen.
  • the method may include identifying a plurality of third touch inputs corresponding to the plurality of user inputs.
  • the method may include training a model, indicated by a plurality of parameters stored in the memory, based on the plurality of third touch inputs (see the third sketch following this list).
  • Electronic devices may be of various types.
  • Electronic devices may include, for example, portable communication devices (e.g., smartphones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliances.
  • Electronic devices according to embodiments of this document are not limited to the above-described devices.
  • terms such as "first" or "second" may be used simply to distinguish one component from another, and do not limit the components in other respects (e.g., importance or order).
  • if one (e.g., a first) component is referred to as "coupled" or "connected" to another (e.g., a second) component, with or without the terms "functionally" or "communicatively", it means that the component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • the term "module" as used in this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit. A module may be an integrally formed part, or a minimum unit of the part or a portion thereof, that performs one or more functions. For example, according to one embodiment, the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments of this document may be implemented as software (e.g., the program 140) including one or more instructions stored in a storage medium (e.g., the built-in memory 136 or the external memory 138) that can be read by a machine (e.g., the electronic device 101). For example, a processor (e.g., the processor 120) of the machine may invoke at least one of the one or more stored instructions from the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code that can be executed by an interpreter.
  • a storage medium that can be read by a device may be provided in the form of a non-transitory storage medium.
  • here, "non-transitory" only means that the storage medium is a tangible device and does not include signals (e.g., electromagnetic waves); the term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where it is temporarily stored.
  • Computer program products are commodities and can be traded between sellers and buyers.
  • a computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) directly or online through an application store (e.g., Play Store) or between two user devices (e.g., smartphones).
  • at least a portion of the computer program product may be at least temporarily stored or temporarily created in a machine-readable storage medium, such as the memory of a manufacturer's server, an application store's server, or a relay server.
  • each of the above-described components (e.g., a module or a program) may include a single entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more of the components or operations described above may be omitted, or one or more other components or operations may be added.
  • a plurality of components (e.g., modules or programs) may be integrated into a single component. In such a case, the integrated component may perform one or more functions of each of the plurality of components in the same or similar manner as performed by the corresponding component of the plurality of components prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, iteratively, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
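
To make the decision flow described in the list above concrete, the following is a minimal, self-contained Kotlin sketch of how a controller might route a received touch pattern: a pattern matching the reference user input changes the display area to the first size, a pattern matching the other reference user input changes it to the second size, and anything else bypasses the change. All class, method, and pattern names are hypothetical illustrations, not taken from the publication; a real device would compare touch coordinates and timing against a stored trigger pattern rather than plain strings.

    // Hypothetical sketch of the reference-input decision flow; names are illustrative only.
    enum class DisplayAreaSize { FIRST, SECOND }

    // Stand-in for the driving device (motor, rack gear, pinion gear) that slides the second housing.
    class DrivingDevice {
        fun setDisplayAreaSize(size: DisplayAreaSize) {
            println("Driving device: sliding the second housing to the $size size")
        }
    }

    class InputController(
        private val drivingDevice: DrivingDevice,
        private val referencePattern: String,        // a registered trigger pattern
        private val otherReferencePattern: String    // a second, distinct trigger pattern
    ) {
        fun onUserInput(touchPattern: String) = when (touchPattern) {
            referencePattern -> drivingDevice.setDisplayAreaSize(DisplayAreaSize.FIRST)
            otherReferencePattern -> drivingDevice.setDisplayAreaSize(DisplayAreaSize.SECOND)
            else -> Unit // bypass: the display area is left unchanged
        }
    }

    fun main() {
        val controller = InputController(DrivingDevice(), "double-tap", "two-finger-swipe-up")
        controller.onUserInput("double-tap")  // corresponds to the reference input -> first size
        controller.onUserInput("single-tap")  // corresponds to neither -> change is bypassed
    }

Calling onUserInput with an unrecognized pattern simply returns without commanding the driving device, which mirrors the bypass behaviour described in the list.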
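
The items on matching an identified touch pattern to one of the operations for changing the display area, and storing the pattern/operation pair in the memory, can be sketched as a simple registry. The map standing in for the device memory and the operation names below are assumptions made for illustration.

    // Hypothetical sketch of registering a user-defined touch pattern against an operation.
    enum class DisplayOperation { EXPAND_DISPLAY_AREA, REDUCE_DISPLAY_AREA }

    class PatternRegistry {
        // Stand-in for persisting the pattern and its matched operation in the memory.
        private val storedMappings = mutableMapOf<String, DisplayOperation>()

        // Matches the touch pattern to one of the available operations and stores the pair.
        fun register(touchPattern: String, operation: DisplayOperation) {
            storedMappings[touchPattern] = operation
        }

        // Looks up the operation matched to a later occurrence of the same pattern, if any.
        fun operationFor(touchPattern: String): DisplayOperation? = storedMappings[touchPattern]
    }

    fun main() {
        val registry = PatternRegistry()
        registry.register("long-press-then-swipe-up", DisplayOperation.EXPAND_DISPLAY_AREA)
        println(registry.operationFor("long-press-then-swipe-up")) // EXPAND_DISPLAY_AREA
        println(registry.operationFor("triple-tap"))               // null -> no matched operation
    }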
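
Finally, the set-up flow for the reference user input (receiving a first touch input, rejecting it when it corresponds to the plurality of ordinary user inputs, guiding the user toward a different input, and refining the model with further third touch inputs) might look roughly like the sketch below. The set-based lookup is a deliberate simplification standing in for the model indicated by the plurality of parameters stored in the memory.

    // Hypothetical sketch of setting a reference user input that is distinct from ordinary inputs.
    class ReferenceInputSetup(private val ordinaryInputs: MutableSet<String>) {
        var referenceInput: String? = null
            private set

        // Returns a message describing the outcome of the attempt.
        fun submitFirstTouchInput(candidate: String): String =
            if (candidate in ordinaryInputs) {
                // The candidate corresponds to the plurality of user inputs: guide the user to another input.
                "This gesture is too common; please enter a different touch input as the reference."
            } else {
                referenceInput = candidate
                "Reference user input set."
            }

        // Stand-in for training the model on further ordinary (third) touch inputs.
        fun learnOrdinaryInputs(thirdTouchInputs: List<String>) {
            ordinaryInputs.addAll(thirdTouchInputs)
        }
    }

    fun main() {
        val setup = ReferenceInputSetup(mutableSetOf("single-tap", "swipe-up"))
        println(setup.submitFirstTouchInput("single-tap"))            // rejected: too common
        println(setup.submitFirstTouchInput("draw-letter-S"))         // accepted as the reference input
        setup.learnOrdinaryInputs(listOf("double-tap", "pinch-out"))  // refine the ordinary-input model
    }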

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Electromagnetism (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to one embodiment, an electronic device comprises a first housing, a second housing, a flexible display, a driving device, a memory, and at least one processor. The at least one processor is configured to: receive a user input through the flexible display; identify whether the user input corresponds to a reference user input; based on identifying that the user input corresponds to the reference user input, change a size of a display area of the flexible display using the driving device; and, based on identifying that the user input is different from the reference user input, bypass changing the size of the display area using the driving device.
PCT/KR2023/002036 2022-03-21 2023-02-10 Dispositif électronique, et procédé permettant d'identifier une entrée d'utilisateur WO2023182654A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20220034949 2022-03-21
KR10-2022-0034949 2022-03-21
KR10-2022-0071778 2022-06-13
KR1020220071778A KR20230137204A (ko) 2022-03-21 2022-06-13 사용자 입력을 식별하기 위한 전자 장치 및 방법

Publications (1)

Publication Number Publication Date
WO2023182654A1 true WO2023182654A1 (fr) 2023-09-28

Family

ID=88101339

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/002036 WO2023182654A1 (fr) 2022-03-21 2023-02-10 Dispositif électronique, et procédé permettant d'identifier une entrée d'utilisateur

Country Status (1)

Country Link
WO (1) WO2023182654A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101595381B1 (ko) * 2009-06-10 2016-02-18 엘지전자 주식회사 이동 단말기 및 그의 제스처 입력 처리방법
JP2021043821A (ja) * 2019-09-12 2021-03-18 アスツール株式会社 端末ブログラム、端末装置、制御方法、制御システム、および記録媒体
WO2021095925A1 (fr) * 2019-11-14 2021-05-20 엘지전자 주식회사 Terminal mobile et procédé de commande associé
WO2021160276A1 (fr) * 2020-02-14 2021-08-19 Huawei Technologies Co., Ltd. Geste de roulement et prévention de mauvaise manipulation tactile sur des dispositifs roulants
KR20220014709A (ko) * 2020-07-29 2022-02-07 삼성전자주식회사 슬라이딩 가능한 전자 장치 및 이의 제어 방법

Similar Documents

Publication Publication Date Title
WO2022025450A1 (fr) Dispositif électronique coulissant et procédé de commande associé
WO2022098125A1 (fr) Dispositif électronique et son procédé de commande d'écran
WO2022085961A1 (fr) Dispositif électronique pour déliver un contenu, et procédé de foncionnement de dispositif électronique
WO2022108271A1 (fr) Dispositif électronique comprenant un afficheur flexible, et procédé de commande tactile associé
WO2022114848A1 (fr) Dispositif électronique souple et procédé de fonctionnement d'un écran de visualisation de caméra
WO2022086068A1 (fr) Dispositif électronique, et procédé de fonctionnement de dispositif électronique
WO2023182654A1 (fr) Dispositif électronique, et procédé permettant d'identifier une entrée d'utilisateur
WO2024117558A1 (fr) Dispositif électronique comportant un écran flexible et son procédé de commande
WO2024101789A1 (fr) Dispositif électronique comprenant une unité d'affichage flexible, et procédé de commande de dispositif électronique
WO2023171879A1 (fr) Dispositif électronique comprenant une pluralité de cartes de circuit imprimé flexibles
WO2023200178A1 (fr) Dispositif électronique comprenant un dispositif d'affichage enroulable, et procédé de commande associé
WO2023191297A1 (fr) Dispositif électronique et procédé de fourniture de notification d'état anormal d'affichage
WO2024063432A1 (fr) Procédé de commande d'écran d'exécution d'application de dispositif électronique et dispositif électronique associé
WO2023191303A1 (fr) Dispositif électronique comprenant un dispositif d'affichage souple et son procédé de commande
WO2024117619A1 (fr) Dispositif électronique, et procédé de lecture vidéo basé sur l'extension ou la réduction d'un affichage flexible dans un dispositif électronique
WO2023191254A1 (fr) Dispositif électronique comprenant un écran souple
WO2023068468A1 (fr) Dispositif électronique et procédé d'identification de l'état de préhension d'un dispositif électronique
WO2023191245A1 (fr) Dispositif électronique comprenant des moteurs à double vibration
WO2023182651A1 (fr) Dispositif électronique comprenant une structure de masse utilisant un assemblage de haut-parleurs
WO2024049080A1 (fr) Dispositif électronique comprenant un écran flexible, et procédé de réduction d'écart de qualité d'image dans un écran flexible
WO2023204443A1 (fr) Dispositif électronique pour partager un écran avec un dispositif électronique externe, procédé de fonctionnement associé et support de stockage
WO2023195827A1 (fr) Dispositif électronique comprenant un afficheur flexible, et procédé de fonctionnement pour dispositif électronique
WO2022154536A1 (fr) Appareil électronique et procédé permettant de commander une fonction d'application sur la base d'une interaction tactile détectée dans un circuit d'entrée de touche l'utilisant
WO2022108239A1 (fr) Dispositif électronique à affichage flexible et procédé de fonctionnement dudit dispositif
WO2023249206A1 (fr) Dispositif électronique et procédé servant à effectuer une opération se rapportant à une application logicielle à des fins de gestion d'éléments d'agenda

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23775157

Country of ref document: EP

Kind code of ref document: A1