US10956763B2 - Information terminal device - Google Patents

Information terminal device

Info

Publication number
US10956763B2
Authority
US
United States
Prior art keywords
section
region
image
divided regions
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US16/238,048
Other languages
English (en)
Other versions
US20190220681A1 (en
Inventor
Fumiya SAKASHITA
Yoichi Hiranuma
Shoichi Sakaguchi
Shohei Fujiwara
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kyocera Document Solutions Inc
Original Assignee
Kyocera Document Solutions Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Document Solutions Inc filed Critical Kyocera Document Solutions Inc
Assigned to KYOCERA DOCUMENT SOLUTIONS INC. reassignment KYOCERA DOCUMENT SOLUTIONS INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FUJIWARA, SHOHEI, HIRANUMA, Yoichi, SAKAGUCHI, SHOICHI, SAKASHITA, FUMIYA
Publication of US20190220681A1 publication Critical patent/US20190220681A1/en
Application granted granted Critical
Publication of US10956763B2 publication Critical patent/US10956763B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • G06K9/2054
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00002Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
    • H04N1/00071Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for characterised by the action taken
    • H04N1/00074Indicating or reporting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06K9/00442
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035User-machine interface; Control console
    • H04N1/00405Output means
    • H04N1/00408Display of information to the user, e.g. menus
    • H04N1/00411Display of information to the user, e.g. menus the display also being used for user input, e.g. touch screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035User-machine interface; Control console
    • H04N1/00405Output means
    • H04N1/00408Display of information to the user, e.g. menus
    • H04N1/00413Display of information to the user, e.g. menus using menus, i.e. presenting the user with a plurality of selectable options
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035User-machine interface; Control console
    • H04N1/00405Output means
    • H04N1/00408Display of information to the user, e.g. menus
    • H04N1/00464Display of information to the user, e.g. menus using browsers, i.e. interfaces based on mark-up languages

Definitions

  • This disclosure relates to an information terminal device, an information processing system, and a computer-readable non-transitory recording medium storing a display control program, and more specifically to a technology for switching a plurality of divided displays on a display screen.
  • the web browser is application software for viewing a web page.
  • application software that receives, from an application server, a service for viewing monitored images is one example.
  • a typical image processing system includes: a plurality of (for example, four) monitoring cameras; an application server providing a service of viewing a monitored image; and a portable terminal into which a web browser provided by the application server is installed.
  • screen information which permits the monitored images photographed with the four monitoring cameras to be displayed on the web browser is provided, and the monitored images of the four monitoring cameras can be displayed on a display screen of the portable terminal.
  • This portable terminal is capable of four-screen display including one main screen with a large display region and three sub-screens with a small display region vertically arranged on a right side of the main screen.
  • the main screen is displayed on an enlarged scale on the entire display screen.
  • An information terminal device includes: a display section, an operation section, a remaining region specification section, an arrangement region determination section, a reduced arrangement section, and a display control section.
  • the display section displays an image in each of a plurality of divided regions obtained by plurally dividing a display screen.
  • the operation section receives, from a user, enlargement operation of enlarging one of the plurality of divided regions on the display screen.
  • the remaining region specification section specifies a remaining region on the display screen excluding the one divided region enlarged through the enlargement operation of the operation section by the user.
  • the arrangement region determination section determines an arrangement region for arranging all the divided regions other than the one enlarged divided region in the remaining region specified by the remaining region specification section.
  • the reduced arrangement section arranges each of the other divided regions on a reduced scale in the arrangement region determined by the arrangement region determination section.
  • the display control section displays, on an enlarged scale, an image of the one enlarged divided region, in the one enlarged divided region, and also displays, on an enlarged scale within the other divided regions arranged on a reduced scale by the reduced arrangement section, an image of a specific portion of the other divided regions.
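The layout logic summarized in the bullets above can be made concrete with a small sketch. Below is a minimal Python model, assuming rectangles are (x, y, width, height) tuples and the FIG. 5/FIG. 6 layout (the enlarged region anchored at the left top, the other regions stacked in a strip to its right); all function names and numbers are illustrative, not taken from the patent.

```python
from typing import List, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)

def remaining_region(screen: Rect, enlarged: Rect) -> Rect:
    """Remaining region A1: the strip of the screen to the right of the
    enlarged divided region (the layout of FIG. 5)."""
    sx, sy, sw, sh = screen
    ex, ey, ew, eh = enlarged
    return (ex + ew, sy, sw - (ex + ew), sh)

def arrangement_region(remaining: Rect) -> Rect:
    """Arrangement region A3: here simply the whole remaining strip;
    a real implementation could reserve part of it as the empty region A2."""
    return remaining

def reduced_arrangement(region: Rect, n: int, aspect: float) -> List[Rect]:
    """Stack n divided regions vertically in the arrangement region,
    keeping each tile's aspect ratio (width / height)."""
    x, y, w, h = region
    tile_h = h // n
    tile_w = min(w, int(tile_h * aspect))
    return [(x, y + i * tile_h, tile_w, tile_h) for i in range(n)]

# Example: a 1280x800 screen, D1 enlarged to 960x600 at the left top.
screen = (0, 0, 1280, 800)
d1 = (0, 0, 960, 600)
a1 = remaining_region(screen, d1)
tiles = reduced_arrangement(arrangement_region(a1), 3, aspect=960 / 600)
print(a1, tiles)
```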
  • the information terminal device capable of displaying a plurality of images distributed via a network is communicable with an application server.
  • the information terminal device includes: a request section, a display section, an operation section, a remaining region specification section, an arrangement region determination section, a reduced arrangement section, a notification section, an acquisition section, and a display control section.
  • the application server includes a transmission section.
  • the request section provides distribution sources of the plurality of images with a request for the plurality of images.
  • the display section displays, in each of a plurality of divided regions obtained by plurally dividing a display screen, the plurality of images requested by the request section.
  • the operation section receives, from a user, enlargement operation of enlarging one of the plurality of divided regions on the display screen.
  • the remaining region specification section specifies a remaining region on the display screen excluding the one divided region enlarged through the enlargement operation of the operation section by the user.
  • the arrangement region determination section determines an arrangement region for arranging all the divided regions other than the one enlarged divided region in the remaining region specified by the remaining region specification section.
  • the reduced arrangement section arranges each of the other divided regions on a reduced scale in the arrangement region determined by the arrangement region determination section.
  • the notification section outputs, to the application server, notification related to the one divided region enlarged through the enlargement operation of the operation section by the user and related to the other divided regions arranged on a reduced scale in the arrangement region by the reduced arrangement section.
  • the acquisition section acquires screen information transmitted by the transmission section of the application server.
  • the display control section uses the screen information acquired by the acquisition section to display an image of the one enlarged divided region on an enlarged scale in the one divided region and also to display, on an enlarged scale within the other divided regions arranged on a reduced scale by the reduced arrangement section, an image of a specific portion of the other divided regions.
  • the transmission section of the application server transmits screen information of the one enlarged divided region and the other reduced divided regions on the display screen in the information terminal device in accordance with the notification provided from the notification section.
  • the display control program causes a computer including a processor to, through execution of the display control program by the processor, function as: a remaining region specification section, an arrangement region determination section, a reduced arrangement section, and a display control section.
  • the remaining region specification section specifies a remaining region on a display screen excluding one of a plurality of divided regions on the display screen, the one divided region being enlarged through enlargement operation of an operation section by a user.
  • the arrangement region determination section determines an arrangement region for arranging all the divided regions other than the one enlarged divided region in the remaining region specified by the remaining region specification section.
  • the reduced arrangement section arranges each of the other divided regions on a reduced scale in the arrangement region determined by the arrangement region determination section.
  • the display control section displays, on an enlarged scale, an image of the one enlarged divided region, in the one enlarged divided region, and also displays, on an enlarged scale within the other divided regions arranged on a reduced scale by the reduced arrangement section, an image of a specific portion of the other divided regions.
  • FIG. 1 is a block diagram illustrating configuration of an information terminal device and an information processing system according to one embodiment of this disclosure.
  • FIG. 2 is a flowchart illustrating image display processing performed in the information processing system.
  • FIG. 3 is a flowchart illustrating processing for providing enlarged display of one of a plurality of divided regions on a display screen of the information terminal device.
  • FIG. 4 is a diagram illustrating one example in which respective images of four monitoring cameras are displayed in the four divided regions of the display screen of the information terminal device.
  • FIG. 5 is a diagram illustrating that the divided region located at a left top of the display screen illustrated in FIG. 4 is subjected to enlargement operation.
  • FIG. 6 is a diagram illustrating the other divided regions vertically rearranged at a right following the enlargement operation performed on the left top divided region illustrated in FIG. 5 .
  • FIG. 7 is a diagram illustrating enlarged display of images of the divided regions vertically arranged at the right together with a type display image and an event occurrence table image displayed in an empty region.
  • FIG. 8 is a diagram illustrating that only the event occurrence table image is displayed in the empty region illustrated in FIG. 6 in a case where the empty region is small.
  • FIG. 9 is a flowchart illustrating event detection processing performed in an application server of a modified example.
  • FIG. 1 is a block diagram illustrating configuration of the information terminal device and the information processing system of this disclosure.
  • the information processing system 100 of this embodiment makes it possible to view, on a web browser of the information terminal device 10 such as a tablet computer, each of monitored images photographed with a plurality of monitoring cameras 31 to 34 of a distribution device 30 .
  • the information processing system 100 includes: the information terminal device 10 , the distribution device 30 , an application server 40 , and a network 50 .
  • the information terminal device 10 , the distribution device 30 , and the application server 40 are connected together in a manner such as to be communicable with each other via the network 50 .
  • the distribution device 30 includes: the plurality of (four in this embodiment) monitoring cameras 31 to 34 ; and a router 35 which is communicable with the information terminal device 10 and the application server 40 .
  • the router 35 is connected to the monitoring cameras 31 to 34 .
  • each of the monitoring cameras 31 to 34 is a device which photographs a static image and a moving image (they are collectively referred to as images), and the moving image is also referred to as a video.
  • a plurality of images can be provided by one monitoring camera.
  • the application server 40 is a web application server which manages IP addresses of the four monitoring cameras 31 to 34 connected to the router 35 and which provides screen information permitting image display on the browser in a case where the information terminal device 10 has provided a request for viewing each of the images photographed by these monitoring cameras 31 to 34 .
  • the information terminal device 10 is a portable terminal device such as a tablet computer, and includes: a display section 12 , a touch panel 13 , a communication section 14 , a storage section 15 , and a control unit 20 . These components are capable of data or signal transmission and reception to and from each other via a bus.
  • the information terminal device 10 has a browser installed therein for receiving services provided by the application server 40 .
  • the information terminal device 10 may be a portable terminal device such as a smartphone, or a desktop personal computer.
  • the display section 12 is formed of, for example, a liquid crystal display (LCD) or an organic light-emitting diode (OLED) display.
  • the touch panel 13 is of, for example, a so-called resistive film type or a capacitance type.
  • the touch panel 13 is arranged on a screen of the display section 12 and detects contact of a finger or the like on the screen of the display section 12 together with a position of this contact.
  • the touch panel 13 outputs a detection signal indicating coordinates of the position of the aforementioned contact to a control section 21 of the control unit 20 . Therefore, the touch panel 13 plays a role as an operation section to which user operation performed on the screen of the display section 12 is inputted.
  • the information terminal device 10 may include, as an operation section to which the user operation is inputted, hard keys in addition to the aforementioned touch panel 13 .
  • the communication section 14 is a communication interface which includes a communication module such as a wireless LAN chip, not illustrated.
  • the communication section 14 has a function of making communication between the distribution device 30 and the application server 40 .
  • the storage section 15 is composed of a large-capacity solid state drive (SSD), a hard disk drive (HDD), or the like, and stores various pieces of data and programs.
  • the control unit 20 is composed of a processor, a random-access memory (RAM), a read only memory (ROM), and the like.
  • the processor is, for example, a central processing unit (CPU), a micro processing unit (MPU), or an application-specific integrated circuit (ASIC).
  • through execution of a control program by the processor, this control unit 20 functions as the control section 21 , a communication control section 22 , a remaining region specification section 23 , an arrangement region determination section 24 , a reduced arrangement section 25 , a display control section 26 , an image judgment section 27 , and a determination section 28 .
  • the aforementioned components may each be formed by a hardware circuit without depending on operation performed based on the aforementioned control program.
  • the control section 21 is in charge of overall operation control of the information terminal device 10 .
  • the control section 21 is also connected to the display section 12 , the touch panel 13 , the communication section 14 , the storage section 15 , etc., and performs operation control of each of the aforementioned components and signal or data transmission and reception to and from each component.
  • the control section 21 controls display operation of the display section 12 in particular.
  • the communication control section 22 has a function of controlling communication operation of the communication section 14 .
  • the display control section 26 performs control in a manner such that a display screen of the display section 12 provides, for example, four-screen display, as illustrated in FIG. 4 to be described later on. More specifically, the display control section 26 performs control in a manner such that the monitored images respectively photographed with the four monitoring cameras 31 to 34 are displayed in four divided regions D 1 to D 4 obtained by equally dividing the display screen of the display section 12 into quarters: a left top, a right top, a left bottom, and a right bottom. Moreover, as illustrated in FIG. 4 , the display control section 26 causes display of “C 1 ” to “C 4 ” at the respective left top corner parts of the divided regions D 1 to D 4 , indicating that they are the images respectively photographed by the monitoring cameras 31 to 34 .
  • the monitored images of the monitoring cameras 31 to 34 are illustrated in a circular shape, a triangular shape, a pentagonal shape, and a star shape, respectively. A subject (for example, illustrated in the circular shape) is the monitoring target, while the triangular shape, the pentagonal shape, and the star shape respectively illustrated in the other divided regions D 2 to D 4 are excluded from the monitoring target.
  • the control section 21 specifies the user operation inputted by the user, and performs control in accordance with the specified user operation.
  • the aforementioned user operation is, for example, touch operation, drag operation, or swipe operation.
  • upon separation of the user's finger immediately after contact of the finger on the touch panel 13 , the touch panel 13 outputs, to the control section 21 , a detection signal indicating the position where the contact has been detected.
  • the control section 21 detects, based on this detection signal, that the user operation is touch operation.
  • This touch operation is performed on, for example, a soft key on the screen of the display section 12 .
  • the touch panel 13 outputs, to the control section 21 , a detection signal indicating each moving position from an initial position where first contact has been detected to a final position where final contact has been detected after maintaining the initial position for predefined time (for example, 0.5 seconds).
  • the control section 21 detects, based on this detection signal, that the user operation is drag operation as well as an operation direction of the drag operation and the final position.
  • the drag operation includes, for example, enlargement operation performed on a corner part of one divided region on the display section 12 in order to enlarge that divided region.
  • when this enlargement operation is performed, the touch panel 13 outputs, to the control section 21 , the detection signal indicating each moving position from the initial position P 1 to the final position P 2 after the initial position P 1 has been maintained for the predefined time.
  • upon input of this detection signal, the control section 21 detects, based on the detection signal, that the user operation is drag operation (enlargement operation), as well as the operation direction of the drag operation and the final position P 2 .
  • here, the drag operation is defined as enlargement operation that maintains the aspect ratio of the divided region D 1 (the same applies where any one of the divided regions D 2 to D 4 is enlarged), but it may instead be enlargement operation that does not maintain the aspect ratio.
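For illustration, the aspect-preserving enlargement just described can be sketched as follows; the corner-snap rule (taking the larger of the two implied scale factors) is an assumption, since the description only states that the aspect ratio is maintained.

```python
def enlarge_keep_aspect(region, p2):
    """Enlarge a divided region whose right-bottom corner is dragged to the
    final position P2, keeping the region's aspect ratio. A variant that
    does not keep the aspect ratio would use (p2.x - x, p2.y - y) directly."""
    x, y, w, h = region
    scale = max((p2[0] - x) / w, (p2[1] - y) / h)
    return (x, y, int(w * scale), int(h * scale))

# Example: D1 starts as a 640x400 left-top quadrant; P2 = (960, 500).
print(enlarge_keep_aspect((0, 0, 640, 400), (960, 500)))  # -> (0, 0, 960, 600)
```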
  • the touch panel 13 outputs, to the control section 21 , the detection signal indicating each moving position from the initial position where the initial contact has been detected to the final position where the final contact has been detected.
  • the control section 21 detects, based on the detection signal, that the user operation is swipe operation, as well as the operation direction of the swipe operation. This swipe operation is performed, for example, on an icon on the screen of the display section 12 in order to move the icon.
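A hedged sketch of how a control section might classify these three operations from raw touch-panel detection signals, using the 0.5-second hold threshold mentioned above; the event format and thresholds are illustrative, not the patent's implementation.

```python
def classify_gesture(events):
    """Classify a sequence of (timestamp, x, y, kind) touch-panel events,
    with kind in {"down", "move", "up"}, into touch / drag / swipe,
    mirroring the rules described above."""
    HOLD_TIME = 0.5   # seconds the initial position must be held for a drag
    MOVE_EPS = 10     # pixels of movement tolerated during the hold
    down = next(e for e in events if e[3] == "down")
    up = next(e for e in events if e[3] == "up")
    moves = [e for e in events if e[3] == "move"]
    if not moves:
        return "touch"  # finger separated immediately after contact
    # Time the finger stayed near the initial position before moving away.
    t_leave = next((t for (t, x, y, _) in moves
                    if abs(x - down[1]) > MOVE_EPS or abs(y - down[2]) > MOVE_EPS),
                   up[0])
    if t_leave - down[0] >= HOLD_TIME:
        return "drag"   # e.g. enlargement operation on a corner of a region
    return "swipe"      # immediate movement, e.g. moving an icon
```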
  • the remaining region specification section 23 specifies a remaining region A 1 on the display screen of the display section 12 excluding the one divided region D 1 enlarged through the enlargement operation (drag operation) of the touch panel 13 by the user.
  • the remaining region specification section 23 specifies the remaining region A 1 on the display screen of the display section 12 excluding the enlarged divided region D 1 whose right bottom corner part has moved to the final position P 2 .
  • the arrangement region determination section 24 determines, in the remaining region A 1 specified by the remaining region specification section 23 , an arrangement region A 3 for arranging all the other divided regions D 2 to D 4 other than the one enlarged divided region D 1 .
  • the arrangement region determination section 24 determines, as the arrangement region A 3 , a region portion of the display screen located at a right of the enlarged divided region D 1 whose right bottom corner part has moved to the final position P 2 .
  • the reduced arrangement section 25 arranges each of the other divided regions D 2 to D 4 on a reduced scale in the arrangement region A 3 determined by the arrangement region determination section 24 .
  • in this embodiment, the reduced arrangement section 25 reduces the divided regions D 2 to D 4 so as to maintain the aspect ratio (the same applies where reduction including the divided region D 1 is performed), but it may instead perform reduction that does not maintain the aspect ratio.
  • the image judgment section 27 executes image recognition processing (for example, known face recognition processing or human recognition processing) on the images displayed in the other divided regions (the divided regions D 2 to D 4 of FIG. 6 ) to thereby judge whether or not a subject or an object is included in the aforementioned images.
  • the image judgment section 27 performs image recognition on the image displayed in the divided region D 1 subjected to the enlargement operation as illustrated in FIG. 5 , and specifies, as a subject to be monitored, the subject (for example, illustrated in a circle) included in the image of the divided region D 1 .
  • the image judgment section 27 executes known face authentication processing to thereby extract, based on image information displayed in the enlarged divided region D 1 , face authentication information of the subject (for example, information indicating feature points of the face, such as eyes, nose, and mouth, in a face region determined from the image of the subject). Then the image judgment section 27 stores the extracted face authentication information of the subject as collation source registration data into the storage section 15 . Next, the image judgment section 27 judges, for each of the divided regions D 2 to D 4 , whether or not the subject to be monitored (for example, illustrated in the circle) is included in the images displayed in the other divided regions (the divided regions D 2 to D 4 of FIG. 6 ).
  • the image judgment section 27 executes the known face authentication processing to thereby judge whether or not the face authentication information of the subject extracted based on the image information of each of the divided regions D 2 to D 4 matches the collation source registration data of the storage section 15 , and in a case where the aforementioned face authentication information matches the data, the image judgment section 27 judges that the subject to be monitored is included in the images of the divided regions D 2 to D 4 . Note that it is illustrated that the subject to be monitored (for example, illustrated in the circle) is included in the images of the divided regions D 2 to D 4 of FIG. 6 .
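As a rough illustration of this collation flow, the sketch below detects a face with OpenCV and compares crude histogram signatures. This is a stand-in for the known face authentication processing (feature points of eyes, nose, and mouth) named in the description, not the patent's actual method; thresholds and names are assumptions.

```python
import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_signature(image_bgr):
    """Detect the largest face and return a crude grayscale-histogram
    signature; a production system would use a real face-recognition model."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.1, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    face = cv2.resize(gray[y:y + h, x:x + w], (64, 64))
    hist = cv2.calcHist([face], [0], None, [32], [0, 256])
    return cv2.normalize(hist, hist).flatten()

def same_subject(sig_a, sig_b, threshold=0.8):
    """Treat a cosine similarity above the threshold as 'the subject to be
    monitored is included' (the collation against registration data)."""
    if sig_a is None or sig_b is None:
        return False
    return float(np.dot(sig_a, sig_b)) > threshold

# Registration data from the enlarged region D1, then collation with D2-D4:
# source = face_signature(frame_d1)
# match = same_subject(source, face_signature(frame_d2))
```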
  • the display control section 26 displays, on an enlarged scale in the one enlarged divided region D 1 , the image of the one divided region D 1 , and in a case where the subject to be monitored (for example, illustrated in the circle) has been displayed in the divided regions D 2 to D 4 arranged on a reduced scale by the reduced arrangement section 25 (that is, in a case where the image judgment section 27 has judged that the subject is included in the images of the other divided regions D 2 to D 4 ), the display control section 26 displays an image of a specific portion SE including the subject to be monitored (for example, illustrated in the circle) on an enlarged scale within the other divided regions D 2 to D 4 .
  • the display control section 26 displays, on an enlarged scale in the divided region D 2 as illustrated in FIG. 7 , the image of the specific portion SE (that is, the image of the subject) as one portion of the image of the other divided region D 2 as illustrated in FIG. 6 .
  • the specific portions SE (that is, the subjects) of the other divided regions D 3 and D 4 are each displayed on an enlarged scale.
  • the specific portions SE (that is, the subjects) of the divided regions D 2 to D 4 are displayed on an enlarged scale, thus permitting more visible display of the specific portions SE of the divided regions D 2 to D 4 .
  • the display control section 26 may display, as the specific portion SE on an enlarged scale in the other divided regions D 2 to D 4 , a portion (for example, a rectangular region portion including the face of the subject) including the face of the subject subjected to the face recognition performed by the image judgment section 27 .
  • the specific portions SE (that is, the face as a feature portion of the subject) of the divided regions D 2 to D 4 are displayed on an enlarged scale even though the divided regions D 2 to D 4 themselves have been reduced, thus permitting even more visible display of the specific portions SE of the divided regions D 2 to D 4 .
  • a monitoring target is the subject to be monitored (for example, illustrated in the circle) as illustrated in FIG. 5 , but a predefined subject (for example, an extinguisher) may be provided as the monitoring target.
  • the image judgment section 27 can execute, for example, known pattern matching image processing to thereby judge whether or not the aforementioned subject (for example, the extinguisher) is included in the images of the divided regions D 1 to D 4 .
  • the display control section 26 displays, as the specific portions SE on an enlarged scale in the other divided regions D 2 to D 4 , a portion (for example, a rectangular region portion including all or part of the extinguisher) including the subject (for example, the extinguisher) recognized by the image judgment section 27 .
  • the specific portions SE (that is, the subject) of the divided regions D 2 to D 4 are displayed on an enlarged scale, thus permitting more visible display of the specific portions SE of the divided regions D 2 to D 4 .
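However the specific portion SE is recognized (face recognition or pattern matching), rendering it amounts to cropping the subject's bounding box and scaling it up to the reduced tile. A minimal sketch, with illustrative names:

```python
import cv2

def render_specific_portion(frame, bbox, tile_size):
    """Crop the specific portion SE (bounding box of the recognized subject,
    e.g. a face or an extinguisher) and scale it up to fill the reduced
    divided region, instead of shrinking the whole camera frame."""
    x, y, w, h = bbox
    tw, th = tile_size
    crop = frame[y:y + h, x:x + w]
    # Keep the crop's aspect ratio inside the tile; pad with black.
    scale = min(tw / w, th / h)
    resized = cv2.resize(crop, (int(w * scale), int(h * scale)))
    pad_v = th - resized.shape[0]
    pad_h = tw - resized.shape[1]
    return cv2.copyMakeBorder(
        resized,
        top=pad_v // 2, bottom=pad_v - pad_v // 2,
        left=pad_h // 2, right=pad_h - pad_h // 2,
        borderType=cv2.BORDER_CONSTANT, value=(0, 0, 0))
```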
  • the determination section 28 determines, based on the remaining region A 1 (see FIG. 5 ) specified by the remaining region specification section 23 , whether or not the ratio of the empty region A 2 (see FIG. 6 ), which excludes the arrangement region A 3 determined by the arrangement region determination section 24 , occupying the display screen is equal to or greater than a predefined specific ratio (for example, 10%).
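A small sketch of this determination, reusing the rectangle model from the earlier layout sketch; the 10% threshold follows the example in the description, and the two-outcome display rule matches FIGS. 7 and 8.

```python
def empty_region_ratio(screen, remaining, arrangement):
    """Ratio of the empty region A2 (remaining region A1 minus arrangement
    region A3) to the whole display screen."""
    def area(r):
        return r[2] * r[3]
    return (area(remaining) - area(arrangement)) / area(screen)

SPECIFIC_RATIO = 0.10  # the "for example, 10%" threshold in the description

def images_to_display(screen, remaining, arrangement):
    """MI and ETI when the empty region is large enough, otherwise ETI only."""
    if empty_region_ratio(screen, remaining, arrangement) >= SPECIFIC_RATIO:
        return ["type display image MI", "event occurrence table image ETI"]
    return ["event occurrence table image ETI"]
```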
  • in a case where the determination section 28 determines that the aforementioned ratio is equal to or greater than the specific ratio, the display control section 26 displays, in the empty region A 2 , a type display image MI indicating types (for example, information indicating photographing places) of the images of the plurality of divided regions and an event occurrence table image ETI indicating timing of occurrence of an event EV with respect to a photographing time axis extending in a horizontal direction, as illustrated in FIG. 7 .
  • the type display image MI is an image indicating the photographing places of the images in the divided regions D 1 to D 4 : they are respectively photographed with the monitoring camera 31 (camera C 1 ) located at the front on the first floor, the monitoring camera 32 (camera C 2 ) located at the east corridor on the second floor, the monitoring camera 33 (camera C 3 ) located at the west corridor on the second floor, and the monitoring camera 34 (camera C 4 ) located at the north corridor on the second floor.
  • the event occurrence table image ETI is an image indicating at which photographing time the event EV, for example, appearance of the subject to be monitored (for example, illustrated in the circle), has occurred.
  • the event occurrence table image ETI of this embodiment has a photographing time axis whose left end indicates midnight of the corresponding day and whose right end indicates the current time of the same day. More specifically, upon judgment by the image judgment section 27 that the subject to be monitored is included in the images of the divided regions D 1 to D 4 , the display control section 26 displays the event EV at a portion of the event occurrence table image ETI corresponding to the time point at which the aforementioned judgment has been made. As illustrated in FIG. 7 , the user can view each event EV of the event occurrence table image ETI displayed in the empty region A 2 of the display section 12 to thereby recognize that the event EV has occurred.
  • an enlargement operation mark EX indicating that the one divided region has been enlarged is displayed at a portion of the event occurrence table image ETI corresponding to a period at which the aforementioned enlargement operation has been performed.
  • the user can view the enlargement operation mark EX of the event occurrence table image ETI displayed in the empty region A 2 of the display section 12 to thereby recognize that the enlargement operation has been performed.
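Placing an event EV or an enlargement operation mark EX on the photographing time axis is a linear mapping from time of day to pixels. A sketch under the stated convention (left end = midnight, right end = current time); names are illustrative.

```python
from datetime import datetime, time as dtime

def event_x(timestamp: datetime, axis_width_px: int) -> int:
    """Map an event time to an x coordinate on the photographing time axis:
    the left end is midnight of the current day, the right end is 'now'."""
    midnight = datetime.combine(timestamp.date(), dtime.min)
    now = datetime.now()
    span = (now - midnight).total_seconds() or 1.0
    frac = (timestamp - midnight).total_seconds() / span
    return int(min(max(frac, 0.0), 1.0) * axis_width_px)

# An event EV at 09:30 on an 800 px axis rendered at 18:00 lands at
# 9.5 / 18 of the width, around x = 422; an enlargement operation mark EX
# is placed the same way at the time the operation was received.
```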
  • in a case where the determination section 28 determines that the aforementioned ratio is less than the specific ratio, the display control section 26 displays the event occurrence table image ETI in the empty region A 2 without displaying the type display image MI, as illustrated in FIG. 8 .
  • This permits appropriate display in accordance with a size of the empty region A 2 on the display screen of the display section 12 caused by the enlargement operation performed in the divided regions.
  • together with each event EV indicated by the event occurrence table image ETI, the display control section 26 displays a corresponding display image indicating correspondence with the divided region related to that event EV.
  • This corresponding display image refers to the “C 1 ” to “C 4 ” displayed at positions located above the events EV of the event occurrence table image ETI, as illustrated in FIG. 7 .
  • the “C 1 ” as the corresponding display image displayed at the position above the event EV located at the left end of the event occurrence table image ETI illustrated in FIG. 7 indicates that this event is related to the image photographed with the monitoring camera 31 (camera C 1 ), that is, an event in the divided region D 1 .
  • upon enlarged display of the image in the one divided region D 1 based on the enlargement operation of the touch panel 13 by the user, the display control section 26 displays, at the portion of the event occurrence table image ETI corresponding to the period at which the enlargement operation has been performed, the enlargement operation mark EX indicating that this divided region D 1 has been enlarged.
  • the subject to be monitored (for example, illustrated in the circle) is displayed in the divided region D 1 at a past time point (three time points before a current time in FIG. 7 ) and the user has performed the enlargement operation at this point, and thus the enlargement operation mark EX is displayed at a portion of the event occurrence table image ETI corresponding to this past time point.
  • the subject to be monitored (for example, illustrated in the circle) sequentially appears at the monitoring camera 32 (the camera C 2 ), the monitoring camera 33 (the camera C 3 ), and the monitoring camera 34 (the camera C 4 ) with time passage, and thus each event EV indicating the appearance of the subject to be monitored is displayed on the event occurrence table image ETI.
  • the enlargement operation mark EX is displayed at the portion corresponding to the period of the event (for example, the enlargement operation) located at a second place from the left end of the event occurrence table image ETI in FIG. 7 .
  • the user can recognize that the enlargement operation has also been performed at this period.
  • next, image display processing performed in the information processing system 100 of this embodiment, that is, processing for viewing each monitored image on the web browser of the information terminal device 10 , will be described with reference to the flowchart illustrated in FIG. 2 , etc.
  • the control section 21 of the information terminal device 10 activates the browser (S 101 ), specifies a uniform resource locator (URL) of the application server 40 on the browser, and provides a request for access to this specified URL (S 102 ).
  • the application server 40 receives the request for access (S 103 ) and transmits the screen information for displaying a login screen (S 104 ).
  • the communication section 14 of the information terminal device 10 receives the screen information for displaying the login screen (S 105 ).
  • the control section 21 of the information terminal device 10 transmits authentication information composed of ID information and a password (S 106 ).
  • the application server 40 receives the authentication information (S 107 ), performs authentication processing by use of this received authentication information (S 108 ), and transmits the screen information for displaying a menu screen (operation menu) (S 109 ).
  • the communication section 14 of the information terminal device 10 receives the screen information for displaying the menu screen (S 110 ). Then the display control section 26 of the information terminal device 10 causes the display section 12 to display the menu screen. Displayed on this menu screen are, for example, operation items for selecting display of the monitored image provided from the distribution device 30 .
  • upon selection, on the menu screen of the display section 12 , of the operation item for selecting the display of the monitored image provided from the distribution device 30 , the control section 21 of the information terminal device 10 provides the application server 40 with a request for displaying each of the monitored images photographed with the four monitoring cameras 31 to 34 (S 111 ).
  • the application server 40 receives the request for display (S 112 ) and specifies frames (S 113 ). Plural-screen display is set in the initial setting (default), and thus the frames for plural-screen display are specified. More specifically, what is specified is the screen information for displaying the divided screen (divided regions), with correspondence between a plurality of pieces of frame information for displaying the plurality of images and the IP addresses of the monitoring cameras displayed in the respective frame regions.
  • the frames, referred to as four-screen frames in this embodiment, have the four divided regions obtained by equally dividing the display screen of the display section 12 into quarters: the left top, the right top, the left bottom, and the right bottom. They are expressed in a structured language such as the HTML format.
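As an illustration of such frame-based screen information, the sketch below emits an HTML fragment for the four-screen frames, binding each quadrant to a camera IP address. The grid markup and the stream URL pattern are assumptions, not taken from the patent.

```python
def four_screen_frames(camera_ips):
    """Build screen information for the four-screen frames as an HTML
    fragment: four equal quadrants (left top, right top, left bottom,
    right bottom), each frame bound to one monitoring camera's IP address."""
    positions = ["left-top", "right-top", "left-bottom", "right-bottom"]
    cells = "\n".join(
        f'  <div class="frame {pos}">'
        f'<img src="http://{ip}/stream" alt="camera {i + 1}"></div>'
        for i, (pos, ip) in enumerate(zip(positions, camera_ips)))
    return ("<style>.grid{display:grid;grid-template-columns:1fr 1fr;"
            "grid-template-rows:1fr 1fr;width:100vw;height:100vh}</style>\n"
            f'<div class="grid">\n{cells}\n</div>')

print(four_screen_frames(["10.0.0.31", "10.0.0.32", "10.0.0.33", "10.0.0.34"]))
```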
  • the application server 40 transmits the screen information for displaying the four screen frames (S 114 ).
  • the communication section 14 of the information terminal device 10 receives the screen information for displaying the four screen frames (S 115 ).
  • the control section 21 of the information terminal device 10 provides the four monitoring cameras 31 to 34 with a request for the images (S 116 ). More specifically, by using the IP addresses of the four monitoring cameras 31 to 34 corresponding to the respective frames included in the screen information for displaying the four screen frames, the control section 21 provides the four monitoring cameras 31 to 34 with a request for distribution of the respective images.
  • the distribution device 30 receives the request for distributing the monitored images photographed with the four monitoring cameras 31 to 34 (S 117 ).
  • the distribution device 30 distributes each of the monitored images provided by the four monitoring cameras 31 to 34 to the information terminal device 10 (S 118 ).
  • the communication section 14 of the information terminal device 10 receives each of the monitored images provided from the monitoring cameras 31 to 34 (S 119 ).
  • the display control section 26 of the information terminal device 10 displays, on the display screen, the on-browser images obtained by assigning the monitored images received from the four monitoring cameras 31 to 34 to the respective corresponding frames (S 120 ). That is, displayed at the display section 12 is a four-split screen displaying each of the monitored images received from the four monitoring cameras 31 to 34 , as illustrated in FIG. 4 .
  • the control section 21 of the information terminal device 10 determines whether or not the enlargement operation (drag operation) of enlarging the one divided region has been performed (S 121 ). In a case where the enlargement operation (drag operation) has been performed (“Yes” in S 121 ), the control section 21 performs display change processing following the enlargement operation (S 122 ).
  • the control section 21 of the information terminal device 10 notifies the application server 40 of the enlargement operation (drag operation) (S 201 ).
  • the remaining region specification section 23 specifies the remaining region A 1 on the display screen of the display section 12 excluding the one divided region (the divided region D 1 in FIG. 5 ) enlarged through the enlargement operation (drag operation) of the touch panel 13 by the user.
  • the arrangement region determination section 24 determines, in the remaining region A 1 specified by the remaining region specification section 23 , the arrangement region A 3 for arranging all the other divided regions (the divided regions D 2 to D 4 of FIG. 6 ) other than the one enlarged divided region (the divided region D 1 of FIG. 6 ).
  • the reduced arrangement section 25 arranges, on a reduced scale in the arrangement region A 3 determined by the arrangement region determination section 24 , each of the other divided regions (the divided regions D 2 to D 4 of FIG. 6 ).
  • the enlargement operation notification indicating a changed layout determined in the aforementioned manner is outputted to the application server 40 .
  • the application server 40 receives the enlargement operation (drag operation) notification (S 202 ), executes event occurrence table image ETI update processing of adding the enlargement operation mark EX at a portion corresponding to a period of reception of the aforementioned enlargement operation at the photographing time axis of the event occurrence table image ETI (S 202 A), and transmits the screen information for displaying the changed screen frame subjected to the enlargement operation (S 203 ).
  • the communication section 14 of the information terminal device 10 receives the screen information for displaying the changed screen frame subjected to the enlargement operation (S 204 ).
  • the control section 21 of the information terminal device 10 provides the distribution device 30 with a request for the image. More specifically, the control section 21 of the information terminal device 10 provides the four monitoring cameras 31 to 34 with a request for distributing the images by use of the IP addresses of the four monitoring cameras 31 to 34 corresponding to the respective frames included in the screen information for displaying one changed screen frame (S 205 ).
  • the four monitoring cameras 31 to 34 receive the request for distributing the images (S 206 ) and distribute the images (S 207 ).
  • the communication section 14 of the information terminal device 10 receives the images from the four monitoring cameras 31 to 34 (S 208 ).
  • the display control section 26 of the information terminal device 10 assigns the images received from the four monitoring cameras 31 to 34 to the changed screen frame subjected to the enlargement operation and displays, on an enlarged scale in the one enlarged divided region (the divided region D 1 of FIG. 7 ), the image of this one divided region (that is, the image of the subject to be monitored (for example, illustrated in the circle)); upon judgment by the image judgment section 27 that the subject to be monitored is included in the images of the other divided regions (the divided regions D 2 to D 4 of FIG. 7 ), the display control section 26 displays, on the display screen, the changed on-browser image on which the image of the specific portion SE including the subject to be monitored is displayed on an enlarged scale (S 209 ).
  • that is, the image of the monitoring camera 31 (the subject to be monitored (for example, illustrated in the circle)) is displayed on an enlarged scale in the one divided region D 1 enlarged on the display screen of the display section 12 , and the image of the specific portion SE of the monitored images of the monitoring cameras 32 to 34 (that is, the subject to be monitored) is displayed on an enlarged scale in the other divided regions D 2 to D 4 reduced on the display screen, whenever the image judgment section 27 judges that the subject is included in the images of the other divided regions D 2 to D 4 .
  • the determination section 28 of the information terminal device 10 determines whether or not a ratio of the empty region A 2 , obtained by excluding the arrangement region A 3 determined by the arrangement region determination section 24 from the remaining region A 1 specified by the remaining region specification section 23 , occupying the display screen is equal to or greater than the specific ratio (for example, 10%) (S 210 ). Upon determination by the determination section 28 that the aforementioned ratio is equal to or greater than the specific ratio (for example, 10%) (“Yes” in S 210 ), the control section 21 of the information terminal device 10 provides the application server 40 with a request for displaying the type display image MI and the event occurrence table image ETI (S 211 ).
  • the application server 40 receives the request for displaying the type display image MI and the event occurrence table image ETI (S 212 ), and transmits the type display image MI, the event occurrence table image ETI, and the corresponding display image to the information terminal device 10 (S 213 ).
  • the communication section 14 of the information terminal device 10 receives the type display image MI, the event occurrence table image ETI, and the corresponding display image (S 214 ).
  • the display control section 26 of the information terminal device 10 displays, in the empty region A 2 on the display screen of the display section 12 , the type display image MI related to the images of the four divided regions, the event occurrence table image ETI indicating event EV occurrence timing, and the corresponding display image (S 215 ), ending the present processing.
  • the display control section 26 displays, in the empty region A 2 , the type display image MI related to the images of the plurality of divided regions and the event occurrence table image ETI indicating the event EV occurrence timing, as illustrated in FIG. 7 .
  • upon determination by the determination section 28 that the ratio of the empty region A 2 occupying the display screen is less than the specific ratio (for example, 10%) (“No” in S 210 ), the control section 21 of the information terminal device 10 provides the application server 40 with a request for displaying only the event occurrence table image ETI (S 217 ).
  • the application server 40 receives the request for displaying the event occurrence table image ETI (S 218 ) and transmits the event occurrence table image ETI and the corresponding display image to the information terminal device 10 (S 219 ).
  • the communication section 14 of the information terminal device 10 receives the event occurrence table image ETI and the corresponding display image (S 220 ). As illustrated in FIG. 8 , the display control section 26 of the information terminal device 10 displays the event occurrence table image ETI indicating the event EV occurrence timing and the corresponding display image in the empty region A 2 on the display screen of the display section 12 (S 221 ), ending the present processing.
  • the display control section 26 displays the event occurrence table image ETI in the empty region A 2 without displaying the type display image MI, as illustrated in FIG. 8 .
  • the control section 21 of the information terminal device 10 determines whether or not an instruction to end the monitoring-camera viewing display has been received (S 123 ). Upon determination that the display has not yet ended (“No” in S 123 ), the control section 21 returns to S 121 , and upon determination that the display has ended (“Yes” in S 123 ), the control section 21 ends the present processing.
  • the display control section 26 displays, on an enlarged scale in the enlarged divided region D 1 , the image of the one divided region (the monitored image of the monitoring camera 31 ) and also displays, on an enlarged scale in the divided regions D 2 to D 4 arranged on a reduced scale by the reduced arrangement section 25 , the images of the specific portions SE of the other divided regions D 2 to D 4 (the monitored images of the specific portions SE in the monitoring cameras 32 to 34 ), as illustrated in FIG. 7 .
  • in the embodiment described above, the screen information for displaying the four-screen frames or the changed screen frame, the type display image MI, the event occurrence table image ETI, and the corresponding display image are received from the application server 40 . However, the information terminal device 10 may itself determine the aforementioned screen information, and may possess, or acquire from the distribution device 30 , the type display image MI, the event occurrence table image ETI, and the corresponding display image, using the application server 40 only for login and authentication.
  • the event EV is detected on the information terminal device 10 side in the embodiment described above. Specifically, the image judgment section 27 judges that the subject to be monitored is included in the images of the divided regions D 2 to D 4 , and the display control section 26 displays the event EV at a portion of the event occurrence table image ETI corresponding to a time point at which the aforementioned judgment has been performed. On the contrary, as in the modified example illustrated in FIG. 9 , the event EV may be detected on the application server 40 side.
  • the application server 40 executes event detection processing illustrated in FIG. 9 . It is assumed that the application server 40 receives enlargement operation notification (notification of enlargement operation performed on the divided region D 1 as is the case with the embodiment described above) as illustrated in S 202 of FIG. 3 .
  • the application server 40 provides the four monitoring cameras 31 to 34 with a request for the images (S 301 ).
  • the distribution device 30 receives the request for the images from the application server 40 (S 302 ).
  • the distribution device 30 distributes the respective monitored images provided by the four monitoring cameras 31 to 34 to the application server 40 (S 303 ).
  • the application server 40 receives the respective monitored images from the four monitoring cameras 31 to 34 (S 304 ).
  • the application server 40 performs image recognition processing on the respective monitored images received from the four monitoring cameras 31 to 34 (S 305 ). More specifically, the image judgment section included in the application server 40 performs the image recognition processing on the enlarged monitored image of the monitoring camera 31 and specifies, as the subject to be monitored, the subject included in the image of this enlarged monitored image.
  • the image judgment section included in the application server 40 judges, for each of the monitoring cameras 32 to 34 , whether or not the subject to be monitored is included in each of the monitored images provided from the other monitoring cameras 32 to 34 . That is, the application server 40 performs, for each of the monitored images provided from the other monitoring cameras 32 to 34 , event detection of the event EV including the subject to be monitored (S 306 ).
  • the application server 40 Upon judgment that the event detection has been performed (“Yes” in S 306 ), the application server 40 creates the event occurrence table image ETI added with the event EV on which the event detection has been performed (S 307 ).
  • the event occurrence table image ETI created here is transmitted to the information terminal device 10 in the aforementioned S 213 and S 219 illustrated in FIG. 3 .
  • the user can view each of the events EV of the event occurrence table image ETI displayed in the empty region A 2 on the display screen of the display section 12 to thereby recognize that the event EV has occurred, as illustrated in FIGS. 7 and 8 .
  • the application server 40 determines whether or not the image distribution has ended (S 308 ), and returns to S 305 in a case where the image distribution has not ended (“No” in S 308 ). On the other hand, in a case where the image distribution has ended (“Yes” in S 308 ), the application server 40 ends the present processing.
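A schematic version of this server-side loop, assuming frame-fetching callables and an image-judgment predicate are supplied by the surrounding system; step comments refer to FIG. 9, and all names are illustrative.

```python
import time

def event_detection_loop(cameras, detect_subject, event_table, poll_s=1.0):
    """Modified-example flow on the application server side: request images
    from all cameras, run image recognition on each, and append an event EV
    when the subject to be monitored is detected. `cameras` maps a camera id
    to a zero-argument frame-fetching callable; `detect_subject` is the
    image-judgment predicate."""
    distributing = True
    while distributing:
        for cam_id, fetch_frame in cameras.items():
            frame = fetch_frame()            # S301-S304: request / receive
            if frame is None:                # distribution ended (S308)
                distributing = False
                break
            if detect_subject(frame):        # S305-S306: recognition
                event_table.append((time.time(), cam_id))  # S307: add EV
        time.sleep(poll_s)
    return event_table
```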
  • the aforementioned events EV may include, in addition to the appearance of the aforementioned subject to be monitored (for example, illustrated in the circle), appearance, exit, takeaway, and leaving of a target such as an object or a product.
  • The detection of the takeaway of the target can be achieved through image recognition that the image of the target has disappeared from the monitored image.
  • The detection of the leaving of the target can be achieved through image recognition that the image of the target has appeared in the monitored image.
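Taken together, the two bullets above reduce takeaway/leaving detection to a change in the target's presence between an earlier and a later monitored image. The sketch below applies exactly that pairing; `target_present` is a hypothetical stand-in for the image recognition, which the patent does not specify.

```python
# Illustrative sketch only: presence-change classification for a target.


def target_present(image):
    # Stub for the image recognition; True when the image of the target
    # (an object or a product) is found in the monitored image.
    return image is not None


def classify_presence_change(previous_image, current_image):
    was_present = target_present(previous_image)
    is_present = target_present(current_image)
    if was_present and not is_present:
        return "takeaway"  # the target has disappeared from the image
    if not was_present and is_present:
        return "leaving"   # the target has appeared in the image
    return None            # no presence change, so no event EV


if __name__ == "__main__":
    # A frame where the target was visible, followed by one where it is
    # not, is classified as a takeaway.
    print(classify_presence_change("frame-with-target", None))
```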
  • The aforementioned events EV may further include, for example, entrance, leaving, and the like (the event types in this list are gathered into the enumeration sketched after the list);
  • a change in a detection state of movement of the subject (for example, detection of suspicious movement through automatic tracking of the subject as a result of performing, for example, face recognition processing on the monitored image photographed by the monitoring camera);
  • activation and stopping of the image processing system 100, and start and stopping of recording;
  • a change in detection by an external sensor (for example, detection of a toner-cover open state in a state in which no error is occurring).
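As noted in the list above, these event categories can be gathered into a simple enumeration. The sketch below is illustrative only: the patent names the example events but prescribes no data model, so every identifier here is hypothetical.

```python
# Illustrative sketch only: a hypothetical enumeration of the example
# event types named in the description above.
from enum import Enum, auto


class EventType(Enum):
    SUBJECT_APPEARANCE = auto()      # appearance of the subject to be monitored
    ENTRANCE = auto()                # entrance into the monitored scene
    LEAVING = auto()                 # leaving the monitored scene
    TARGET_TAKEAWAY = auto()         # target has disappeared from the image
    TARGET_LEAVING = auto()          # target has appeared in the image
    MOVEMENT_CHANGE = auto()         # e.g. suspicious movement found by tracking
    SYSTEM_START_STOP = auto()       # activation/stopping of the system
    RECORDING_START_STOP = auto()    # start and stopping of recording
    EXTERNAL_SENSOR_CHANGE = auto()  # e.g. toner-cover open with no error


if __name__ == "__main__":
    for event_type in EventType:
        print(event_type.name)
```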
  • Alternatively, the distribution device 30 may execute the image recognition processing on the monitored images and thereby detect the aforementioned events itself. In that case, information associating the image distribution with the event information may be distributed from the distribution device 30 to the application server 40, and the application server 40 may recognize the event occurrence in the image distribution based on the received event information.
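A minimal sketch of this variant follows, assuming a hypothetical JSON envelope as the "information associating the image distribution with the event information"; the patent does not specify any wire format.

```python
# Illustrative sketch only: a hypothetical JSON envelope per frame.
import json


def distribute_with_events(frame_id, camera_id, detected_events):
    # Distribution device 30 side: the image recognition has already run
    # here, so each distributed frame carries its event information.
    return json.dumps({
        "frame_id": frame_id,
        "camera_id": camera_id,
        "events": detected_events,
    })


def recognize_event_occurrence(message):
    # Application server 40 side: recognize the event occurrence in the
    # image distribution from the received event information alone.
    payload = json.loads(message)
    return [(payload["camera_id"], event) for event in payload["events"]]


if __name__ == "__main__":
    message = distribute_with_events(1, 32, ["subject_appearance"])
    print(recognize_event_occurrence(message))
```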
  • The method of display by the information terminal device 10 described in the embodiment above can also be provided as a program.
  • This program is recorded in a computer-readable non-transitory recording medium, for example, a hard disk, a CD-ROM, a DVD-ROM, or a semiconductor memory.
  • The computer-readable non-transitory recording medium storing the aforementioned program forms one embodiment of this disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Digital Computer Display Output (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Closed-Circuit Television Systems (AREA)
US16/238,048 2018-01-12 2019-01-02 Information terminal device Active 2039-07-27 US10956763B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018-003801 2018-01-12
JPJP2018-003801 2018-01-12
JP2018003801A JP6982277B2 (ja) 2018-01-12 2018-01-12 Information terminal device, information processing system, and display control program

Publications (2)

Publication Number Publication Date
US20190220681A1 US20190220681A1 (en) 2019-07-18
US10956763B2 (en) 2021-03-23

Family

ID=67165357

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/238,048 Active 2039-07-27 US10956763B2 (en) 2018-01-12 2019-01-02 Information terminal device

Country Status (3)

Country Link
US (1) US10956763B2 (en)
JP (1) JP6982277B2 (ja)
CN (1) CN110007832B (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111061531B (zh) * 2019-12-10 2022-07-15 Vivo Mobile Communication Co., Ltd. Picture display method and electronic device
JP2023136133A (ja) 2022-03-16 2023-09-29 Suzuki Motor Corporation Automotive display
JP2023136135A (ja) 2022-03-16 2023-09-29 Suzuki Motor Corporation Automotive display

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090244608A1 (en) * 2008-03-27 2009-10-01 Seiko Epson Corporation Image-Output Control Device, Method of Controlling Image-Output, Program for Controlling Image-Output, and Printing Device
US20100066822A1 (en) * 2004-01-22 2010-03-18 Fotonation Ireland Limited Classification and organization of consumer digital images using workflow, and face detection and recognition
US20100321533A1 (en) * 2009-06-23 2010-12-23 Samsung Electronics Co., Ltd Image photographing apparatus and method of controlling the same
US20130147731A1 (en) * 2011-12-12 2013-06-13 Sony Mobile Communications Japan, Inc. Display processing device
JP2014006914A (ja) 2011-07-29 2014-01-16 Canon Marketing Japan Inc Information processing apparatus, control method, program, and information processing system
US20150264253A1 (en) * 2014-03-11 2015-09-17 Canon Kabushiki Kaisha Display control apparatus and display control method
US20150317026A1 (en) * 2012-12-06 2015-11-05 Samsung Electronics Co., Ltd. Display device and method of controlling the same

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002055753A (ja) * 2000-08-10 2002-02-20 Canon Inc Information processing apparatus, function list display method, and storage medium
JP4572099B2 (ja) * 2004-09-13 2010-10-27 Mitsubishi Electric Corp. Moving body tracking support system
JP2008289091A (ja) * 2007-05-21 2008-11-27 Canon Inc Image display device and monitoring camera system
JP5585506B2 (ja) * 2011-03-18 2014-09-10 Seiko Epson Corp. Program, information storage medium, terminal device, and display system
JP6025482B2 (ja) * 2012-09-28 2016-11-16 Fuji Xerox Co., Ltd. Display control device, image display device, and program
JP6873397B2 (ja) * 2016-04-07 2021-05-19 Casio Computer Co., Ltd. Image display device, image display control method, and program
JP2019036872A (ja) * 2017-08-17 2019-03-07 Panasonic IP Management Co., Ltd. Investigation support device, investigation support method, and investigation support system

Also Published As

Publication number Publication date
JP2019125053A (ja) 2019-07-25
US20190220681A1 (en) 2019-07-18
CN110007832A (zh) 2019-07-12
JP6982277B2 (ja) 2021-12-17
CN110007832B (zh) 2022-05-03

Similar Documents

Publication Publication Date Title
US11250090B2 (en) Recommended content display method, device, and system
JP7488333B2 (ja) 2024-05-21 Video search method, apparatus, terminal, and storage medium
KR102179958B1 (ko) 2020-11-17 LFD (large format display) device and control method therefor
US20170223422A1 (en) Comment-provided video generating apparatus and comment-provided video generating method
US10956763B2 (en) Information terminal device
JP2020530631A (ja) 2020-10-22 Interaction position determination method, system, storage medium, and smart device
WO2022110819A1 (zh) 2022-06-02 Video switching method and apparatus
US11825177B2 (en) Methods, systems, and media for presenting interactive elements within video content
CN103703438A (zh) 2014-04-02 Gaze-based content display
JP2015518580A (ja) 2015-07-02 Transparent display device and display method thereof
US9389703B1 (en) Virtual screen bezel
WO2019157870A1 (zh) 2019-08-22 Web application access method and apparatus, storage medium, and electronic device
KR102370699B1 (ko) 2022-03-07 Method and apparatus for obtaining information based on an image
US9294670B2 (en) Lenticular image capture
US20150213784A1 (en) Motion-based lenticular image display
US20150121301A1 (en) Information processing method and electronic device
JP2021530070A (ja) 2021-11-04 Method, apparatus, terminal equipment, and storage medium for sharing personal information
US20150348114A1 (en) Information providing apparatus
US10877650B2 (en) Information terminal and non-transitory computer-readable recording medium with display control program recorded thereon
JP2016177614A (ja) 2016-10-06 Conference system, information processing device, information terminal, and program
WO2019119643A1 (zh) 2019-06-27 Interactive terminal and method for mobile live streaming, and computer-readable storage medium
US20210326010A1 (en) Methods, systems, and media for navigating user interfaces
CN112115341A (zh) 2020-12-22 Content display method, device, terminal, server, system, and storage medium
JP6617547B2 (ja) 2019-12-11 Image management system, image management method, and program
WO2023029237A1 (zh) 2023-03-09 Video preview method and terminal

Legal Events

Date Code Title Description
AS Assignment

Owner name: KYOCERA DOCUMENT SOLUTIONS INC., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAKASHITA, FUMIYA;HIRANUMA, YOICHI;SAKAGUCHI, SHOICHI;AND OTHERS;REEL/FRAME:047882/0275

Effective date: 20181226

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STCF Information on status: patent grant

Free format text: PATENTED CASE