WO2009021125A1 - Method and apparatus for selectively terminating an imaging device based on a location of an object - Google Patents


Info

Publication number
WO2009021125A1
WO2009021125A1 PCT/US2008/072498
Authority
WO
WIPO (PCT)
Prior art keywords
module
distance
imaging device
profile
detection module
Prior art date
Application number
PCT/US2008/072498
Other languages
French (fr)
Inventor
Dan Phillips
Mike Haigh
Original Assignee
Sony Computer Entertainment Europe
Sony Computer Entertainment America Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe and Sony Computer Entertainment America Inc.
Publication of WO2009021125A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the invention relates generally to terminating operation of an imaging device and, more particularly, to terminating operation of an imaging device based on a location of an object.
  • a method and apparatus detects an object; detects a measured distance between the object and an imaging device; matches the object with a profile; selects an allowable distance between the object and the imaging device; compares the allowable distance with the measured distance; and selectively reduces operation of the imaging device based on the allowable distance and the measured distance.
  • Figure 1 is a diagram illustrating an environment within which the method and apparatus for selectively terminating an imaging device based on a location of an object is implemented
  • Figure 2 is a simplified block diagram illustrating one embodiment in which the method and apparatus for selectively terminating an imaging device based on a location of an object is implemented;
  • Figure 3 is a simplified block diagram illustrating a system, consistent with one embodiment of the method and apparatus for selectively terminating an imaging device based on a location of an object;
  • Figure 4 illustrates an exemplary record consistent with one embodiment of the method and apparatus for selectively terminating an imaging device based on a location of an object
  • Figure 5 is a flow diagram consistent with one embodiment of the method and apparatus for selectively terminating an imaging device based on a location of an object.
  • references to "electronic device” include a device such as a personal digital video recorder, digital audio player, gaming console, a set top box, a personal computer, a cellular telephone, a personal digital assistant, a specialized computer such as an electronic interface with an automobile, and the like.
  • references to "user” include an operator of electronic devices.
  • references to “content” include audio streams, images, video streams, photographs, graphical displays, text files, software applications, electronic messages, and the like.
  • “content” also refers to advertisements and programming.
  • FIG. 1 is a diagram illustrating an environment within which the method and apparatus for selectively terminating an imaging device based on a location of an object is implemented.
  • the environment includes an electronic device 110, e.g. a computing platform configured to act as a client device, such as a personal digital video recorder, digital audio player, computer, a personal digital assistant, a cellular telephone, a camera device, a set top box, a gaming console; a user interface 115, a network 120, e.g. a local area network, a home network, the Internet; and a server 130, e.g. a computing platform configured to act as a server.
  • the network 120 can be implemented via wireless or wired solutions.
  • one or more user interface 115 components are made integral with the electronic device 110, e.g. keypad and video display screen input and output interfaces in the same housing as personal digital assistant electronics, e.g. as in a Clie® manufactured by Sony Corporation.
  • one or more user interface 115 components e.g. a keyboard, a pointing device such as a mouse and trackball, a microphone, a speaker, a display, a camera, are physically separate from, and are conventionally coupled to, an electronic device 110.
  • the user uses the interface 115 components to access and control content and applications stored in an electronic device 110, server 130, or a remote storage device (not shown) coupled via the network 120.
  • embodiments for selectively terminating an imaging device based on a location of an object as described below are executed by an electronic processor in an electronic device 110, in a server 130, or by processors in the electronic device 110 and in the server 130 acting together.
  • the server 130 is illustrated in Figure 1 as a single computing platform, but in other instances it comprises two or more interconnected computing platforms that act as a server.
  • Figure 2 is a simplified diagram illustrating an exemplary architecture in which the method and apparatus for selectively terminating an imaging device based on a location of an object is implemented.
  • the exemplary architecture includes a plurality of electronic devices 110, a server device 130, and a network 120 connecting the electronic devices 110 to a server 130, and connecting each electronic device 110 to each other.
  • the plurality of electronic devices 110 are each configured to include a computer-readable medium 209, such as random access memory, coupled to an electronic processor 208.
  • the processor 208 executes program instructions that are stored in the computer-readable medium 209.
  • a unique user operates each electronic device 110 via an interface 115, as described with reference to Figure 1.
  • the server device 130 includes a processor 211 that is coupled to a computer-readable medium 212.
  • the server device 130 is coupled to one or more additional external or internal devices, such as, without limitation, a secondary data storage element, such as a database 240.
  • the processors 208 and 211 are manufactured by Intel Corporation, of Santa Clara, California. In other instances, other microprocessors are used.
  • the plurality of client devices 110 and the server 130 include instructions for a customized application for selectively terminating operation of an imaging device based on a location of an object.
  • the plurality of computer-readable media 209 and 212 contain, in part, the customized application.
  • the plurality of client devices 110 and the server 130 are configured to receive and transmit electronic messages for use with the customized application.
  • the network 120 is configured to transmit electronic messages for use with the customized application.
  • One or more user applications are stored in memories 209, in memory 212, or a single user application is stored in part in one memory 209 and in part in another memory 212.
  • a stored user application, regardless of storage location, is made customizable based on selectively terminating operation of an imaging device based on a location of an object as determined using embodiments described below.
  • Figure 3 illustrates one embodiment of a system 300 for selectively terminating an imaging device based on a location of an object.
  • the system 300 includes an object detection module 310, a distance detection module 320, a storage module 330, an interface module 340, a control module 350, a profile module 360, an object identification module 370, and a termination module 380.
  • the control module 350 communicates with the object detection module 310, the distance detection module 320, the storage module 330, the interface module 340, the profile module 360, the object identification module 370, and the termination module 380.
  • the control module 350 coordinates tasks, requests, and communications between the object detection module 310, the distance detection module 320, the storage module 330, the interface module 340, the profile module 360, the object identification module 370, and the termination module 380.
  • the object detection module 310 detects both living and inanimate objects.
  • the distance detection module 320 detects the distance between the object and the system 300. In one embodiment, the distance detection module 320 uses a camera to detect the distance.
  • the storage module 330 stores a plurality of profiles, wherein each profile is associated with various content and other data associated with the content or a viewer.
  • the profile stores exemplary information as shown in the profiles illustrated in Figure 4.
  • the storage module 330 is located within the server device 130. In another embodiment, portions of the storage module 330 are located within the electronic device 110.
  • the interface module 340 detects the electronic device 110 when the electronic device 110 is connected to the network 120.
  • the interface module 340 detects input from the interface device 115, such as a keyboard, a mouse, a microphone, a still camera, a video camera, and the like.
  • the interface module 340 provides output to the interface device 115, such as a display, speakers, external storage devices, an external network, and the like.
  • the profile module 360 processes profile information related to the specific object. In one embodiment, exemplary profile information is shown within a record illustrated in Figure 4. In one embodiment, each profile corresponds with a particular object that is detected by the object detection module 310. In one embodiment, the object identification module 370 determines a match between the detected object and an object type. In one embodiment, a match between the detected object and the object type is determined by a match between the attributes associated with the detected object and the object type.
  • the information within the profile of the object type and detected object is used to determine the match.
  • the termination module 380 selectively shuts down an imaging device based on various parameters, such as the distance between the detected object and the imaging device, the object type, and the like.
  • the system 300 is configured to match the detected object with an object type. In another embodiment, the system 300 is also configured to detect the distance between the detected object and the imaging device. Further, the termination of the imaging device is initiated based on the object type of the detected object and the distance between the detected object and the imaging device.
  • the system 300 in Figure 3 is shown for exemplary purposes and is merely one embodiment of the methods and apparatuses for selectively terminating an imaging device based on a location of an object. Additional modules may be added to the system 300 without departing from the scope of the method and apparatus for selectively terminating an imaging device based on a location of an object. Similarly, modules may be combined or deleted without departing from the scope of the method and apparatus for selectively terminating an imaging device based on a location of an object.
  • Figure 4 illustrates a simplified record 400 that corresponds to a profile that describes different object types.
  • the record 400 is stored within the storage module 330 and used within the system 300.
  • the record 400 includes an object identification field 405, an object description field 410, an object image field 415, and an object classification field 420.
  • the object identification field 405 identifies a specific object associated with the record 400.
  • the object's name is used as a label for the object identification field 405.
  • the object description field 410 includes a description of the object.
  • the object description field 410 may include a detailed written description of the car indicating attributes such as its size, profile, and color. Different levels of detail may be included within the object description field 410.
  • a narrative or summary of the object may be included within the object description field 410.
  • the object image field 415 identifies a graphic that describes the object.
  • the graphic is a representation of the object.
  • the graphic is the actual image of the object.
  • the object classification field 420 identifies an object type associated with the particular object.
  • multiple distinct objects may be included within a single object type.
  • different object types include living objects and inanimate objects. Within living objects, there may be sub-types, such as humans and animals. In one embodiment, there may be any number of object types and object sub-types to classify various objects.
  • the flow diagram in Figure 5 illustrates one embodiment of the method and apparatus for selectively terminating an imaging device based on a location of an object. The blocks within the flow diagram can be performed in a different sequence without departing from the spirit of the method and apparatus for selectively terminating an imaging device based on a location of an object. Further, blocks can be deleted, added, or combined without departing from the spirit of the method and apparatus for selectively terminating an imaging device based on a location of an object.
  • the flow diagram in Figure 5 illustrates reducing operation of an imaging device based on multiple parameters according to one embodiment of the invention.
  • an object is detected.
  • the object is detected through the object detection module 310.
  • the object is detected through an imaging device.
  • a distance between the detected object and the imaging device is detected.
  • the distance detection module 320 detects the distance between the detected object and the imaging device.
  • the imaging device is used to detect this distance.
  • an object type is detected based on the object profile.
  • various parameters of the object are detected and matched with a specific object profile.
  • An exemplary object profile is illustrated as a record 400 within Figure 4.
  • a match is performed between the object, as detected within the block 505, and the object profile, as identified within the block 515.
  • an image of the detected object is compared with the image contained within the object image field 415 within the record 400.
  • the parameters of the detected object are compared with the description contained within the object description field 410 within the record 400.
  • a match between the detected object and the object profile may be determined through a match threshold that satisfies a minimum level of matching to proceed.
  • a predetermined default allowable distance is used within the block 535.
  • the predetermined default allowable distance is two inches. In other embodiments, the predetermined default allowable distance can be any predetermined distance.
  • the allowable distance is determined through the associated object profile within the block 525, as shown within the exemplary record 400.
  • the object classification field 420 of the record 400 provides an allowable distance.
  • the object classification field 420 of the record 400 provides a range of allowable distances.
  • the distance between the detected object and the imaging device is compared with either the predetermined default allowable distance from the block 535 or the allowable distance from the block 525.
  • In block 540, if the detected object is outside the predetermined default allowable distance from the block 535 or the allowable distance from the block 525, then the imaging device continues to operate. Further, detection of the object continues within the block 505. In one embodiment, the same object continues to be detected within the block 505. In another embodiment, a different object is detected within the block 505. In block 540, if the detected object is within the predetermined default allowable distance from the block 535 or the allowable distance from the block 525, then operation of the imaging device is modified within the block 545.
  • operation of the imaging device is reduced. In one embodiment, the output or emissions from the imaging device are reduced. In another embodiment, the output or emissions from the imaging device are terminated.
  • the output or emissions from the imaging device may harm living objects nearby. In other instances, the output or emissions from the imaging device may be annoying or affect the comfort of those nearby.
  • a reduction in output or emissions from the imaging device may be sufficient to ensure safety and comfort for those that are nearby.
  • a termination in output or emissions from the imaging device may be needed to ensure safety and comfort for those that are nearby.
  • the detection of the object continues within the block 505. In one embodiment, the same object continues to be detected within the block 505. In another embodiment, a different object is detected within the block 505.
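The decision in blocks 540 and 545 above can be sketched as a single function. The three output levels and the half-distance cutoff for full termination are illustrative assumptions, not the patent's implementation:

```python
def next_operation(measured_distance, allowable_distance):
    """Blocks 540/545 sketch: if the object is outside the allowable
    distance the imaging device continues to operate; if it is within
    that distance, output or emissions are reduced or, when the object
    is very close, terminated. The half-distance cutoff for full
    termination is an assumed example, not from the patent."""
    if measured_distance > allowable_distance:
        return "continue"
    if measured_distance > allowable_distance / 2:
        return "reduce"      # reduced emissions may be sufficient
    return "terminate"       # termination may be needed

print(next_operation(24.0, 12.0))  # object outside allowable distance
print(next_operation(9.0, 12.0))   # within allowable distance
print(next_operation(3.0, 12.0))   # well within allowable distance
```

In this sketch the loop back to block 505 would simply call the function again with each new measurement.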

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In one embodiment, a method and apparatus detects an object; detects a measured distance between the object and an imaging device; matches the object with a profile; selects an allowable distance between the object and the imaging device; compares the allowable distance with the measured distance; and selectively reduces operation of the imaging device based on the allowable distance and the measured distance.

Description

METHOD AND APPARATUS FOR SELECTIVELY TERMINATING
AN IMAGING DEVICE BASED ON A LOCATION OF AN OBJECT
CROSS-REFERENCE TO RELATED APPLICATIONS This patent application claims the benefit of U.S. provisional patent application serial number 60/963,868, filed August 7, 2007, the entirety of which is hereby incorporated by this reference thereto.
FIELD OF THE INVENTION The invention relates generally to terminating operation of an imaging device and, more particularly, to terminating operation of an imaging device based on a location of an object.
SUMMARY In one embodiment, a method and apparatus detects an object; detects a measured distance between the object and an imaging device; matches the object with a profile; selects an allowable distance between the object and the imaging device; compares the allowable distance with the measured distance; and selectively reduces operation of the imaging device based on the allowable distance and the measured distance. BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a diagram illustrating an environment within which the method and apparatus for selectively terminating an imaging device based on a location of an object is implemented; Figure 2 is a simplified block diagram illustrating one embodiment in which the method and apparatus for selectively terminating an imaging device based on a location of an object is implemented;
Figure 3 is a simplified block diagram illustrating a system, consistent with one embodiment of the method and apparatus for selectively terminating an imaging device based on a location of an object;
Figure 4 illustrates an exemplary record consistent with one embodiment of the method and apparatus for selectively terminating an imaging device based on a location of an object; and
Figure 5 is a flow diagram consistent with one embodiment of the method and apparatus for selectively terminating an imaging device based on a location of an object.
DETAILED DESCRIPTION
The following detailed description of the method and apparatus for selectively terminating an imaging device based on a location of an object refers to the accompanying drawings. The detailed description is not intended to limit the methods and apparatuses for selectively terminating an imaging device based on a location of an object.
Instead, the scope of the method and apparatus for selectively terminating an imaging device based on a location of an object is defined by the appended claims and equivalents. Those skilled in the art will recognize that many other implementations are possible, consistent with the methods and apparatuses for selectively terminating an imaging device based on a location of an object.
References to "electronic device" include a device such as a personal digital video recorder, digital audio player, gaming console, a set top box, a personal computer, a cellular telephone, a personal digital assistant, a specialized computer such as an electronic interface with an automobile, and the like.
References to "user" include an operator of electronic devices.
References to "content" include audio streams, images, video streams, photographs, graphical displays, text files, software applications, electronic messages, and the like. In another embodiment, "content" also refers to advertisements and programming.
Figure 1 is a diagram illustrating an environment within which the method and apparatus for selectively terminating an imaging device based on a location of an object is implemented. The environment includes an electronic device 110, e.g. a computing platform configured to act as a client device, such as a personal digital video recorder, digital audio player, computer, a personal digital assistant, a cellular telephone, a camera device, a set top box, a gaming console; a user interface 115, a network 120, e.g. a local area network, a home network, the Internet; and a server 130, e.g. a computing platform configured to act as a server. In one embodiment, the network 120 can be implemented via wireless or wired solutions.
In one embodiment, one or more user interface 115 components are made integral with the electronic device 110, e.g. keypad and video display screen input and output interfaces in the same housing as personal digital assistant electronics, e.g. as in a Clie® manufactured by Sony Corporation. In other embodiments, one or more user interface 115 components, e.g. a keyboard, a pointing device such as a mouse and trackball, a microphone, a speaker, a display, a camera, are physically separate from, and are conventionally coupled to, an electronic device 110. The user uses the interface 115 components to access and control content and applications stored in an electronic device 110, server 130, or a remote storage device (not shown) coupled via the network 120.
In accordance with the invention, embodiments for selectively terminating an imaging device based on a location of an object as described below are executed by an electronic processor in an electronic device 110, in a server 130, or by processors in the electronic device 110 and in the server 130 acting together. The server 130 is illustrated in Figure 1 as a single computing platform, but in other instances it comprises two or more interconnected computing platforms that act as a server. Figure 2 is a simplified diagram illustrating an exemplary architecture in which the method and apparatus for selectively terminating an imaging device based on a location of an object is implemented. The exemplary architecture includes a plurality of electronic devices 110, a server device 130, and a network 120 connecting the electronic devices 110 to a server 130, and connecting each electronic device 110 to each other. The plurality of electronic devices 110 are each configured to include a computer-readable medium 209, such as random access memory, coupled to an electronic processor 208. The processor 208 executes program instructions that are stored in the computer-readable medium 209. A unique user operates each electronic device 110 via an interface 115, as described with reference to Figure 1.
The server device 130 includes a processor 211 that is coupled to a computer-readable medium 212. In one embodiment, the server device 130 is coupled to one or more additional external or internal devices, such as, without limitation, a secondary data storage element, such as a database 240. In one instance, the processors 208 and 211 are manufactured by Intel Corporation, of Santa Clara, California. In other instances, other microprocessors are used.
The plurality of client devices 110 and the server 130 include instructions for a customized application for selectively terminating operation of an imaging device based on a location of an object. In one embodiment, the plurality of computer-readable media 209 and 212 contain, in part, the customized application. Additionally, the plurality of client devices 110 and the server 130 are configured to receive and transmit electronic messages for use with the customized application. Similarly, the network 120 is configured to transmit electronic messages for use with the customized application.
One or more user applications are stored in memories 209, in memory 212, or a single user application is stored in part in one memory 209 and in part in another memory 212. In one instance, a stored user application, regardless of storage location, is made customizable based on selectively terminating operation of an imaging device based on a location of an object as determined using embodiments described below.
Figure 3 illustrates one embodiment of a system 300 for selectively terminating an imaging device based on a location of an object. The system 300 includes an object detection module 310, a distance detection module 320, a storage module 330, an interface module 340, a control module 350, a profile module 360, an object identification module 370, and a termination module 380. In one embodiment, the control module 350 communicates with the object detection module 310, the distance detection module 320, the storage module 330, the interface module 340, the profile module 360, the object identification module 370, and the termination module 380. In one embodiment, the control module 350 coordinates tasks, requests, and communications between the object detection module 310, the distance detection module 320, the storage module 330, the interface module 340, the profile module 360, the object identification module 370, and the termination module 380.
In one embodiment, the object detection module 310 detects both living and inanimate objects.
In one embodiment, the distance detection module 320 detects the distance between the object and the system 300. In one embodiment, the distance detection module 320 uses a camera to detect the distance.
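The patent does not spell out how a camera yields a distance. One common approach, shown here only as an assumed sketch, is the pinhole-camera relation in which distance is proportional to an object's known real size divided by its apparent size in pixels:

```python
def estimate_distance(focal_length_px, real_width, pixel_width):
    """Pinhole-camera estimate (an assumption, not the patent's method):
    distance = f * W / w, where f is the focal length in pixels, W the
    object's known real width, and w its apparent width in pixels. The
    result is in the same units as real_width."""
    if pixel_width <= 0:
        raise ValueError("apparent width must be positive")
    return focal_length_px * real_width / pixel_width

# A face about 6 inches wide imaged at 300 px by a camera whose focal
# length is 500 px appears to be 500 * 6 / 300 = 10 inches away.
print(estimate_distance(500.0, 6.0, 300.0))  # prints 10.0
```

This kind of estimate requires a size assumption per object type, which is one reason the distance check pairs naturally with the profile lookup described below.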
In one embodiment, the storage module 330 stores a plurality of profiles, wherein each profile is associated with various content and other data associated with the content or a viewer. In one embodiment, the profile stores exemplary information as shown in the profiles illustrated in Figure 4. In one embodiment, the storage module 330 is located within the server device 130. In another embodiment, portions of the storage module 330 are located within the electronic device 110.
In one embodiment, the interface module 340 detects the electronic device 110 when the electronic device 110 is connected to the network 120.
In another embodiment, the interface module 340 detects input from the interface device 115, such as a keyboard, a mouse, a microphone, a still camera, a video camera, and the like.
In yet another embodiment, the interface module 340 provides output to the interface device 115, such as a display, speakers, external storage devices, an external network, and the like. In one embodiment, the profile module 360 processes profile information related to the specific object. In one embodiment, exemplary profile information is shown within a record illustrated in Figure 4. In one embodiment, each profile corresponds with a particular object that is detected by the object detection module 310. In one embodiment, the object identification module 370 determines a match between the detected object and an object type. In one embodiment, a match between the detected object and the object type is determined by a match between the attributes associated with the detected object and the object type.
In one embodiment, the information within the profile of the object type and detected object is used to determine the match. In one embodiment, the termination module 380 selectively shuts down an imaging device based on various parameters, such as the distance between the detected object and the imaging device, the object type, and the like.
In one embodiment, the system 300 is configured to match the detected object with an object type. In another embodiment, the system 300 is also configured to detect the distance between the detected object and the imaging device. Further, the termination of the imaging device is initiated based on the object type of the detected object and the distance between the detected object and the imaging device. The system 300 in Figure 3 is shown for exemplary purposes and is merely one embodiment of the methods and apparatuses for selectively terminating an imaging device based on a location of an object. Additional modules may be added to the system 300 without departing from the scope of the method and apparatus for selectively terminating an imaging device based on a location of an object. Similarly, modules may be combined or deleted without departing from the scope of the method and apparatus for selectively terminating an imaging device based on a location of an object.
Figure 4 illustrates a simplified record 400 that corresponds to a profile that describes different object types. In one embodiment, the record 400 is stored within the storage module 330 and used within the system 300. In one embodiment, the record 400 includes an object identification field 405, an object description field 410, an object image field 415, and an object classification field 420.
In one embodiment, the object identification field 405 identifies a specific object associated with the record 400. In one example, the object's name is used as a label for the object identification field 405.
In one embodiment, the object description field 410 includes a description of the object. In one example, if the object is an inanimate object, such as a car, then the object description field 410 may include a detailed written description of the car indicating attributes such as its size, profile, and color. Different levels of detail may be included within the object description field 410. In one embodiment, a narrative or summary of the object may be included within the object description field 410.
In one embodiment, the object image field 415 identifies a graphic that describes the object. In one embodiment, the graphic is a representation of the object. In another embodiment, the graphic is the actual image of the object.
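The record 400 can be modeled as a plain data structure. The field names below mirror fields 405 through 420 of Figure 4; the concrete types and example values are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class ObjectRecord:
    """Sketch of the record 400 in Figure 4. Field names mirror the
    object identification (405), description (410), image (415), and
    classification (420) fields; types are illustrative assumptions."""
    object_id: str        # field 405: e.g. the object's name
    description: str      # field 410: written attributes of the object
    image: bytes          # field 415: a representation or actual image
    classification: str   # field 420: object type, e.g. living/inanimate

record = ObjectRecord(
    object_id="family car",
    description="mid-size sedan, silver, low profile",
    image=b"",            # placeholder; real image data omitted
    classification="inanimate",
)
print(record.classification)  # prints inanimate
```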
In one embodiment, the object classification field 420 identifies an object type associated with the particular object. In one embodiment, multiple distinct objects may be included within a single object type. For example, different object types include living objects and inanimate objects. Within living objects, there may be sub-types, such as humans and animals. In one embodiment, there may be any number of object types and object sub-types to classify various objects. The flow diagram in Figure 5 illustrates one embodiment of the method and apparatus for selectively terminating an imaging device based on a location of an object. The blocks within the flow diagram can be performed in a different sequence without departing from the spirit of the method and apparatus for selectively terminating an imaging device based on a location of an object. Further, blocks can be deleted, added, or combined without departing from the spirit of the method and apparatus for selectively terminating an imaging device based on a location of an object.
The flow diagram in Figure 5 illustrates reducing operation of an imaging device based on multiple parameters according to one embodiment of the invention.
In block 505, an object is detected. In one embodiment, the object is detected through the object detection module 310. In another embodiment, the object is detected through an imaging device. In block 510, a distance between the detected object and the imaging device is detected. In one embodiment, the distance detection module 320 detects the distance between the detected object and the imaging device. In one embodiment, the imaging device is used to detect this distance.
In block 515, an object profile is identified based on the object type. In one embodiment, various parameters of the object are detected and matched with a specific object profile. An exemplary object profile is illustrated as a record 400 within Figure 4.
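The record 400 of Figure 4 can be pictured as a small profile record with the four fields described above. The sketch below is illustrative only: the field names mirror the reference numerals 405-420, and the example profiles and their values are hypothetical, not values specified by the patent.

```python
from dataclasses import dataclass

@dataclass
class ObjectProfile:
    """One record 400: a profile describing an object type (Figure 4)."""
    object_identification: str   # field 405: e.g. the object's name
    object_description: str      # field 410: written description of the object's attributes
    object_image: str            # field 415: handle to a graphic representing the object
    object_classification: str   # field 420: object type, e.g. "human" or "car"

# A toy profile store keyed by classification (a hypothetical layout).
PROFILES = {
    "human": ObjectProfile("person", "adult human, upright", "person.png", "human"),
    "car":   ObjectProfile("sedan", "four-door sedan, red", "sedan.png", "car"),
}
```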
In block 520, a match is performed between the object, as detected within the block 505, and the object profile, as identified within the block 515. In one embodiment, an image of the detected object is compared with the image contained within the object image field 415 within the record 400. In another embodiment, the parameters of the detected object are compared with the description contained within the object description field 410 within the record 400.
In one embodiment, a match between the detected object and the object profile may be determined through a match threshold that satisfies a minimum level of matching to proceed.
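One way to realize such a match threshold (the patent does not prescribe a particular test) is to score how many detected attributes agree with the attributes in the profile's description and require a minimum fraction. The attribute names and the 0.6 threshold below are assumptions for illustration.

```python
def matches_profile(detected_attrs, profile_attrs, threshold=0.6):
    """Return True if enough detected attributes agree with the profile.

    detected_attrs / profile_attrs: dicts of attribute name -> value,
    e.g. {"color": "red", "size": "large"}.  threshold is the minimum
    fraction of profile attributes that must match (a hypothetical value).
    """
    if not profile_attrs:
        return False
    hits = sum(1 for k, v in profile_attrs.items() if detected_attrs.get(k) == v)
    return hits / len(profile_attrs) >= threshold
```

With this sketch, two of three agreeing attributes (a fraction of about 0.67) would satisfy the 0.6 threshold and the method would proceed to block 525.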
If there is no match in the block 520, then a predetermined default allowable distance is used within the block 535. In one embodiment, the predetermined default allowable distance is two inches. In other embodiments, the predetermined default allowable distance can be any predetermined distance.
If there is a match in the block 520, then the allowable distance is determined through the associated object profile within the block 525, as shown within the exemplary record 400. In one embodiment, the object classification field 420 of the record 400 provides an allowable distance. In another embodiment, the object classification field 420 of the record 400 provides a range of allowable distances. In block 530, the distance between the detected object and the imaging device is compared with either the predetermined default allowable distance from the block 535 or the allowable distance from the block 525.
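In code, the branch between blocks 525 and 535 amounts to choosing a distance limit. The two-inch default matches the embodiment described above; the per-type limits, and the choice to use the upper bound of a range, are illustrative assumptions.

```python
DEFAULT_ALLOWABLE_DISTANCE_IN = 2.0  # block 535: predetermined default (one embodiment)

# Hypothetical per-classification limits (block 525).  A profile may supply a
# single allowable distance or a range of allowable distances (field 420).
ALLOWABLE_BY_TYPE = {
    "human": 12.0,        # single allowable distance, in inches
    "car": (6.0, 24.0),   # range of allowable distances
}

def allowable_distance(classification):
    """Return the distance limit for a matched profile, else the default."""
    limit = ALLOWABLE_BY_TYPE.get(classification, DEFAULT_ALLOWABLE_DISTANCE_IN)
    # For a range, treat the upper bound as the effective limit (an assumption).
    return limit[1] if isinstance(limit, tuple) else limit
```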
In block 540, if the detected object is outside the predetermined default allowable distance from the block 535 or the allowable distance from the block 525, then the imaging device continues to operate. Further, detection of the object continues within the block 505. In one embodiment, the same object continues to be detected within the block 505. In another embodiment, a different object is detected within the block 505. In block 540, if the detected object is within the predetermined default allowable distance from the block 535 or the allowable distance from the block 525, then operation of the imaging device is modified within the block 545.
In block 545, operation of the imaging device is reduced. In one embodiment, the output or emissions from the imaging device are reduced. In another embodiment, the output or emissions from the imaging device are terminated.
In use, by reducing or terminating the output or emissions from the imaging device, objects that are nearby are no longer subject to the full output from the imaging device. In some instances, the output or emissions from the imaging device may harm living objects nearby. In other instances, the output or emissions from the imaging device may be annoying or affect the comfort of those nearby.
In some instances, a reduction in output or emissions from the imaging device may be sufficient to ensure safety and comfort for those that are nearby. In other instances, a termination in output or emissions from the imaging device may be needed to ensure safety and comfort for those that are nearby.
After the reduction or termination in operation of the imaging device in block 545, the detection of the object continues within the block 505. In one embodiment, the same object continues to be detected within the block 505. In another embodiment, a different object is detected within the block 505.
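Blocks 530 through 545 taken together form the decision step of a monitoring loop. The sketch below ties them into one pass; the choice of when to merely reduce output versus terminate it is embodiment-specific, so the halving rule used here is a hypothetical policy, not one fixed by the patent.

```python
DEFAULT_ALLOWABLE_IN = 2.0  # block 535: predetermined default (one embodiment)

def control_step(measured_distance, matched_limit=None):
    """One pass through blocks 530-545.

    measured_distance: distance between the object and the imaging device (block 510).
    matched_limit: allowable distance from a matched profile (block 525),
                   or None when the match in block 520 failed (block 535).
    Returns the action taken on the imaging device.
    """
    limit = matched_limit if matched_limit is not None else DEFAULT_ALLOWABLE_IN
    if measured_distance > limit:   # block 540: object is outside the limit
        return "continue"           # keep operating; detection resumes at block 505
    # Object is within the allowable distance: modify operation (block 545).
    # Hypothetical policy: a very close object triggers full termination,
    # otherwise output is only reduced.
    return "terminate" if measured_distance <= limit / 2 else "reduce"
```

After a "reduce" or "terminate" result, detection would continue at block 505, exactly as the paragraph above describes.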
The foregoing descriptions of specific embodiments of the invention have been presented for purposes of illustration and description. For example, the invention is described within the context of dynamically detecting and generating image information as merely one embodiment of the invention. The invention may be applied to a variety of other applications.
They are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed, and naturally many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best use the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the Claims appended hereto and their equivalents.

Claims
1. A computer-implemented method for controlling operation of an imaging device comprising the steps of: detecting an object; detecting a measured distance between the object and an imaging device; matching the object with a profile; selecting an allowable distance between the object and the imaging device; comparing the allowable distance with the measured distance; and selectively controlling operation of the imaging device based on said comparing the allowable distance and the measured distance.
2. The method of Claim 1, said controlling step selectively terminating operation of the imaging device based on a location of said object.
3. The method of Claim 1, further comprising the steps of: providing an object detection module, a distance detection module, a storage module, an interface module, a profile module, an object identification module, a termination module, and a control module for communicating with said object detection module, said distance detection module, said storage module, said interface module, said profile module, said object identification module, and said termination module, wherein said control module coordinates tasks, requests, and communications between said object detection module, said distance detection module, said storage module, said interface module, said profile module, said object identification module, and said termination module.
4. The method of Claim 3, said object detection module detecting both living and inanimate objects.
5. The method of Claim 3, said distance detection module detecting the distance between the object and a system.
6. The method of Claim 3, said distance detection module using a camera to detect distance.
7. The method of Claim 3, said storage module storing a plurality of profiles, wherein each profile is associated with various content and other data associated with the content or a viewer.
8. The method of Claim 3, said interface module detecting an electronic device when the electronic device is connected to a network.
9. The method of Claim 3, said interface module detecting input from an interface device comprising any of a keyboard, a mouse, a microphone, a still camera, and a video camera.
10. The method of Claim 3, said interface module providing output to an interface device comprising any of a display, speakers, external storage devices, and an external network.
11. The method of Claim 3, said profile module processing profile information related to a specific object, wherein each profile corresponds with a particular object that is detected by said object detection module.
12. The method of Claim 3, said object identification module determining a match between a detected object and an object type, wherein said match between said detected object and said object type is determined by any of a match between attributes associated with said detected object and said object type, and information within a profile of an object type and said detected object.
13. The method of Claim 3, said termination module selectively shutting down an imaging device based on at least one parameter comprising any of a distance between the detected object and said imaging device, and the object type.
14. The method of Claim 3, wherein termination of said imaging device is initiated based on object type of said detected object and distance between said detected object and said imaging device.
15. The method of Claim 1, further comprising the step of: providing a record that corresponds to a profile that describes different object types.
16. The method of Claim 15, wherein said record comprises an object identification field, an object description field, an object image field, and an object classification field.
17. The method of Claim 16, wherein said object identification field identifies a specific object associated with said record.
18. The method of Claim 17, wherein said object's name is used as a label for said object identification field.
19. The method of Claim 17, wherein said object description field comprises a description of said object.
20. The method of Claim 17, wherein said object image field identifies a graphic that describes content, and/or is a representation of said object, and/or is an actual image of said object.
21. The method of Claim 17, wherein said object classification field identifies an object type associated with a particular object.
22. The method of Claim 17, wherein multiple distinct objects may be included within a single object type.
23. The method of Claim 17, wherein said object classification field of said record provides either of an allowable distance and a range of allowable distances.
24. The method of Claim 1, wherein said controlling step reduces or terminates output or emissions from said imaging device.
25. An apparatus for controlling operation of an imaging device comprising: a detection module for detecting an object; a distance detection module for detecting a measured distance between the object and an imaging device; an object identification module for matching the object with a profile; a profile module for selecting an allowable distance between the object and the imaging device from said profile; a control module for comparing the allowable distance with the measured distance; and a termination module for selectively controlling operation of the imaging device based on said comparing the allowable distance and the measured distance.
PCT/US2008/072498 2007-08-07 2008-08-07 Method and apparatus for selectively terminating an imaging device based on a location of an object WO2009021125A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US96386807P 2007-08-07 2007-08-07
US60/963,868 2007-08-07
US18794108A 2008-08-07 2008-08-07
US12/187,941 2008-08-07

Publications (1)

Publication Number Publication Date
WO2009021125A1 true WO2009021125A1 (en) 2009-02-12

Family

ID=40341742

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/072498 WO2009021125A1 (en) 2007-08-07 2008-08-07 Method and apparatus for selectively terminating an imaging device based on a location of an object

Country Status (1)

Country Link
WO (1) WO2009021125A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9929357B2 (en) 2014-07-22 2018-03-27 Universal Display Corporation Organic electroluminescent materials and devices

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010040637A1 (en) * 1996-03-28 2001-11-15 Iwao Ishida Detecting device reducing detection errors caused by signals generated other than by direct light
US20030027528A1 (en) * 2001-08-06 2003-02-06 Toshiba Tec Kabushiki Kaisha. Image information input/output device and control system for the same using mobile device
US20050282530A1 (en) * 2002-06-11 2005-12-22 Adam Raff Communications device and method comprising user profiles matching between compatible devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 08797390; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase; Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 08797390; Country of ref document: EP; Kind code of ref document: A1