WO2007075000A1 - Imaging device and method for transferring image signal - Google Patents


Info

Publication number
WO2007075000A1
Authority
WO
WIPO (PCT)
Application number
PCT/KR2006/005612
Other languages
French (fr)
Inventor
Jong-Sik Jeong
Original Assignee
Mtekvision Co., Ltd.
Application filed by Mtekvision Co., Ltd. filed Critical Mtekvision Co., Ltd.
Publication of WO2007075000A1 publication Critical patent/WO2007075000A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77 Interface circuits between a recording apparatus and a television camera
    • H04N5/772 Interface circuits where the recording apparatus and the television camera are placed in the same enclosure
    • H04N5/907 Television signal recording using static stores, e.g. storage tubes or semiconductor memories
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/142 Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
    • H04N2007/145 Handheld terminals
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/7921 Processing of colour television signals for more than one processing mode
    • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804 Transformation involving pulse code modulation of the colour picture signal components
    • H04N9/8042 Transformation involving pulse code modulation of the colour picture signal components with data reduction
    • H04N9/8047 Transformation involving pulse code modulation with data reduction using transform coding


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

An imaging device and an image signal generating method thereof are disclosed. A camera module selectively outputs YUV data or encoded image data, generated from the raw data outputted by an image sensor, in accordance with a control signal of a main processor. With the present invention, the processing of high quality image data can be performed in real time.

Description

[DESCRIPTION]
[Invention Title]
IMAGING DEVICE AND METHOD FOR TRANSFERRING IMAGE
SIGNAL
[Technical Field]
The present invention is directed to an imaging device, more specifically to an
imaging device for selectively transferring an image signal (e.g. YUV data or encoded
data) and an image signal generating method thereof.
[Background Art]
As an example of digital processing devices, portable terminals are compact
electronic devices designed to be easily carried while performing functions such as
gaming and mobile communication. Portable terminals include
mobile communication terminals, personal digital assistants (PDA), portable
multimedia players (PMP) and car navigation devices. These portable terminals
function as an imaging device by mounting an element for imaging (e.g. an image
sensor) thereon.
The mobile communication terminal is essentially a device designed to enable
a mobile user to telecommunicate with a remotely located receiver. With recent technological advances, however, the latest mobile communication terminals offer
functions such as camera operation and multimedia data playback in addition to basic
functions such as voice communication, short message service and an address book.
FIG. 1 shows a block diagram of a conventional mobile communication
terminal having a camera function.
Referring to FIG. 1, the mobile communication terminal 100 having a camera
function comprises a high frequency processing unit 110, an analog-to-digital converter
115, a digital-to-analog converter 120, a processing unit 125, a power supply 130, a key
input 135, a main memory 140, a display 145, a camera 150, an image processing unit
155 and a support memory 160.
The high frequency processing unit 110 processes a high frequency signal,
which is transmitted or received through an antenna.
The analog-to-digital converter 115 converts an analog signal, outputted from
the high frequency processing unit 110, to a digital signal and sends it to the processing
unit 125.
The digital-to-analog converter 120 converts a digital signal, outputted from
the processing unit 125, to an analog signal and sends it to the high frequency processing
unit 110.
The processing unit 125 controls the general operation of the mobile
communication terminal 100. The processing unit 125 can comprise a central processing unit (CPU) or a micro-controller.
The power supply 130 supplies electric power required for operating the
mobile communication terminal 100. The power supply 130 can be coupled to, for
example, an external power source or a battery.
The key input 135 generates key data for, for example, setting various
functions or dialing of the mobile communication terminal 100 and sends the key data
to the processing unit 125.
The main memory 140 stores an operating system and a variety of data of the
mobile communication terminal 100. The main memory 140 can be, for example, a
flash memory or an EEPROM (Electrically Erasable Programmable Read Only
Memory).
The display 145 displays the operation status of the mobile communication
terminal 100, related information (e.g. date and time) and an external image
photographed by the camera 150.
The camera 150 photographs an external image (a photographic subject), and
the image processing unit 155 processes the external image photographed by the camera
150. The image processing unit 155 can perform functions such as color interpolation,
gamma correction, image quality correction and JPEG encoding. The camera 150 and
the image processing unit 155 can include an image sensor, an image signal processor
(ISP) and a camera processor.

The support memory 160 stores the external image processed by the image
processing unit 155. The support memory 160 can be an SRAM (Static RAM) or an
SDRAM (Synchronous DRAM).
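Among the functions listed for the image processing unit 155, gamma correction is simple enough to sketch. The following stand-alone example is illustrative only; the function name and the γ = 2.2 value are assumptions, not taken from the patent:

```python
def gamma_correct(pixels, gamma=2.2, max_val=255):
    """Apply power-law gamma correction to 8-bit pixel values.

    Each value is normalized to [0, 1], raised to 1/gamma, and
    rescaled back -- the standard correction step an ISP applies
    after color interpolation.
    """
    inv = 1.0 / gamma
    return [round(((p / max_val) ** inv) * max_val) for p in pixels]

# Mid-tones are brightened; pure black and pure white are unchanged.
corrected = gamma_correct([0, 64, 128, 255])
```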
As described above, the mobile communication terminal 100 having a camera
function is equipped with a plurality of processing units (that is, a main processor and
one or more application processors for performing additional functions). In other words,
as shown in FIG. 1, the processing unit 125 for controlling general functions of the
mobile communication terminal 100 and the image processing unit 155 for controlling
the camera function are included. Each processing unit is structured to be coupled with
an independent memory. For example, the main processor can be a baseband chip.
The application processor can take different forms and quantities depending on
the kinds of additional functions, with which the portable terminal is equipped. For
example, the application processor for controlling the camera function can process
functions such as JPEG encoding and JPEG decoding; the application processor for
controlling the movie file playback function can process functions such as video file
(e.g., MPEG4, DIVX, H.264) encoding and decoding; and the application processor for
controlling the music file playback function can process functions such as audio file
encoding and decoding. The portable terminal can also comprise an application
processor for controlling games. Each of these processors has an individual memory for
storing the processed data.

FIG. 2 illustrates the coupling structure between a processor and a memory in accordance
with the prior art.
As illustrated in FIG. 2, a main processor 210 basically has two buses.
Typically, a bus is a shared electrical path used for transmitting
information between the processor, the main memory and input/output devices in a digital
processing apparatus. The bus includes lines carrying address information, which
identifies each device or memory location, and other lines for distinguishing
the various data transmission operations.
One bus is an MP (main processor)-AP (application processor) bus forming a
host interface to be coupled to the application processor 215. The other bus is an
MP-MM (main memory) bus to be coupled to M-NV memory 220 and M-VO memory
225. The M-NV memory 220 is a nonvolatile memory, and the M-VO memory 225 is a
volatile memory. The MP-MM bus can be classified into a first bus, which is coupled to
the M-NV memory 220, and a second bus, which is coupled to the M-VO memory 225.
The M-NV memory 220 and the M-VO memory 225 are embodied as one chip by
multi-chip package technology.
The application processor 215 is coupled to the main processor 210 through the
MP-AP bus, and to an A-VO memory 245 through an AP-AM (application
memory) bus. The A-VO memory 245 is a second volatile memory. Also, the application
processor 215 is coupled to the display 145 and the image sensor 240 through additional buses.
As illustrated in FIG. 2, in accordance with a conventional coupling structure,
each of the main processor 210 and the application processor 215 is equipped with a
dedicated memory. Accordingly, in case that the main processor 210 displays data (e.g.
MPEG file) stored in the M-VO memory 225 through the display 145, the main
processor 210 must read respective data and transfer the read data to the application
processor 215 through the MP-AP bus. The application processor 215 processes (e.g.
decodes) the data transferred from the main processor 210 and displays the processed
data through the display 145. In this case, if large data is processed or displayed, the
application processor 215 can store the respective data in the A-VO memory 245 and read the
stored data at a desired time and process or transfer it to the display 145.
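The round trip just described (read from the M-VO memory, transfer over the MP-AP bus, buffer in the A-VO memory, read back for display) can be modelled as a toy transaction counter. The class and its cost model are illustrative assumptions, not the patent's implementation:

```python
class BusCounter:
    """Counts transactions on each bus of the FIG. 2 topology."""

    def __init__(self):
        self.ops = {"MP-MM": 0, "MP-AP": 0, "AP-AM": 0}

    def transfer_for_display(self, data):
        """Model one frame moving from main-processor memory to the display."""
        self.ops["MP-MM"] += 1  # main processor reads the M-VO memory
        self.ops["MP-AP"] += 1  # data crosses the host interface
        self.ops["AP-AM"] += 1  # application processor buffers it in A-VO
        self.ops["AP-AM"] += 1  # ...and reads it back when displaying
        return data

bus = BusCounter()
bus.transfer_for_display(b"mpeg-frame")
# In this model every displayed frame costs four bus transactions,
# two of them on the contended AP-AM bus.
```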
As described above, in the conventional coupling structure, the larger the data
communicated between the main processor 210 and the application processor 215 is,
the less efficient the processing of the main processor 210 and the application processor
215 becomes. This is because the main processor 210 must read and transfer the large
data, and the application processor 215 must write the transferred data in the A-VO
memory 245. Also, when processing respective data, an internal element of the
application processor 215 uses the AP-AM bus to access the A-VO memory 245.
The above problem becomes pronounced when the application processor 215
processes a high quality image signal. This is because the time spent occupying a system bus increases as the size of the data inputted from the image sensor 240 grows
sharply with the development of image sensors having ever larger numbers of
pixels.
Also, this problem can cause a bottleneck in which each element in the
application processor 215 accesses the A-VO memory 245 to process data received
from the main processor 210 or image data inputted from the image sensor 240.
[Disclosure]
[Technical Problem]
In order to solve aforementioned problems, the present invention provides an
imaging device and an image signal generating method thereof that can minimize the
time delay in the high quality image processing and maximize the processing efficiency
of an application processor.
The present invention also provides an imaging device and an image signal
generating method thereof that can successively write image data inputted from an
image sensor.
The present invention also provides an imaging device and an image signal
generating method thereof that can allow a main processor to quickly transfer data to an
application processor by using a shared memory.
The present invention also provides an imaging device and an image signal generating method thereof that can maximize the efficiency of storing and processing
data by allowing each of plural elements, requested to store or process the data, to use
its dedicated storage area through its dedicated path.
In addition, the present invention provides an imaging device and an image
signal generating method thereof that can prevent data loss by removing the time delay
in storing image data inputted from an image sensor.
Other objects of the present invention will become apparent through the
preferred embodiments described below.
[Technical Solution]
To achieve the above objects, an aspect of the present invention features an
image signal processor generating an image signal and a digital processing device
having the image signal processor.
According to an embodiment of the present invention, the digital processing
device includes a main processor; a memory, where a storage area is partitioned into n
partitioned blocks, n being a natural number; a camera module, generating an image
signal corresponding to a control signal from the main processor by using the raw data
outputted from an image sensor and outputting the generated image signal, the camera
module comprising a YUV data generating unit and an encoding unit, and the image
signal being YUV data generated by use of the raw data or encoded image data generated by use of the YUV data; and an application processor, being coupled to each
of the main processor and the camera module, and generating and storing in the memory
pertinent encoded image data if the image signal is the YUV data, by a control signal
from the main processor, and storing in the memory the inputted encoded image data if
the image signal is the encoded image data.
The application processor can have a multimedia data input unit, storing the
image signal inputted from the camera module in a first partitioned block, and a
multimedia data processing unit, reading the image signal stored in the first partitioned
block, and generating encoded image data and storing the generated encoded image data
in a second partitioned block if the image signal is the YUV data.
The memory and the application processor can be coupled to each other
through a plurality of memory buses. Also, the application processor and the memory
can be realized in the same chip.
The control signal, which determines the type of the image signal, can be
determined depending on the resolution of the raw data.
According to another embodiment of the present invention, the image signal
processor (ISP) that processes and outputs raw data inputted from an image sensor
includes a first input unit, being inputted with the raw data; a second input unit, receiving a
control signal which instructs to generate an image signal of a particular type from a
main processor; a YUV data generating unit, generating YUV data by using the raw data and outputting the YUV data; an encoding unit, generating encoded image data
according to a predestinated encoding method by using the YUV data, and outputting
the encoded image data; and a controller, activating the YUV data generating unit according to the
control signal, or activating both the YUV data generating unit and the encoding unit.
The control signal, which determines the type of the image signal, can be
determined depending on the resolution of the raw data.
In order to achieve the above objects, another aspect of the present invention
features a method of generating an image signal (e.g. YUV data or encoded data) and/or
a recorded medium recording a program executing the method thereof.
According to an embodiment of the present invention, the image signal
generating method of a digital processing device includes an image sensor generating
and outputting raw data; a camera module generating an image signal corresponding to
a control signal received from a main processor by using the raw data, and outputting
the generated image signal, the image signal being YUV data generated by use of the
raw data or encoded image data generated by use of the YUV data; and an application
processor determining the type of the image signal by using a control signal received
from the main processor, and generating pertinent encoded image data if the
image signal is the YUV data and storing the generated encoded image data in the
memory, and storing the image signal in the memory if the image signal is the encoded
image data.
The control signal, which determines the type of the image signal, can be determined depending on the resolution of the raw data.
The memory and the application processor can be coupled to each other
through a plurality of memory buses.
[Description of Drawings]
FIG. 1 is a block diagram illustrating a conventional mobile communication
terminal having a camera function;
FIG. 2 illustrates a coupling structure between a processor and a memory in
accordance with the prior art;
FIG. 3 illustrates a linking structure between each processor in accordance with
an embodiment of the present invention;
FIG. 4 is a flow chart illustrating a method of outputting various image signals
by a camera module in accordance with an embodiment of the present invention; and
FIG. 5 is a flow chart illustrating a method of processing an inputted image
signal by an application processor in accordance with an embodiment of the present
invention.
[Mode for Invention]
The above objects, features and advantages will become more apparent through the below description with reference to the accompanying drawings.
Since there can be a variety of permutations and embodiments of the present
invention, certain embodiments will be illustrated and described with reference to the
accompanying drawings. This, however, is by no means intended to restrict the present invention
to certain embodiments, and the present invention shall be construed as including all permutations,
equivalents and substitutes covered by its spirit and scope.
Throughout the description of the present invention, when it is determined that a
detailed description of a related known technology would obscure the gist of the present invention, the pertinent
detailed description will be omitted.
Terms such as "first" and "second" can be used in describing various elements,
but the above elements shall not be restricted to the above terms. The above terms are
used only to distinguish one element from the other. For instance, the first element can
be named the second element, and vice versa, without departing from the scope of the claims of
the present invention. The term "and/or" shall include the combination of a plurality of
listed items or any of the plurality of listed items.
When one element is described as being "connected" or "accessed" to another
element, it shall be construed that the one element may be directly connected or accessed
to the other element, but another element may also be present in between. On the other hand, if
one element is described as being "directly connected" or "directly accessed" to another
element, it shall be construed that there is no other element in between. The terms used in the description are intended to describe certain embodiments
only, and shall by no means restrict the present invention. Unless clearly used otherwise,
expressions in the singular number include a plural meaning. In the present description,
an expression such as "comprising" or "consisting of" is intended to designate a
characteristic, a number, a step, an operation, an element, a part or combinations thereof,
and shall not be construed to preclude any presence or possibility of one or more other
characteristics, numbers, steps, operations, elements, parts or combinations thereof.
Unless otherwise defined, all terms, including technical terms and scientific
terms, used herein have the same meaning as how they are generally understood by
those of ordinary skill in the art to which the invention pertains. Any term that is
defined in a general dictionary shall be construed to have the same meaning in the
context of the relevant art, and, unless otherwise defined explicitly, shall not be
interpreted to have an idealistic or excessively formalistic meaning.
Hereinafter, preferred embodiments will be described in detail with reference
to the accompanying drawings. Identical or corresponding elements will be given the
same reference numerals, regardless of the figure number, and any redundant
description of the identical or corresponding elements will not be repeated.
Although it is evident that the present invention can be equivalently applied to
all types of digital processing devices or systems (e.g. portable terminals and/or home
digital appliances, such as the mobile communication terminal, PDA, portable multimedia player (PMP), MP3 player, digital camera, digital television, audio
equipment, etc.), which have a plurality of processors and in which a particular memory
needs to be shared by a plurality of processors, or in which a plurality of elements included in one
processor need to share a memory at the same time, the portable terminal will be
described hereinafter for the convenience of description and understanding. Moreover,
it shall be easily understood through the below description that the present invention is
not limited to a specific type of terminal but is applicable equivalently to any terminal
having a memory shared by a plurality of processors or elements.
FIG. 3 illustrates a linking structure between each processor in accordance with
an embodiment of the present invention.
The below description assumes that the application processor 215 controls a
camera module 305 and the application processor 215 is a multimedia processor for
processing an image signal (e.g. YUV data or encoded image data) inputted from the
camera module 305. Also, the shared memory 310, coupled to the application processor
215, can be shared by each element included in the main processor 210 and the
application processor 215. However, the below description relates only to the case in which
the shared memory 310 is shared by each element of the application processor 215 (e.g.
a controller 255, a multimedia processing unit 260 and an image scaler 265).
The storage area of the shared memory 310 can be partitioned into n partitioned blocks, n being a natural number. The use for each partitioned block can be
predetermined. For example, a first partitioned block can be predetermined for each of
the elements to process respective data; a second partitioned block can be
predetermined to store encoded data corresponding to an image signal inputted from the
camera module 305; and a third partitioned block can be predetermined to store image
data for performing a preview mode (i.e. a previewing state before photographing a
photographic subject).
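Purely as an illustrative sketch, and not as part of the claimed invention, the block partitioning described above can be modeled as follows. The block names and sizes in this Python sketch are assumptions chosen to mirror the example uses (working data, encoded data, preview data) given above.

```python
# Illustrative model of a shared memory whose storage area is partitioned
# into n blocks, each with a predetermined use. All names/sizes are hypothetical.
class SharedMemory:
    def __init__(self, size, partition_plan):
        # partition_plan: list of (name, block_size) pairs; sizes must fit in `size`
        assert sum(s for _, s in partition_plan) <= size
        self.blocks = {}
        offset = 0
        for name, block_size in partition_plan:
            self.blocks[name] = {"offset": offset, "size": block_size,
                                 "data": bytearray(block_size)}
            offset += block_size

    def write(self, block_name, data, at=0):
        # each element writes only inside its own predetermined block
        block = self.blocks[block_name]
        assert at + len(data) <= block["size"], "write exceeds partitioned block"
        block["data"][at:at + len(data)] = data

# A partition plan mirroring the example: a working block for the elements,
# a block for encoded data, and a block for preview-mode image data.
mem = SharedMemory(1 << 20, [("element_work", 256 * 1024),
                             ("encoded_data", 512 * 1024),
                             ("preview", 256 * 1024)])
mem.write("preview", b"\x00" * 1024)
```

The point of the sketch is only that each use maps to a disjoint region, so each element can access its own block without interfering with the others.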
Of course, two or more partitioned blocks can be predestinated to successively
store the data inputted from the camera module 305, such that once one of the two
partitioned blocks completes storing a first part of the data, the other partitioned block
can follow by storing a second part of the data; at the same time, the first part,
whose storing is complete, can be processed while the second part is being stored. This is
because the data inputted from the camera module 305 is real-time data, for which image
continuity is an important factor.
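The alternating scheme above is, in effect, double (ping-pong) buffering. A minimal Python sketch, again purely illustrative and not the invention itself, with `process` standing in for whatever consumes a stored part (e.g. encoding):

```python
# Ping-pong buffering over two predestinated partitioned blocks: while one
# block is being filled with the incoming part, the previously filled block
# can be processed, preserving image continuity for real-time data.
def stream_with_ping_pong(parts, process):
    blocks = [None, None]          # the two predestinated partitioned blocks
    results = []
    for i, part in enumerate(parts):
        writing = i % 2            # block currently being written
        blocks[writing] = part     # store the incoming part
        if i > 0:
            # the other block finished storing on the previous iteration,
            # so it can be processed while this part is being stored
            results.append(process(blocks[1 - writing]))
    results.append(process(blocks[(len(parts) - 1) % 2]))  # flush last part
    return results

# e.g. "process" could be an encoding step; here it just tags the part
out = stream_with_ping_pong(["part0", "part1", "part2"], lambda p: p.upper())
```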
If the shared memory 310 is coupled to the main processor 210 and used for
data transmission, a particular partitioned block can be assigned for the data
transmission. In this case, the shared memory 310 further has an additional access port. The
partitioned block for the data transmission is restricted from being accessed by a plurality of
processors at the same time. The information on which processor accesses the block
must be shared between the processors through the updating of a preset register value or the communication of access status information.
By using raw data (e.g. RGB data) outputted from the image sensor 240, the
image signal processor (ISP) 330 or the application processor 215 can generate data
encoded by a predestinated encoding method. Which of the two generates the encoded
data can be determined by a data recognizing signal inputted from the main
processor 210.
For example, if a setting mode for photographing is designated for high
resolution, the ISP 330 can be preset to perform the encoding process. If the setting mode
for photographing is designated for low resolution, the application processor 215 can be
preset to perform the encoding process. Of course, it can be vice versa as well. The below
description, however, assumes the former case. For this, the ISP 330 must have a
processing module, for generating encoded data by use of YUV data (or RGB data), in
addition to a conventional processing module for converting RGB data to YUV data. Of
course, the ISP 330 can further include an input unit, which is inputted with raw data,
another input unit, which is inputted with a control signal from the main processor 210,
and a controller, which controls to generate an image signal (i.e. YUV data or encoded
image data).
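The resolution-based routing assumed above can be sketched, for illustration only, as a simple decision rule. The threshold value and function names below are hypothetical, not taken from the invention:

```python
# Hypothetical sketch of the routing rule: the data recognizing signal,
# derived from the photographing resolution, selects whether the ISP 330
# or the application processor 215 performs the encoding.
HIGH_RES_THRESHOLD = 1280 * 1024   # assumed pixel-count threshold, illustration only

def data_recognizing_signal(width, height):
    """Return 'first' (high resolution: ISP encodes) or 'second'
    (low resolution: application processor encodes)."""
    return "first" if width * height >= HIGH_RES_THRESHOLD else "second"

def encoder_for(signal):
    return {"first": "ISP", "second": "application processor"}[signal]

assert encoder_for(data_recognizing_signal(2560, 2048)) == "ISP"
assert encoder_for(data_recognizing_signal(640, 480)) == "application processor"
```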
The application processor 215 also must integrally or individually have
interface means that can receive not only YUV data but also encoded image data from
the camera module 305. This is because the YUV data has a signal waveform in which only valid data is successively inputted, whereas the encoded data has another signal
type in which valid data and invalid data coexist, so the interface structure thereof
may differ.
Referring to FIG. 3, the application processor 215 of the present invention is
coupled to the main processor 210 through an MP-AP bus (e.g. a host interface) and the
shared memory 310 having two ports through a first SM (shared memory) bus and a
second SM bus. Also, the application processor 215 processes an image signal inputted
from the image sensor 240 and stores the processed image signal in the shared memory
310. The application processor 215 further processes multimedia data stored in the
shared memory 310 and displays the processed multimedia data through the display
145.
The application processor 215 includes an interface unit 250, a controller 255,
a multimedia processing unit 260, an image scaler 265, a priority control unit 325, a
first SM control unit 315 and a second SM control unit 320.
The interface unit 250 communicates data (e.g. a control signal) between the
application processor 215 and the main processor 210. Once the control signal is
received from the main processor 210 through the interface unit 250, the application
processor 215 performs the corresponding processing operation.
The controller 255 controls the operation of the application processor 215 by a
built-in program for the driving of the application processor 215. In other words, the controller 255 controls the operation of the application processor 215; reads data,
requested for executing the program, from the shared memory; and stores the processed
programming result in the shared memory 310. The controller 255 can access a
particular partitioned block (i.e. one of n partitioned blocks into which the shared
memory 310 is partitioned, n being a natural number). Typically, the controller 255
controls the operation of the application processor 215 corresponding to the control
signal received from the main processor 210. The controller 255 can be a
micro controller unit (MCU), for example.
The multimedia processing unit 260 accesses a particular partitioned block of
the shared memory 310 through the first SM control unit 315 or the second SM control
unit 320. Then, the multimedia processing unit 260 reads image data (e.g. YUV data)
stored in the partitioned block to encode the read image data by a predestinated format
or to add necessary effects to the image data. The multimedia processing unit 260
further reads and decodes compressed data, transferred from the main processor 210 and
stored in the shared memory 310, before displaying it on the display 145. The
multimedia processing unit 260 can store the processed data in a storage area of the
shared memory 310. The main processor 210 transfers data to the application processor
215 through one of the following methods: one is to transfer the data read from the
coupled M-NV memory 220 or M-VO memory 225 through the MP-AP bus, and the other is to store the data in a particular partitioned block and then to transfer
through the MP-AP bus a control signal allowing the application processor 215 to
access the partitioned block. To use the latter method, the main processor 210 must be
coupled to the shared memory 310, and the shared memory 310 must further have an
additional port assigned to the main processor 210.
The image scaler 265 receives a data recognizing signal (e.g. a first recognizing
signal for representing high resolution data or a second recognizing signal for
representing low resolution data) and carries out the corresponding operation. If the first
recognizing signal is inputted, the image scaler 265 stores the image signal (i.e. the
encoded data) inputted from the camera module 305 in the shared memory 310.
However, if the second recognizing signal is inputted, the image scaler 265 processes
and stores in the shared memory 310 the image signal (i.e. the YUV data) inputted from
the camera module 305. In other words, in case the second recognizing signal is
inputted, the image scaler 265 performs a preset image processing (e.g. generating a
softened image through size adjustment, color change and filtering of the image) of the
image signal inputted from the camera module 305 in accordance with the control of the
controller 255. The data processed by the image scaler 265 is stored in the shared
memory 310 through the second SM bus by the second SM control unit 320. In
case a partitioned block for storing a respective image signal is predetermined, the image signal is stored in accordance with the control of the controller 255.
Of course, the data recognizing signal can be inputted into the controller 255.
The controller 255, which receives the data recognizing signal, can control the operation
of the image scaler 265.
The image scaler 265 of the present invention is merely one embodiment of an
element storing the image signal (e.g. YUV data or encoded image data) in the shared
memory 310. It shall be evident that the present invention can be widely applied to any
multimedia data input unit that needs to store multimedia data (e.g. image data and/or
audio data) in real time in the shared memory 310.
Similarly, the illustrated multimedia processing unit 260 is merely one
embodiment of an element processing multimedia data stored in the shared memory 310,
and it shall be evident that the present invention can be widely applied to any
multimedia data processing unit that processes multimedia data stored in the shared
memory 310 and stores the processed data in the shared memory 310 again, displays the
data through the display 145 or sends the data to the main processor 210.
The priority control unit 325 determines the priority in response to a request for
access to the shared memory 310 by each element in the application processor 215 and
controls each of the first SM control unit 315 and the second SM control unit 320 such
that the two elements can access the shared memory 310 through the first SM bus and
the second SM bus, respectively. The priority control unit 325, however, can permit the image signal (e.g. YUV data for a preview mode, shooting a movie file and generating
encoded data) inputted from the image scaler 265 to have the top priority. In other
words, the priority control unit 325 can control the second SM control unit 320 such
that the multimedia data (i.e. an image signal and/or an audio signal) inputted from the
image scaler 265 can be stored in real time in a particular partitioned block of the shared
memory 310 through the second SM bus.
The first SM control unit 315 and the second SM control unit 320 control the elements,
determined by the priority control unit 325, to access the partitioned blocks of the
shared memory 310 through the first SM bus and the second SM bus, respectively. As
described above, the first SM control unit 315 or the second SM control unit 320, which
has set a path to allow the image signal inputted from the camera module 305 to be
stored in any one of preset plural partitioned blocks, can re-set the path to allow the
image signal, which continues to be inputted, to be stored in another partitioned block if
the storage space of the partitioned block is used up.
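The arbitration behavior of the priority control unit 325 can be sketched, purely as an illustration under assumed names, as follows: at most two requests are granted at once (one per SM bus), and a request carrying real-time camera data is always granted first.

```python
# Illustrative sketch of two-bus arbitration with a fixed top priority for
# the image scaler (real-time camera data). Element names are hypothetical.
def arbitrate(requests):
    """requests: list of requesting element names.
    Returns (first_bus_grant, second_bus_grant); None means the bus is idle."""
    # stable sort: "image_scaler" requests come first, others keep their order
    grants = sorted(requests, key=lambda r: r != "image_scaler")
    first_bus = grants[0] if grants else None
    second_bus = grants[1] if len(grants) > 1 else None
    return first_bus, second_bus

# the camera path wins the first grant even when it asks last
buses = arbitrate(["controller", "image_scaler", "multimedia_unit"])
```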
As described above, the present invention can permit a plurality of elements to
simultaneously perform processes by partitioning the storage area of the shared memory
310 into a plurality of partitioned blocks; allowing an image signal inputted from the
camera module 305 to be stored in real time in the shared memory 310 through a first one
of two access ports, that is, permitting the access request of an element through the first
one of two access ports; and allowing the access request of another element to be permitted in real time through a second access port. Accordingly, the present invention
can minimize the standby time of each element for using the shared memory 310. The
conventional data size of the image sensor 240 has been 640 x 480 pixels, and today's
size is 1280 x 1024 pixels. It is expected that future sizes will be 1920 x 1200 or
2560 x 2048 pixels. Accordingly, the size of image data is expected to increase. With the
present invention, the problems, caused by the time delay in storing the large size data
in a storage device, can be solved.
Two or more elements of the main processor 210, the application processor
215 and the memory unit 310, illustrated in FIG. 3, can be realized as a single chip. For
example, a plurality of processors including the main processor 210 and the application
processor 215 can be realized as a single chip. Any one processor and the memory unit 310 can be
alternatively embodied as one chip. Of course, it is evident that the plurality of
processors and at least one memory can be embodied as one chip.
FIG. 4 is a flow chart illustrating a method of outputting various image signals
by the camera module in accordance with the embodiment of the present invention, and
FIG. 5 is a flow chart illustrating a method of processing an inputted image signal of the
application processor in accordance with the embodiment of the present invention.
As described above, the ISP 330 of the present invention includes a processing
module, for generating encoded data by use of YUV data (or RGB data), in addition to a conventional processing module for converting RGB data to YUV data. Accordingly,
the ISP 330 or the application processor 215 can generate the encoded data. The main
processor 210 (or a controller equipped in the main processor 210) informs the ISP 330
and the application processor 215, by using a data recognizing signal, which of them
is to generate the encoded data. The below description assumes that if the setting mode for
photographing is designated for high resolution, the ISP 330 is preset to perform
the encoding process, and if the setting mode for photographing is designated for low
resolution, the application processor 215 is preset to perform the encoding process.
Referring to FIG. 4, in a step represented by 410, the ISP 330 receives a data
recognizing signal from the main processor 210. The data recognizing signal can have
resolution information of raw data, outputted through the image sensor 240, or a process
command (e.g. a control command for instructing to output YUV data or encoded image
data). The data recognizing signal can be received or transmitted through a conventional
communication method such as I2C for example.
In a step represented by 415, the ISP 330 generates YUV data by using raw
data inputted from the image sensor 240. The YUV data can be classified into high
resolution YUV data, for generating encoded image data, and low resolution YUV data,
for performing a preview mode. In this description, the "YUV data" is commonly
called. In a step represented by 420, the ISP 330 determines whether the data
recognizing signal received from the main processor 210 is a first data recognizing
signal (i.e. a control signal for representing high resolution data or for instructing to
generate compressed data). Of course, the step represented by 420 can be performed
together with the step represented by 410.
If the data recognizing signal is the first data recognizing signal, the ISP 330
generates encoded image data (i.e. compressed data) in a step represented by 425 and
outputs the generated compressed data to the application processor 215 in a step
represented by 430.
If the data recognizing signal is not the first data recognizing signal, the ISP
330 outputs the YUV data generated through the step represented by 415 to the
application processor 215 in a step represented by 435.
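The flow of FIG. 4 described above can be summarized, for illustration only, in a few lines of Python. The helper names (`to_yuv`, `encode`) are hypothetical stand-ins for the ISP's conversion and encoding stages, not part of the claims:

```python
# Illustrative sketch of the FIG. 4 flow: the ISP always generates YUV data
# from the raw data (step 415), then either encodes it when the first data
# recognizing signal is received (steps 420, 425, 430) or outputs the YUV
# data as-is (step 435).
def isp_output(raw_data, recognizing_signal, to_yuv, encode):
    yuv = to_yuv(raw_data)                      # step 415: generate YUV data
    if recognizing_signal == "first":           # step 420: first signal?
        return encode(yuv)                      # steps 425, 430: encoded data out
    return yuv                                  # step 435: YUV data out

# toy stand-ins for the conversion and encoding stages
out = isp_output("RGB", "first", to_yuv=lambda r: "YUV", encode=lambda y: "JPEG")
```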
Then, the processing operation of the application processor 215 receiving an
image signal (i.e. YUV data or encoded image data) will be described with reference to
FIG. 5.
Referring to FIG. 5, in a step represented by 510, the application processor 215
receives a data recognizing signal from the main processor 210. The data recognizing
signal can be inputted into the image scaler 265 or the controller 255. The controller
255, which receives the data recognizing signal, can control the operation of the image
scaler 265. The data recognizing signal can be communicated through the MP-AP bus, for example.
In a step represented by 515, the application processor 215 receives an image
signal from the camera module 305. The image signal is received through the
image scaler 265.
In a step represented by 520, the application processor 215 determines whether
the data recognizing signal received from the main processor 210 is a second data
recognizing signal. The step represented by 520 can be performed together with the step
represented by 510.
If the data recognizing signal is the second data recognizing signal, the
application processor 215 generates encoded image data (i.e. compressed data) by using
the received image signal (i.e. YUV data) in a step represented by 525 and stores the
generated compressed data in a predestinated partitioned block of the shared memory
310 in a step represented by 530. Here, the generation of the compressed data can be
performed in the multimedia processing unit 260 by the control of the controller 255.
The image signal received for the processing of the multimedia processing unit 260 can
be temporarily stored in the shared memory 310.
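The application-processor side of the FIG. 5 flow can likewise be sketched in a few illustrative lines; again, the names are hypothetical and the sketch is not the claimed method itself:

```python
# Illustrative sketch of the FIG. 5 flow: on the second data recognizing
# signal the received YUV data is encoded before storage (steps 525, 530);
# otherwise the received, already-encoded data is stored directly (step 535).
def ap_store(image_signal, recognizing_signal, encode, memory):
    if recognizing_signal == "second":          # step 520: YUV data received
        memory.append(encode(image_signal))     # steps 525, 530: encode, then store
    else:
        memory.append(image_signal)             # step 535: store encoded data as-is
    return memory

shared_block = []                               # stand-in for a partitioned block
ap_store("YUV", "second", lambda y: "JPEG", shared_block)
ap_store("JPEG2", "first", lambda y: y, shared_block)
```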
If the data recognizing signal is not the second data recognizing signal, the
application processor 215 stores the received image signal (i.e. encoded image data) in
the predestinated partitioned block in a step represented by 535.
The drawings and detailed description are only examples of the present
invention, serve only for describing the present invention and by no means limit or
restrict the spirit and scope of the present invention. Thus, any person of ordinary skill
in the art shall understand that a large number of permutations and other equivalent
embodiments are possible. The true scope of the present invention must be defined only
by the appended claims.
[Industrial Applicability]
As described above, the present invention can minimize the time delay in the
high quality image processing and maximize the processing efficiency of an application
processor.
The present invention can also guarantee the image continuity due to
successive writing of image data inputted from an image sensor.
The present invention can also allow a main processor to quickly transfer data
to an application processor by using a shared memory.
The present invention can also maximize the efficiency of storing and
processing data by allowing each of plural elements, requested to store or process the
data, to use its dedicated storage area through its dedicated path.
In addition, the present invention can prevent data loss by removing the time
delay in storing image data inputted from an image sensor.


[CLAIMS]
[Claim 1]
A digital processing device, comprising:
a main processor;
a memory, where a storage area is partitioned into n partitioned blocks, n being
a natural number;
a camera module, generating an image signal corresponding to a control
signal from the main processor by using the raw data outputted from an image sensor
and outputting the generated image signal, the camera module comprising a YUV data
generating unit and an encoding unit, and the image signal being YUV data generated
by use of the raw data or encoded image data generated by use of the YUV data; and
an application processor, being coupled to each of the main processor and the
camera module, and generating and storing in the memory pertinent encoded image data
if the image signal is the YUV data, by a control signal from the main processor, and
storing in the memory the inputted encoded image data if the image signal is the
encoded image data.
[Claim 2]
The digital processing device of Claim 1, wherein the application processor
comprises a multimedia data input unit, storing the image signal inputted from the camera module in a first partitioned block, and a multimedia data processing unit,
reading the image signal stored in the first partitioned block, and generating encoded
image data and storing the generated encoded image data in a second partitioned block
if the image signal is the YUV data.
[Claim 3]
The digital processing device of Claim 2, wherein the memory and the
application processor are coupled to each other through a plurality of memory buses.
[Claim 4]
The digital processing device of Claim 1, wherein the control signal, which
determines the type of the image signal, is determined depending on the resolution
of the raw data.
[Claim 5]
The digital processing device of Claim 1, wherein the application processor
and the memory are realized in the same chip.
[Claim 6]
An image signal processor (ISP) that processes and outputs raw data inputted from an image sensor, the ISP comprising:
a first input unit, being inputted with the raw data;
a second input unit, receiving a control signal which instructs to generate an image
signal of a particular type from a main processor;
a YUV data generating unit, generating YUV data by using the raw data and
outputting the YUV data;
an encoding unit, generating encoded image data according to a predestinated
encoding method by using the YUV data, and outputting the encoded image data; and
a controller, activating the YUV data generating unit according to the control signal, or
activating both the YUV data generating unit and the encoding unit.
[Claim 7]
The image signal processor of Claim 6, wherein the control signal, which
determines the type of the image signal, is determined depending on the resolution
of the raw data.
[Claim 8]
An image signal generating method of a digital processing device, the method
comprising:
an image sensor generating and outputting raw data; a camera module generating an image signal corresponding to a control signal
received from a main processor by using the raw data, and outputting the generated
image signal, the image signal being YUV data generated by use of the raw data or
encoded image data generated by use of the YUV data; and
an application processor determining the type of the image signal by using a
control signal received from the main processor, and generating pertinent encoded
image data if the image signal is the YUV data and storing the generated
encoded image data in the memory, and storing the image signal in the memory if the
image signal is the encoded image data.
[Claim 9]
The image signal generating method of Claim 8, wherein the control signal,
which determines the type of the image signal, is determined depending on the
resolution of the raw data.
[Claim 10]
The image signal generating method of Claim 8, wherein the memory and the
application processor are coupled to each other through a plurality of memory buses.
PCT/KR2006/005612 2005-12-26 2006-12-21 Imaging device and method for transferring image signal WO2007075000A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020050129736A KR100663380B1 (en) 2005-12-26 2005-12-26 Imaging device and method for transferring image signal
KR10-2005-0129736 2005-12-26

Publications (1)

Publication Number Publication Date
WO2007075000A1 true WO2007075000A1 (en) 2007-07-05

Family

ID=37866580

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2006/005612 WO2007075000A1 (en) 2005-12-26 2006-12-21 Imaging device and method for transferring image signal

Country Status (2)

Country Link
KR (1) KR100663380B1 (en)
WO (1) WO2007075000A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2748451C (en) * 2008-12-29 2016-10-04 Red.Com, Inc. Modular digital camera
US9681028B2 (en) 2013-03-15 2017-06-13 Red.Com, Inc. Digital camera with wireless connectivity

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6137946A (en) * 1997-04-04 2000-10-24 Sony Corporation Picture editing apparatus and method using virtual buffer estimation
US6522830B2 (en) * 1993-11-30 2003-02-18 Canon Kabushiki Kaisha Image pickup apparatus
US6798418B1 (en) * 2000-05-24 2004-09-28 Advanced Micro Devices, Inc. Graphics subsystem including a RAMDAC IC with digital video storage interface for connection to a graphics bus

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9628679B2 (en) 2008-12-29 2017-04-18 Red.Com, Inc. Modular motion camera
US9712728B2 (en) 2008-12-29 2017-07-18 Red.Com, Inc. Modular digital camera for use with multiple recording modules
US10271031B2 (en) 2014-04-04 2019-04-23 Red.Com, Llc Broadcast module for a digital camera
US10116776B2 (en) 2015-12-14 2018-10-30 Red.Com, Llc Modular digital camera and cellular phone
US11165895B2 (en) 2015-12-14 2021-11-02 Red.Com, Llc Modular digital camera and cellular phone

Also Published As

Publication number Publication date
KR100663380B1 (en) 2007-01-02

Similar Documents

Publication Publication Date Title
US7587524B2 (en) Camera interface and method using DMA unit to flip or rotate a digital image
WO2006101292A1 (en) Variable partitioned blocks in shared memory
WO2007075000A1 (en) Imaging device and method for transferring image signal
US8151136B2 (en) Method and device for correcting code data error
US7545416B2 (en) Image processing device and camera including CPU which determines whether processing performed using external memory
KR100728650B1 (en) Method and apparatus for sharing multi-partitioned memory through a plurality of routes
US8145852B2 (en) Device having shared memory and method for providing access status information by shared memory
KR100731969B1 (en) Method and apparatus for sharing memory through a plurality of routes
KR100592106B1 (en) Method and apparatus for allowing access to individual memory
KR100736902B1 (en) Method and apparatus for sharing memory by a plurality of processors
JP2001238189A (en) Image processing apparatus, and operation control method for the same
US20080158364A1 (en) Image Processing Device and Data Processing Method
US20100002099A1 (en) Method and apparatus for sharing memory
KR100715522B1 (en) Camera control apparatus, image data displaying apparatus and method thereof
KR100658588B1 (en) Memory sharing system and method thereof
KR100909025B1 (en) A portable terminal having a memory sharing method and a memory sharing structure by a plurality of processors
KR100592109B1 (en) Method for controlling access to partitioned blocks of shared memory and portable terminal having shared memory
KR100658591B1 (en) Method and apparatus for controlling display using shared memory
KR100888427B1 (en) Device having shared memory and method for displaying data
WO2007021154A1 (en) Memory sharing by a plurality of processors
JP2000010685A (en) Image input/output device and cable for the image input/ output device
JPH08292927A (en) Information processor
JP2002218539A (en) Integrated circuit for image processing
JPH10262202A (en) Image processing unit
KR20030073641A (en) Data interface unit for personal digital assistant

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 06835316

Country of ref document: EP

Kind code of ref document: A1