CN106021445A - Cached data loading method and apparatus - Google Patents

Cached data loading method and apparatus

Info

Publication number
CN106021445A
CN106021445A (application CN201610324104.7A)
Authority
CN
China
Prior art keywords
data
caching
time
information
load
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610324104.7A
Other languages
Chinese (zh)
Other versions
CN106021445B (en)
Inventor
王福财
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201610324104.7A priority Critical patent/CN106021445B/en
Publication of CN106021445A publication Critical patent/CN106021445A/en
Application granted granted Critical
Publication of CN106021445B publication Critical patent/CN106021445B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/957Browsing optimisation, e.g. caching or content distillation
    • G06F16/9574Browsing optimisation, e.g. caching or content distillation of access to content, e.g. by caching

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a cached data loading method and apparatus. The method comprises the steps of: scanning a service access layer and identifying interfaces that carry preset automatic loading information; loading, from a database according to preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic loading information, and caching the data; and, when the data of an accessed business is the data corresponding to an interface carrying the automatic loading information, reading the cached data to complete the business access processing, wherein the cache parameter information includes preset data distinguishing identifier information, caching condition information and/or an expiration time. According to embodiments of the cached data loading method and apparatus, data is cached on the basis of the automatic loading information and the cache parameter information, so that a system avalanche caused by large numbers of concurrent operations upon cache invalidation is avoided; furthermore, the cached data is sorted and cache judgment is performed on it, which improves both the utilization efficiency of the data in the cache and the reading efficiency of the cached data.

Description

Cached data loading method and apparatus
Technical field
The present disclosure relates to, but is not limited to, data processing technology, and in particular to a cached data loading method and apparatus.
Background art
With the continuous expansion of the Internet and the steady growth of the Internet user base, new requirements are placed on e-commerce websites. When a website is visited by millions of users, the response speed of the system directly affects the users' experience of accessing the site.
Using caching technology to speed up website access is one of the key techniques. Memory caching technologies are widely used on many large websites, for example Redis (an open-source, in-memory key-value database written in ANSI C that supports networking and optional persistence and provides application programming interfaces (APIs) for many languages) and memcached (a high-performance distributed memory object caching system used to relieve database load for dynamic web applications). However, memory caching technology is limited by the physical memory of the system. When the virtual memory function is enabled and memory is exhausted, the memory caching technology swaps infrequently used data to disk; this behaviour is decided by the memory caching technology itself, which is equivalent to the business layer handing the cache exchange policy over to the caching layer, so that the flexibility of data caching is lost to a certain extent. If the virtual memory function is disabled, the performance of website businesses that involve cached data will drop sharply together with the operating system's virtual memory. The memory caching technology can also limit the usable physical memory through configuration options; when the memory usage ceiling threshold is reached, write commands are answered with error prompts (although read commands are still accepted), and if a large number of concurrent operations occur at that moment, the data accesses bypass the cache and go directly to the data layer, causing a system avalanche. In addition, in the related art, the cache hit rate of the cached data is not high.
In summary, the existing memory caching technology has the problem that, upon cache invalidation, a large number of concurrent operations cause a system avalanche.
Summary of the invention
The following is an overview of the subject matter described in detail herein. This overview is not intended to limit the protection scope of the claims.
The embodiments of the present invention provide a cached data loading method and apparatus, which can avoid a system avalanche caused by a large number of concurrent operations upon cache invalidation.
An embodiment of the present invention provides a cached data loading apparatus, comprising: a recognition unit, a caching unit and a reading unit; wherein,
the recognition unit is configured to scan a service access layer and identify interfaces that carry preset automatic loading information;
the caching unit is configured to load, from a database according to preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic loading information, and to cache the data;
the reading unit is configured to, when the data of an accessed business is the data corresponding to an interface carrying the automatic loading information, read the cached data to complete the business access processing;
the cache parameter information includes: preset data distinguishing identifier information, and/or caching condition information, and/or an expiration time;
the data distinguishing identifier information is generated by combining interface parameters and/or by converting the interface parameters with a preset method function.
Optionally, the caching unit is further configured to, when the data of the accessed business is the data corresponding to an interface carrying the automatic loading information, if the data of the accessed business cannot be read from the cache, load from the database the data of the interface carrying the automatic loading information that corresponds to the data of the accessed business, and cache it.
Optionally, the apparatus further comprises an updating unit,
the updating unit is configured to, when data is updated, if the updated data is data on an interface carrying the automatic loading information, update the data in the cache that corresponds to the updated business.
Optionally, the apparatus further comprises a cache processing unit, configured to establish a message queue containing the data state information of all the cached data;
and to read, according to a preset take-out strategy, the data state information of one or more pieces of data from the established message queue, and perform cache judgment processing on the data according to the read data state information;
the data state information is collected in advance and includes at least one of the following:
the time of the last request for the cached data; and/or,
the number of times the cached data has been loaded and the time taken by each loading of the cached data; and/or,
the number of times the cached data has been accessed within a preset duration.
Optionally, the cache processing unit is specifically configured to:
establish a message queue containing the data state information of all the cached data;
read, according to the preset take-out strategy, the data state information of one or more pieces of data from the established message queue;
when the read data state information includes the time of the last request for the cached data,
and the difference between the current system time and the time of the last request for the cached data is greater than a preset request interval threshold, determine that the cached data is non-hotspot cached data and remove the data so determined from the cache; and/or,
when the read data state information includes the number of times the cached data has been loaded and the time taken by each loading of the cached data,
if the number of loadings of the cached data is greater than a preset loading count threshold and/or the loading time of the cached data is greater than a preset loading time threshold, determine that the cached data is hotspot but time-controlled data and remove the data so determined from the cache; and/or,
when the read data state information includes the number of times the cached data has been accessed within the preset duration,
and the number of accesses to the cached data within the preset duration is less than a preset access count threshold, determine that the data is non-hotspot cached data and remove the data so determined from the cache.
Optionally, the caching unit is further configured to,
when the cache parameter information includes an expiration time, obtain the expiration time of the cached data and the processing duration of loading the data from the database and caching it, and subtract the processing duration from the expiration time to obtain an advance processing time;
and, within the obtained advance processing time, if no business data loading takes place, load from the database, and cache, the updated data corresponding to the cached data that reaches its expiration time.
Optionally, the apparatus further comprises a sorting unit,
the sorting unit is configured to sort the cached data according to the expiration time, and/or the loading duration, and/or the request frequency of the data.
In another aspect, an embodiment of the present invention further provides a cached data loading method, comprising:
scanning a service access layer and identifying interfaces that carry preset automatic loading information;
loading, from a database according to preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic loading information, and caching the data;
when the data of an accessed business is the data corresponding to an interface carrying the automatic loading information, reading the cached data to complete the business access processing;
wherein the cache parameter information includes: preset data distinguishing identifier information, and/or caching condition information, and/or an expiration time;
and the data distinguishing identifier information is generated by combining interface parameters and/or by converting the interface parameters with a preset method function.
Optionally, when the data of the accessed business is the data corresponding to an interface carrying the automatic loading information, if the data of the accessed business cannot be read from the cache, the method further includes: loading, from the database, the data of the interface carrying the automatic loading information that corresponds to the data of the accessed business, and caching it.
Optionally, the method further includes:
when data is updated, if the updated data is data on an interface carrying the automatic loading information, updating the data in the cache that corresponds to the updated business.
Optionally, the method further includes:
establishing a message queue containing the data state information of all the cached data;
reading, according to a preset take-out strategy, the data state information of one or more pieces of data from the established message queue, and performing cache judgment processing on the data according to the read data state information;
wherein the data state information is collected in advance and includes at least one of the following:
the time of the last request for the cached data; and/or,
the number of times the cached data has been loaded and the time taken by each loading of the cached data; and/or,
the number of times the cached data has been accessed within a preset duration.
Optionally, the cache judgment processing of the data includes:
when the data state information includes the time of the last request for the cached data,
and the difference between the current system time and the time of the last request for the cached data is greater than a preset request interval threshold, determining that the cached data is non-hotspot cached data and removing the data so determined from the cache; and/or,
when the data state information includes the number of times the cached data has been loaded and the time taken by each loading of the cached data,
if the number of loadings of the cached data is greater than a preset loading count threshold and/or the loading time of the cached data is greater than a preset loading time threshold, determining that the cached data is hotspot but time-controlled data and removing the data so determined from the cache; and/or,
when the data state information includes the number of times the cached data has been accessed within the preset duration,
and the number of accesses to the cached data within the preset duration is less than a preset access count threshold, determining that the data is non-hotspot cached data and removing the data so determined from the cache.
Optionally, when the cache parameter information includes an expiration time, the method further includes:
obtaining the expiration time of the cached data and the processing duration of loading the data from the database and caching it, and subtracting the processing duration from the expiration time to obtain an advance processing time;
within the obtained advance processing time, if no business data loading takes place, loading from the database, and caching, the updated data corresponding to the cached data that reaches its expiration time.
Optionally, the method further includes:
sorting the cached data according to the expiration time, and/or the loading duration, and/or the request frequency of the data.
Compared with the related art, the technical solution includes: scanning a service access layer and identifying interfaces that carry preset automatic loading information; loading, from a database according to preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic loading information, and caching the data; and, when the data of an accessed business is the data corresponding to an interface carrying the automatic loading information, reading the cached data to complete the business access processing; the cache parameter information includes preset data distinguishing identifier information, and/or caching condition information, and/or an expiration time. In the embodiments of the present invention, data is cached on the basis of the automatic loading information and the cache parameter information, which avoids a system avalanche caused by a large number of concurrent operations upon cache invalidation; further, by sorting the cached data and performing cache judgment processing on it, both the utilization efficiency of the data in the cache and the reading efficiency of the cached data are improved.
Other aspects will become apparent upon reading and understanding the accompanying drawings and the detailed description.
Brief description of the drawings
Fig. 1 is a block diagram of the basic electrical structure of a server according to an embodiment of the present invention;
Fig. 2 is a flowchart of a cached data loading method according to an embodiment of the present invention;
Fig. 3 is a flowchart of a cached data loading method according to another embodiment of the present invention;
Fig. 4 is a structural block diagram of a cached data loading apparatus according to an embodiment of the present invention;
Fig. 5 is a flowchart of the method of application example 1 of the present invention;
Fig. 6 is a flowchart of the method of application example 2 of the present invention.
Detailed description of the invention
To make the objects, technical solutions and advantages of the present invention clearer, embodiments of the present invention are described in detail below with reference to the accompanying drawings. It should be noted that, provided there is no conflict, the embodiments of the present application and the features in the embodiments may be combined with one another in any manner.
In the following description, suffixes such as "module", "component" or "unit" used to denote elements are only for the convenience of describing the present invention and have no specific meaning by themselves; therefore, "module" and "component" may be used interchangeably.
Fig. 1 is a block diagram of the basic electrical structure of a server according to an embodiment of the present invention. As shown in Fig. 1, the server includes: an input/output (IO) bus, a processor 40, a memory 41, an internal memory 42 and a communication device 43; wherein,
the input/output (IO) bus is connected to the other components of the server to which it belongs (the processor 40, the memory 41, the internal memory 42 and the communication device 43) and provides transmission lines for these components.
The processor 40 generally controls the overall operation of the server to which it belongs; for example, the processor 40 performs operations such as computation and confirmation. The processor 40 may be a central processing unit (CPU).
The communication device 43 generally includes one or more components that allow radio communication between the server to which it belongs and a wireless communication system or network.
The memory 41 stores software code that is readable and executable by the processor 40 and contains instructions for controlling the processor 40 to perform the functions described herein (i.e., functions executed by software).
Based on the above electrical structure of the server, embodiments of the method of the present invention are proposed.
Fig. 2 is a flowchart of a cached data loading method according to an embodiment of the present invention. As shown in Fig. 2, the method includes:
Step 200: scanning a service access layer and identifying interfaces that carry preset automatic loading information;
It should be noted that the automatic loading information may be an automatic loading mark, an identifier, or other similar information by which this kind of interface can be distinguished from other interfaces. The automatic loading information may be configured by a person skilled in the art according to an analysis of whether the data needs to be loaded automatically. In addition, scanning the service layer is a routine operation that is already performed on the service access layer.
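As a non-limiting illustration only, the following Java sketch shows one possible way of modelling the automatic loading information as an interface-level mark and of scanning the service access layer for it. The annotation name @AutoLoadCache, the ProductService interface and all other identifiers are assumptions introduced for illustration and are not part of this disclosure.

```java
// Hypothetical sketch: the "automatic loading information" modelled as a runtime
// annotation, and a scan of the service access layer that collects the
// interface methods carrying it.
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface AutoLoadCache {
    long expireSeconds() default 300;   // expiration time (a cache parameter)
    String condition() default "";      // caching condition information
}

interface ProductService {              // part of the service access layer
    @AutoLoadCache(expireSeconds = 600)
    String findProductDetail(long productId);

    String placeOrder(long productId, int count); // not auto-loaded
}

class ServiceLayerScanner {
    /** Returns the interface methods that carry the automatic loading mark. */
    static List<Method> scan(Class<?>... serviceInterfaces) {
        List<Method> autoLoaded = new ArrayList<>();
        for (Class<?> svc : serviceInterfaces) {
            for (Method m : svc.getMethods()) {
                if (m.isAnnotationPresent(AutoLoadCache.class)) {
                    autoLoaded.add(m);
                }
            }
        }
        return autoLoaded;
    }

    public static void main(String[] args) {
        System.out.println(scan(ProductService.class)); // only findProductDetail
    }
}
```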
Step 201: loading, from the database according to the preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic loading information, and caching the data;
Here, the cache parameter information includes: preset data distinguishing identifier information, and/or caching condition information, and/or an expiration time; wherein the data distinguishing identifier information is generated by combining interface parameters and/or by converting the interface parameters with a preset method function.
Step 202: when the data of the accessed business is the data corresponding to an interface carrying the automatic loading information, reading the cached data to complete the business access processing;
It should be noted that the preset method function may include hash, Message Digest Algorithm 5 (MD5) or another function capable of generating a unique identifier, code or name. Generating the data distinguishing identifier information from the interface parameters may include combining the parameters of an interface according to a set sorting and combining rule, so that the data of each cache entry obtains its own distinguishing identifier; the combined information may resemble the call numbers a library uses to catalogue books. Combining the interface parameters, or converting them with the preset method function, gives the generated data distinguishing identifier information a certain naming rule, which helps maintenance personnel recognise the cached data when analysing and processing it. The embodiment of the present invention may also generate the data distinguishing identifier information in other ways, as long as each generated data distinguishing identifier is unique.
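For illustration only, the following minimal sketch shows one possible way of generating the data distinguishing identifier information: the interface name and its parameters are combined in a fixed order and the combination is converted with MD5 as the preset method function. The class and method names are assumptions and not part of this disclosure.

```java
// Minimal sketch of a "data distinguishing identifier": combine the interface
// parameters by a fixed rule, then digest the combination with MD5.
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

final class CacheKeys {
    /** interfaceName + parameters combined by a fixed rule, then digested. */
    static String distinguishingId(String interfaceName, Object... params) {
        StringBuilder combined = new StringBuilder(interfaceName);
        for (Object p : params) {
            combined.append(':').append(p);          // set sorting/combining rule
        }
        return md5Hex(combined.toString());
    }

    private static String md5Hex(String s) {
        try {
            byte[] digest = MessageDigest.getInstance("MD5")
                    .digest(s.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) hex.append(String.format("%02x", b));
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);      // MD5 is always available
        }
    }

    public static void main(String[] args) {
        // The same interface and parameters always map to the same identifier.
        System.out.println(distinguishingId("ProductService.findProductDetail", 42L));
    }
}
```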
Optionally, when the data of the accessed business is the data corresponding to an interface carrying the automatic loading information, if the data of the accessed business cannot be read from the cache, the method of the embodiment of the present invention further includes: loading, from the database, the data of the interface carrying the automatic loading information that corresponds to the data of the accessed business, and caching it.
It should be noted that, here, failing to read the data of the accessed business from the cache includes the case where an error or failure occurs when the data of the accessed business is read from the cached data according to the embodiment of the present invention; whether a data read has failed may be determined using the read-success judgment methods of the related art.
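A minimal read-through sketch of this fallback is given below. It assumes an in-process map standing in for the cache and a generic database loader function; neither choice is prescribed by this disclosure.

```java
// Sketch of the optional fallback: if the accessed business data cannot be read
// from the cache, load it from the database and write it back into the cache.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

class ReadThroughCache<K, V> {
    private final Map<K, V> cache = new ConcurrentHashMap<>();
    private final Function<K, V> databaseLoader;   // loads the data from the database

    ReadThroughCache(Function<K, V> databaseLoader) {
        this.databaseLoader = databaseLoader;
    }

    /** Business access: prefer the cache; on a miss or a failed read, reload. */
    V get(K key) {
        V value;
        try {
            value = cache.get(key);
        } catch (RuntimeException readFailure) {
            value = null;                          // treat a failed read as a miss
        }
        if (value == null) {
            value = databaseLoader.apply(key);     // load from the database
            if (value != null) cache.put(key, value);  // and cache it
        }
        return value;
    }
}
```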
Optionally, the method of the embodiment of the present invention further includes:
when data is updated, if the updated data is data on an interface carrying the automatic loading information, updating the data in the cache that corresponds to the updated business.
It should be noted that, here, a data update includes an update of the business data; whether the business data has been updated may be determined by the system from the system parameters involved in the update, for example from the file contents of the configuration log, the system log, the running log and other records of data changes. In addition, the data updating method may follow the related art: locate the storage path of the original data corresponding to the updated data, delete the data on the located storage path, and then load the updated business data and write it into the cache.
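The following sketch illustrates, under assumed names, how such an update notification might refresh the cache entry of an auto-loaded interface. The change-detection mechanism itself (configuration log, system log, running log, etc.) is outside the sketch and is not implemented here.

```java
// Sketch of the optional update handling: when a business data change is
// detected, the stale entry for an auto-loaded interface is replaced with
// freshly loaded data.
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

class CacheUpdater {
    private final Map<String, Object> cache = new ConcurrentHashMap<>();
    private final Set<String> autoLoadedKeys;              // identifiers of auto-loaded data
    private final Function<String, Object> databaseLoader; // reloads from the database

    CacheUpdater(Set<String> autoLoadedKeys, Function<String, Object> databaseLoader) {
        this.autoLoadedKeys = autoLoadedKeys;
        this.databaseLoader = databaseLoader;
    }

    /** Called when a change to the data behind the given identifier is detected. */
    void onBusinessDataUpdated(String key) {
        if (!autoLoadedKeys.contains(key)) {
            return;                                        // not auto-loaded data: ignore
        }
        cache.remove(key);                                 // drop the stale entry
        Object fresh = databaseLoader.apply(key);          // reload the updated data
        if (fresh != null) cache.put(key, fresh);          // and re-cache it
    }
}
```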
Optionally, the method of the embodiment of the present invention further includes:
establishing a message queue containing the data state information of all the cached data;
reading, according to a preset take-out strategy, the data state information of one or more pieces of data from the established message queue, and performing cache judgment processing on the data according to the read data state information; here, the preset take-out strategy includes reading the data state information of the data one by one in the order of the message queue.
The data state information is collected in advance and includes at least one of the following:
the time of the last request for the cached data; and/or,
the number of times the cached data has been loaded and the time taken by each loading of the cached data; and/or,
the number of times the cached data has been accessed within a preset duration.
It should be noted that the contents of the data state information may be chosen with reference to the system cache traffic, the business access speed and the like; a person skilled in the art may add or remove items according to the usage scenario of the embodiment of the present invention.
Optionally, the cache judgment processing of the data includes:
when the data state information includes the time of the last request for the cached data,
and the difference between the current system time and the time of the last request for the cached data is greater than a preset request interval threshold, determining that the cached data is non-hotspot cached data and removing the data so determined from the cache; and/or,
when the data state information includes the number of times the cached data has been loaded and the time taken by each loading of the cached data,
if the number of loadings of the cached data is greater than a preset loading count threshold and/or the loading time of the cached data is greater than a preset loading time threshold, determining that the cached data is hotspot but time-controlled data and removing the data so determined from the cache; and/or,
when the data state information includes the number of times the cached data has been accessed within the preset duration,
and the number of accesses to the cached data within the preset duration is less than a preset access count threshold, determining that the data is non-hotspot cached data and removing the data so determined from the cache.
It should be noted that the time taken to load the cached data may be calculated from the total time of all loadings of the cached data and the number of loadings. In addition, judging whether the number of accesses to the cached data within the preset duration is less than the preset access count threshold may include: recording the time at which the cached data is first accessed, subtracting that first access time from the current system time to obtain an access duration, and, when the access duration reaches the preset duration, counting the number of times the cached data has been accessed. For example, the number of accesses to the cached data within one hour after the first access is counted; if the number of accesses is less than the access count threshold (for example, an access count threshold of 60), the cached data is determined to be non-hotspot cached data and the data so determined is removed from the cache. If timing does not start from the time the cached data is first accessed, the number of accesses to the cached data is counted directly over the preset duration.
In addition, in the embodiment of the present invention, the cached data state information may be read from the message queue by multiple tasks in parallel, which increases the speed of the cache judgment processing.
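The following sketch illustrates, with assumed names and assumed threshold values, how data state information taken from the message queue could drive the cache judgment described above. It is one illustrative reading of the rules, not a prescribed implementation; the average loading time is derived from the total loading time and the number of loadings as noted above.

```java
// Sketch of the cache judgment: state records for cached entries are placed on a
// message queue and consumed (possibly by several parallel tasks); entries that
// fail the hotspot tests are removed from the cache.
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.LinkedBlockingQueue;

class CacheJudge {
    record DataState(String key, long lastRequestMillis, int loadCount,
                     long totalLoadMillis, int accessesInWindow) {}

    static final long REQUEST_INTERVAL_MS = 10 * 60_000; // preset request interval threshold
    static final int  LOAD_COUNT_LIMIT    = 100;         // preset loading count threshold
    static final long LOAD_TIME_LIMIT_MS  = 2_000;       // preset loading time threshold
    static final int  ACCESS_COUNT_LIMIT  = 60;          // preset access count threshold

    final Map<String, Object> cache = new ConcurrentHashMap<>();
    final BlockingQueue<DataState> stateQueue = new LinkedBlockingQueue<>();

    /** Take-out strategy: read states one by one in queue order and judge them. */
    void judgeOnce() throws InterruptedException {
        DataState s = stateQueue.take();
        long sinceLastRequest = System.currentTimeMillis() - s.lastRequestMillis();
        long avgLoadMillis = s.loadCount() == 0 ? 0 : s.totalLoadMillis() / s.loadCount();

        boolean nonHotspot = sinceLastRequest > REQUEST_INTERVAL_MS
                || s.accessesInWindow() < ACCESS_COUNT_LIMIT;
        boolean hotButTimeControlled = s.loadCount() > LOAD_COUNT_LIMIT
                || avgLoadMillis > LOAD_TIME_LIMIT_MS;   // "hotspot but time-controlled" data

        if (nonHotspot || hotButTimeControlled) {
            cache.remove(s.key());                       // remove from the cache
        }
    }
}
```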
Optionally, when the cache parameter information includes an expiration time, the method of the embodiment of the present invention further includes:
obtaining the expiration time of the cached data and the processing duration of loading the data from the database and caching it, and subtracting the processing duration from the expiration time to obtain an advance processing time;
within the obtained advance processing time, if no business data loading takes place, loading from the database, and caching, the updated data corresponding to the cached data that reaches its expiration time.
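An illustrative sketch of one possible reading of this advance refresh is given below: the reload is started at the advance processing time so that the updated data is cached by the time the expiration time arrives, provided no business-triggered load has happened in the meantime. The scheduling mechanism and all class names are assumptions, not part of this disclosure.

```java
// Sketch of the advance refresh: the loading-and-caching duration is subtracted
// from the expiration time, and the entry is reloaded at that moment unless a
// business access has already refreshed it.
import java.util.Map;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.function.Supplier;

class AdvanceRefresher {
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    void schedule(String key, long expireMillis, long loadAndCacheMillis,
                  AtomicBoolean businessLoadedMeanwhile, Supplier<Object> reload,
                  Map<String, Object> cache) {
        // advance processing time = expiration time - processing duration
        long advanceMillis = Math.max(0, expireMillis - loadAndCacheMillis);
        scheduler.schedule(() -> {
            if (!businessLoadedMeanwhile.get()) {        // no business load in the window
                cache.put(key, reload.get());            // load the updated data and re-cache it
            }
        }, advanceMillis, TimeUnit.MILLISECONDS);
    }
}
```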
Optionally, the method of the embodiment of the present invention further includes:
sorting the cached data according to the expiration time, and/or the loading duration, and/or the request frequency of the data.
It should be noted that the sorting method may be obtained by a person skilled in the art through analysis of the system requirements. If the system requires fast request responses, data with a high request frequency may be cached in a forward position in the ordering so that it can be read efficiently; if a long loading duration would affect the reading of other businesses' data, data that takes longer to load may be cached in a later position in the ordering, to avoid affecting the reading of the other cached data.
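By way of illustration only, a sorting rule of this kind could be expressed as a comparator, as in the following sketch with assumed field names: frequently requested data comes first and slow-loading data is pushed towards the back of the ordering.

```java
// Sketch of the optional sorting of cached entries by request frequency and
// loading duration.
import java.util.Comparator;
import java.util.List;

class CacheSorter {
    record CachedEntry(String key, long expireMillis, long loadMillis, long requestCount) {}

    static void sortForFastResponse(List<CachedEntry> entries) {
        entries.sort(Comparator
                .comparingLong(CachedEntry::requestCount).reversed()  // hot data first
                .thenComparingLong(CachedEntry::loadMillis));          // slow loaders last
    }
}
```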
In the method of the embodiment of the present invention, data is cached on the basis of the automatic loading information and the cache parameter information, which avoids a system avalanche caused by a large number of concurrent operations upon cache invalidation; further, by sorting the cached data and performing cache judgment processing on it, both the utilization efficiency of the data in the cache and the reading efficiency of the cached data are improved.
Fig. 3 is a flowchart of a cached data loading method according to another embodiment of the present invention. As shown in Fig. 3, the method includes:
Step 300: scanning a service access layer and identifying interfaces that carry preset automatic loading information;
It should be noted that the automatic loading information may be an automatic loading mark, an identifier, or other similar information by which this kind of interface can be distinguished from other interfaces; it may be configured by a person skilled in the art according to an analysis of whether the data needs to be loaded automatically. In addition, scanning the service layer is a routine operation that is already performed on the service access layer.
Step 301: loading, from the database according to the preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic loading information, and caching the data;
Here, the cache parameter information includes: preset data distinguishing identifier information, and/or caching condition information, and/or an expiration time; wherein the data distinguishing identifier information is generated by combining interface parameters and/or by converting the interface parameters with a preset method function.
Step 302: establishing a message queue containing the data state information of all the cached data; reading, according to a preset take-out strategy, the data state information of one or more pieces of data from the established message queue; and performing cache judgment processing on the data according to the read data state information;
Here, the data state information is collected in advance and includes at least one of the following:
the time of the last request for the cached data; and/or,
the number of times the cached data has been loaded and the time taken by each loading of the cached data; and/or,
the number of times the cached data has been accessed within a preset duration.
It should be noted that the contents of the data state information may be chosen with reference to the system cache traffic, the business access speed and the like; a person skilled in the art may add or remove items according to the usage scenario of the embodiment of the present invention.
Optionally, the cache judgment processing of the data includes:
when the data state information includes the time of the last request for the cached data,
and the difference between the current system time and the time of the last request for the cached data is greater than a preset request interval threshold, determining that the cached data is non-hotspot cached data and removing the data so determined from the cache; and/or,
when the data state information includes the number of times the cached data has been loaded and the time taken by each loading of the cached data,
if the number of loadings of the cached data is greater than a preset loading count threshold and/or the loading time of the cached data is greater than a preset loading time threshold, determining that the cached data is hotspot but time-controlled data and removing the data so determined from the cache; and/or,
when the data state information includes the number of times the cached data has been accessed within the preset duration,
and the number of accesses to the cached data within the preset duration is less than a preset access count threshold, determining that the data is non-hotspot cached data and removing the data so determined from the cache.
It should be noted that the time taken to load the cached data may be calculated from the total time of all loadings of the cached data and the number of loadings. In addition, judging whether the number of accesses to the cached data within the preset duration is less than the preset access count threshold may include: recording the time at which the cached data is first accessed, subtracting that first access time from the current system time to obtain an access duration, and, when the access duration reaches the preset duration, counting the number of times the cached data has been accessed; for example, the number of accesses within one hour after the first access is counted, and if it is less than the access count threshold (for example, 60), the cached data is determined to be non-hotspot cached data and is removed from the cache. If timing does not start from the time the cached data is first accessed, the number of accesses to the cached data is counted directly over the preset duration.
In addition, in the embodiment of the present invention, the cached data state information may be read from the message queue by multiple tasks in parallel, which increases the speed of the cache judgment processing.
Step 303: sorting the cached data according to the expiration time, and/or the loading duration, and/or the request frequency of the data.
It should be noted that the sorting method may be obtained by a person skilled in the art through analysis of the system requirements. If the system requires fast request responses, data with a high request frequency may be cached in a forward position in the ordering so that it can be read efficiently; if a long loading duration would affect the reading of other businesses' data, data that takes longer to load may be cached in a later position in the ordering, to avoid affecting the reading of the other cached data.
Step 304: when the data of the accessed business is the data corresponding to an interface carrying the automatic loading information, reading the cached data to complete the business access processing;
It should be noted that the preset method function may include hash, Message Digest Algorithm 5 (MD5) or another function capable of generating a unique identifier, code or name. Generating the data distinguishing identifier information from the interface parameters may include combining the parameters of an interface according to a set sorting and combining rule, so that the data of each cache entry obtains its own distinguishing identifier; the combined information may resemble the call numbers a library uses to catalogue books. Combining the interface parameters, or converting them with the preset method function, gives the generated data distinguishing identifier information a certain naming rule, which helps maintenance personnel recognise the cached data when analysing and processing it. The embodiment of the present invention may also generate the data distinguishing identifier information in other ways, as long as each generated data distinguishing identifier is unique.
Step 305: when the data of the accessed business is the data corresponding to an interface carrying the automatic loading information, if the data of the accessed business cannot be read from the cache, loading from the database the data of the interface carrying the automatic loading information that corresponds to the data of the accessed business, and caching it.
It should be noted that, here, failing to read the data of the accessed business from the cache includes the case where an error or failure occurs when the data of the accessed business is read from the cached data according to the embodiment of the present invention; whether a data read has failed may be determined using the read-success judgment methods of the related art.
Optionally, when the cache parameter information includes an expiration time, the method of the embodiment of the present invention further includes:
obtaining the expiration time of the cached data and the processing duration of loading the data from the database and caching it, and subtracting the processing duration from the expiration time to obtain an advance processing time;
within the obtained advance processing time, if no business data loading takes place, loading from the database, and caching, the updated data corresponding to the cached data that reaches its expiration time.
Optionally, the method of the embodiment of the present invention further includes:
when data is updated, if the updated data is data on an interface carrying the automatic loading information, updating the data in the cache that corresponds to the updated business.
It should be noted that, here, a data update includes an update of the business data; whether the business data has been updated may be determined by the system from the system parameters involved in the update, for example from the file contents of the configuration log, the system log, the running log and other records of data changes. In addition, the data updating method may follow the related art: locate the storage path of the original data corresponding to the updated data, delete the data on the located storage path, and then load the updated business data and write it into the cache.
Fig. 4 is a structural block diagram of a cached data loading apparatus according to an embodiment of the present invention. As shown in Fig. 4, the apparatus includes: a recognition unit, a caching unit and a reading unit; wherein,
the recognition unit is configured to scan a service access layer and identify interfaces that carry preset automatic loading information;
the caching unit is configured to load, from the database according to the preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic loading information, and to cache the data.
Optionally, the caching unit is further configured to, when the data of the accessed business is the data corresponding to an interface carrying the automatic loading information, if the data of the accessed business cannot be read from the cache, load from the database the data of the interface carrying the automatic loading information that corresponds to the data of the accessed business, and cache it.
Here, failing to read the data of the accessed business from the cache includes the case where an error or failure occurs when the data of the accessed business is read from the cached data according to the embodiment of the present invention; whether a data read has failed may be determined using the read-success judgment methods of the related art.
Optionally, the caching unit is further configured to,
when the cache parameter information includes an expiration time, obtain the expiration time of the cached data and the processing duration of loading the data from the database and caching it, and subtract the processing duration from the expiration time to obtain an advance processing time;
and, within the obtained advance processing time, if no business data loading takes place, load from the database, and cache, the updated data corresponding to the cached data that reaches its expiration time.
The reading unit is configured to, when the data of the accessed business is the data corresponding to an interface carrying the automatic loading information, read the cached data to complete the business access processing;
the cache parameter information includes: preset data distinguishing identifier information, and/or caching condition information, and/or an expiration time;
the data distinguishing identifier information is generated by combining interface parameters and/or by converting the interface parameters with a preset method function.
It should be noted that the automatic loading information may be an automatic loading mark, an identifier, or other similar information by which this kind of interface can be distinguished from other interfaces, and may be configured by a person skilled in the art according to an analysis of whether the data needs to be loaded automatically. In addition, the preset method function may include hash, Message Digest Algorithm 5 (MD5) or another function capable of generating a unique identifier, code or name. Generating the data distinguishing identifier information from the interface parameters may include combining the parameters of an interface according to a set sorting and combining rule, so that the data of each cache entry obtains its own distinguishing identifier; the combined information may resemble the call numbers a library uses to catalogue books. Combining the interface parameters, or converting them with the preset method function, gives the generated data distinguishing identifier information a certain naming rule, which helps maintenance personnel recognise the cached data when analysing and processing it. The embodiment of the present invention may also generate the data distinguishing identifier information in other ways, as long as each generated data distinguishing identifier is unique.
Optionally, the apparatus of the embodiment of the present invention further includes an updating unit,
the updating unit is configured to, when data is updated, if the updated data is data on an interface carrying the automatic loading information, update the data in the cache that corresponds to the updated business.
It should be noted that, here, a data update includes an update of the business data; whether the business data has been updated may be determined by the system from the system parameters involved in the update, for example from the file contents of the configuration log, the system log, the running log and other records of data changes. In addition, the data updating method may follow the related art: locate the storage path of the original data corresponding to the updated data, delete the data on the located storage path, and then load the updated business data and write it into the cache.
Optionally, the apparatus of the embodiment of the present invention further includes a cache processing unit, configured to establish a message queue containing the data state information of all the cached data;
and to read, according to a preset take-out strategy, the data state information of one or more pieces of data from the established message queue, and perform cache judgment processing on the data according to the read data state information.
The data state information is collected in advance and includes at least one of the following:
the time of the last request for the cached data; and/or,
the number of times the cached data has been loaded and the time taken by each loading of the cached data; and/or,
the number of times the cached data has been accessed within a preset duration.
It should be noted that the contents of the data state information may be chosen with reference to the system cache traffic, the business access speed and the like; a person skilled in the art may add or remove items according to the usage scenario of the embodiment of the present invention.
Optionally, the cache processing unit is specifically configured to:
establish a message queue containing the data state information of all the cached data;
read, according to the preset take-out strategy, the data state information of one or more pieces of data from the established message queue;
when the read data state information includes the time of the last request for the cached data,
and the difference between the current system time and the time of the last request for the cached data is greater than a preset request interval threshold, determine that the cached data is non-hotspot cached data and remove the data so determined from the cache; and/or,
when the read data state information includes the number of times the cached data has been loaded and the time taken by each loading of the cached data,
if the number of loadings of the cached data is greater than a preset loading count threshold and/or the loading time of the cached data is greater than a preset loading time threshold, determine that the cached data is hotspot but time-controlled data and remove the data so determined from the cache; and/or,
when the read data state information includes the number of times the cached data has been accessed within the preset duration,
and the number of accesses to the cached data within the preset duration is less than a preset access count threshold, determine that the data is non-hotspot cached data and remove the data so determined from the cache.
It should be noted that the time taken to load the cached data may be calculated from the total time of all loadings of the cached data and the number of loadings. In addition, judging whether the number of accesses to the cached data within the preset duration is less than the preset access count threshold may include: recording the time at which the cached data is first accessed, subtracting that first access time from the current system time to obtain an access duration, and, when the access duration reaches the preset duration, counting the number of times the cached data has been accessed; for example, the number of accesses within one hour after the first access is counted, and if it is less than the access count threshold (for example, 60), the cached data is determined to be non-hotspot cached data and is removed from the cache. If timing does not start from the time the cached data is first accessed, the number of accesses to the cached data is counted directly over the preset duration.
In addition, in the embodiment of the present invention, the cached data state information may be read from the message queue by multiple tasks in parallel, which increases the speed of the cache judgment processing.
The apparatus of the embodiment of the present invention further includes a sorting unit,
the sorting unit is configured to sort the cached data according to the expiration time, and/or the loading duration, and/or the request frequency of the data.
It should be noted that the sorting method may be obtained by a person skilled in the art through analysis of the system requirements. If the system requires fast request responses, data with a high request frequency may be cached in a forward position in the ordering so that it can be read efficiently; if a long loading duration would affect the reading of other businesses' data, data that takes longer to load may be cached in a later position in the ordering, to avoid affecting the reading of the other cached data.
The apparatus of the embodiment of the present invention may be arranged on a server for operation, or may operate after being communicatively connected to a server.
A cached data loading apparatus of another embodiment of the present invention includes: a recognition unit, a caching unit, a reading unit, an updating unit, a cache processing unit and a sorting unit; wherein,
the recognition unit is configured to scan a service access layer and identify interfaces that carry preset automatic loading information;
the caching unit is configured to load, from the database according to the preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic loading information, and to cache the data;
the reading unit is configured to, when the data of the accessed business is the data corresponding to an interface carrying the automatic loading information, read the cached data to complete the business access processing;
the cache parameter information includes: preset data distinguishing identifier information, and/or caching condition information, and/or an expiration time;
the data distinguishing identifier information is generated by combining interface parameters and/or by converting the interface parameters with a preset method function.
It should be noted that the automatic loading information may be an automatic loading mark, an identifier, or other similar information by which this kind of interface can be distinguished from other interfaces, and may be configured by a person skilled in the art according to an analysis of whether the data needs to be loaded automatically. In addition, the preset method function may include hash, Message Digest Algorithm 5 (MD5) or another function capable of generating a unique identifier, code or name. Generating the data distinguishing identifier information from the interface parameters may include combining the parameters of an interface according to a set sorting and combining rule, so that the data of each cache entry obtains its own distinguishing identifier; the combined information may resemble the call numbers a library uses to catalogue books. Combining the interface parameters, or converting them with the preset method function, gives the generated data distinguishing identifier information a certain naming rule, which helps maintenance personnel recognise the cached data when analysing and processing it. The embodiment of the present invention may also generate the data distinguishing identifier information in other ways, as long as each generated data distinguishing identifier is unique.
The updating unit is configured to, when data is updated, if the updated data is data on an interface carrying the automatic loading information, update the data in the cache that corresponds to the updated business.
It should be noted that, here, a data update includes an update of the business data; whether the business data has been updated may be determined by the system from the system parameters involved in the update, for example from the file contents of the configuration log, the system log, the running log and other records of data changes. In addition, the data updating method may follow the related art: locate the storage path of the original data corresponding to the updated data, delete the data on the located storage path, and then load the updated business data and write it into the cache.
The cache processing unit is configured to establish a message queue containing the data state information of all the cached data;
and to read, according to a preset take-out strategy, the data state information of one or more pieces of data from the established message queue, and perform cache judgment processing on the data according to the read data state information.
The data state information is collected in advance and includes at least one of the following:
the time of the last request for the cached data; and/or,
the number of times the cached data has been loaded and the time taken by each loading of the cached data; and/or,
the number of times the cached data has been accessed within a preset duration.
It should be noted that the contents of the data state information may be chosen with reference to the system cache traffic, the business access speed and the like; a person skilled in the art may add or remove items according to the usage scenario of the embodiment of the present invention.
Optionally, the cache processing unit is specifically configured to:
establish a message queue containing the data state information of all the cached data;
read, according to the preset take-out strategy, the data state information of one or more pieces of data from the established message queue;
when the read data state information includes the time of the last request for the cached data,
and the difference between the current system time and the time of the last request for the cached data is greater than a preset request interval threshold, determine that the cached data is non-hotspot cached data and remove the data so determined from the cache; and/or,
when the read data state information includes the number of times the cached data has been loaded and the time taken by each loading of the cached data,
if the number of loadings of the cached data is greater than a preset loading count threshold and/or the loading time of the cached data is greater than a preset loading time threshold, determine that the cached data is hotspot but time-controlled data and remove the data so determined from the cache; and/or,
when the read data state information includes the number of times the cached data has been accessed within the preset duration,
and the number of accesses to the cached data within the preset duration is less than a preset access count threshold, determine that the data is non-hotspot cached data and remove the data so determined from the cache.
It should be noted that the time taken to load the cached data may be calculated from the total time of all loadings of the cached data and the number of loadings. In addition, judging whether the number of accesses to the cached data within the preset duration is less than the preset access count threshold may include: recording the time at which the cached data is first accessed, subtracting that first access time from the current system time to obtain an access duration, and, when the access duration reaches the preset duration, counting the number of times the cached data has been accessed; for example, the number of accesses within one hour after the first access is counted, and if it is less than the access count threshold (for example, 60), the cached data is determined to be non-hotspot cached data and is removed from the cache. If timing does not start from the time the cached data is first accessed, the number of accesses to the cached data is counted directly over the preset duration.
In addition, in the embodiment of the present invention, the cached data state information may be read from the message queue by multiple tasks in parallel, which increases the speed of the cache judgment processing.
The sorting unit is configured to sort the cached data according to the expiration time, and/or the loading duration, and/or the request frequency of the data.
It should be noted that the sorting method can be determined by those skilled in the art through analysis of the system requirements. If the system requires fast request responses, frequently requested data can be placed near the front of the order so that it can be read efficiently; if a long loading duration would delay reads of other business data, data whose loading takes a long time can be placed near the back of the order, so that reloading it does not affect reads of the other cached data.
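The following sketch shows one possible sorting rule of the kind described above, assuming hypothetical entry fields for request frequency and loading duration; it places frequently requested entries first and, among equals, cheaply reloaded entries before slow ones:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical ordering: frequently requested entries first so they are read
// efficiently; slow-to-load entries later so a reload does not delay reads of
// other cached data. Field names are illustrative.
public class CacheSorting {
    static class Entry {
        String key;
        long requestCount;     // how often the entry is requested
        long loadMillis;       // how long a reload of the entry takes
        Entry(String key, long requestCount, long loadMillis) {
            this.key = key; this.requestCount = requestCount; this.loadMillis = loadMillis;
        }
    }

    public static void main(String[] args) {
        List<Entry> entries = new ArrayList<>();
        entries.add(new Entry("a", 500, 20));
        entries.add(new Entry("b", 10, 300));
        entries.add(new Entry("c", 500, 5));

        entries.sort(Comparator
                .comparingLong((Entry e) -> e.requestCount).reversed()  // high request frequency first
                .thenComparingLong(e -> e.loadMillis));                 // among equals, cheap loads first
        entries.forEach(e -> System.out.println(e.key));                // prints c, a, b
    }
}
```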
Optionally, the caching unit is further configured to: when the data of the accessed business is the data corresponding to an interface carrying the automatic load information, and the data of the accessed business is not obtained when reading the cached data, load from the database the data of the interface carrying the automatic load information that corresponds to the accessed business, and cache it.
Here, failing to obtain the data of the accessed business when reading the cached data includes: an error or failure occurring while the data of the business is read from the cached data according to the embodiment of the present invention; whether a read has succeeded or failed can be determined with the read-success judgement methods of the related art.
Optionally, the caching unit is further configured to:
when the cache parameter information includes an expiration time, obtain the expiration time of the cached data and the processing duration needed to load the data from the database and cache it, and subtract the processing duration from the expiration time to obtain an advance processing time;
if no business-triggered load of the data has taken place by the obtained advance processing time, load from the database the updated data corresponding to the cached data whose expiration time is about to be reached, and cache it, so that the refreshed data is in place when the expiration time arrives.
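One way the advance processing time could be used is sketched below with a hypothetical single-thread scheduler; the expiration time and processing duration values are illustrative, and the refresh body is left as a comment:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Advance processing time = expiration time minus the measured load-and-cache
// duration, so the refreshed value can be in place when the old one expires.
public class AdvanceRefresh {
    public static void main(String[] args) {
        long nowMillis = System.currentTimeMillis();
        long expiryMillis = nowMillis + 60_000;      // cached entry expires in 60 s (illustrative)
        long processingMillis = 2_000;               // measured time to load from DB and write to cache
        long advanceProcessingMillis = expiryMillis - processingMillis;

        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        long delay = advanceProcessingMillis - nowMillis;
        scheduler.schedule(() -> {
            // If no business request has already reloaded the entry in the meantime,
            // load the updated data from the database and write it back to the cache here.
            System.out.println("refreshing entry ahead of its expiry");
            scheduler.shutdown();
        }, delay, TimeUnit.MILLISECONDS);
    }
}
```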
The device of the embodiment of the present invention can be deployed and run on a server, or can operate after establishing a communication connection with a server.
The embodiments of the present invention are described in detail below through application examples. The application examples are only used to illustrate the method of the present invention and are not intended to limit the protection scope of the present invention.
Application example 1
Fig. 5 is a flow chart of the method of application example 1 of the present invention. As shown in Fig. 5, the method includes:
Step 500: scan the service access layer, read the interfaces on which automatic load information is set, and determine from the automatic load information whether the data is to be loaded automatically. In this application example, the automatic load information can include an automatic load flag.
Step 501: load from the database, according to the preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic load information, and cache the data. Here, the cached data can be written to a cache center.
In this application example, the cache parameter information includes: preset data separation identification information, and/or cache condition information, and/or an expiration time; the data separation identification information is generated by combining the interface parameters and/or by transforming the interface parameters with a preset method function.
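The two ways of generating the data separation identification information mentioned above, namely combining the interface parameters or transforming them with a preset method function, could look like the following sketch; the choice of MD5 as the preset method function and all names are assumptions, not prescribed by this application example:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Derive a cache key either by concatenating the interface parameters or by
// hashing the concatenation with a preset method function (MD5 here, for illustration).
public class CacheKeyFactory {
    static String byConcatenation(String interfaceName, Object... params) {
        StringBuilder sb = new StringBuilder(interfaceName);
        for (Object p : params) sb.append(':').append(p);
        return sb.toString();
    }

    static String byHash(String interfaceName, Object... params) throws NoSuchAlgorithmException {
        MessageDigest md5 = MessageDigest.getInstance("MD5");
        byte[] digest = md5.digest(byConcatenation(interfaceName, params).getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) hex.append(String.format("%02x", b));
        return interfaceName + ':' + hex;
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        System.out.println(byConcatenation("getProductDetail", 42L, "zh-CN"));
        System.out.println(byHash("getProductDetail", 42L, "zh-CN"));
    }
}
```

The hashed form keeps keys short and uniform when the interface parameters are long; the concatenated form keeps them human-readable.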
Step 502: when the accessed business data is the data corresponding to an interface carrying the automatic load information, judge whether the cached data can be read. If the cached data is read, perform step 5030; if the cached data is not read, perform step 5040.
Step 5030: read the cached data and perform the business access processing.
Step 5040: load from the database the data of the interface carrying the automatic load information that corresponds to the accessed business, and cache it. After step 5040 is performed, the application example can perform the business access processing based on the cached data, i.e. step 5030 can be performed.
In this application example, after the data has been cached, the result set of the data can be returned.
Step 5031: when the business data is updated, update the data in the cache that corresponds to the updated business.
A business data update can include, for example, modifications made through back-office management to product details, such as a configuration parameter modification or a product type modification.
It should be noted that when the cache is updated, the updated data is also written synchronously to the database for the corresponding update according to the processing methods of the related art.
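The read path of steps 502/5030/5040 and the update path of step 5031 can be summarized in the following hypothetical sketch, in which plain maps stand in for the cache center and the database:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Read path: serve from the cache when possible, otherwise load from the database
// and cache the result; update path: refresh the cached value and write the change
// through to the database. Both stores are simple in-memory maps for illustration.
public class AccessFlow {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final Map<String, String> database = new ConcurrentHashMap<>();

    String access(String key) {
        String value = cache.get(key);          // step 502: try the cache first
        if (value != null) return value;        // step 5030: cache hit, serve directly
        value = database.get(key);              // step 5040: miss, load from the database
        if (value != null) cache.put(key, value);
        return value;
    }

    void update(String key, String newValue) {  // step 5031: business data changed
        database.put(key, newValue);            // write the update through to the database
        cache.put(key, newValue);               // refresh the corresponding cached entry
    }

    public static void main(String[] args) {
        AccessFlow flow = new AccessFlow();
        flow.database.put("product:42", "v1");
        System.out.println(flow.access("product:42")); // miss, loads v1 and caches it
        flow.update("product:42", "v2");
        System.out.println(flow.access("product:42")); // hit, returns the refreshed v2
    }
}
```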
Application example 2
This application example first needs to establish a message queue containing the data state information of all the cached data. The data state information is gathered in advance and includes: the time of the most recent request for the cached data, the number of times the cached data has been loaded and the time taken by each load of the cached data, and/or the number of times the cached data has been accessed within a preset duration. In this application example, a single process reads the first item of data state information in the message queue each time; this case is used as the illustration. Fig. 6 is a flow chart of the method of application example 2 of the present invention. As shown in Fig. 6, the method includes:
Step 600: read the data state information at the head of the current queue from the message queue.
Step 601: obtain from the data state information the time of the most recent request for the cached data, and read the current system time.
It should be noted that information such as the request time and the current system time can be obtained with the acquisition methods of the related art.
Step 602: when the difference between the current system time and the most recent request time of the cached data exceeds the preset request interval threshold, determine that the cached data is non-hot cached data.
The request interval threshold can be set based on an analysis of parameters such as the business access timeliness requirements and the system performance.
Step 603: remove the data determined to be non-hot cached data from the cache.
Step 604: obtain from the data state information the number of times the cached data has been accessed within the preset duration.
Step 605: when the access count of the cached data within the preset duration is below the preset access count threshold, determine that the data is non-hot cached data; the non-hot cached data is processed according to step 603.
In this application example, the preset duration can be 1 hour and the access count threshold can be 60.
Step 606: obtain from the data state information the number of times the cached data has been loaded and the time taken by each load of the cached data.
Step 607: if the load count of the cached data exceeds the preset load count threshold and/or the load time of the cached data exceeds the preset load time threshold, determine that the cached data is hot but time-costly data.
In this application example, the load count threshold can be 100 and the load time threshold can be 10 to 100 milliseconds; the actual values can be determined from the number of requests.
Step 608: remove the data determined to be hot but time-costly from the cache.
It should be noted that while the above steps are performed, the data state information is updated according to the content of the steps performed.
This application example also includes: sorting the cached data according to the expiration time, and/or the loading duration, and/or the request frequency of the data.
It should be noted that the sorting processing can be carried out by starting a single process which sorts the message queue according to the configured strategy. The sorting algorithm may include: the closer the cached data is to its expiration time and/or the more time-consuming it is to load, the further forward it is sorted; and sorting in reverse order of request count, since a higher request count indicates a higher usage frequency and therefore potentially greater concurrency.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method can be completed by a program instructing the related hardware (such as a processor), and the program can be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk or an optical disc. Optionally, all or part of the steps of the above embodiments can also be implemented with one or more integrated circuits. Accordingly, each module/unit in the above embodiments can be implemented in the form of hardware, for example by an integrated circuit realizing its corresponding function, or in the form of a software function module, for example by a processor executing a program/instruction stored in a memory to realize its corresponding function. The present invention is not restricted to any particular form of combination of hardware and software.
Although the embodiments are disclosed above, the content described is only an embodiment adopted to facilitate understanding of the present invention and is not intended to limit the present invention. Any person skilled in the art to which the present invention pertains can make modifications and changes in the form and details of implementation without departing from the spirit and scope disclosed herein, but the patent protection scope of the present invention must still be as defined by the appended claims.

Claims (10)

1. A device for loading cached data, characterized by comprising: an identification unit, a caching unit and a reading unit; wherein,
the identification unit is configured to scan the service access layer and identify interfaces carrying preset automatic load information;
the caching unit is configured to load from the database, according to preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic load information, and to cache the data;
the reading unit is configured to, when the data of the accessed business is the data corresponding to an interface carrying the automatic load information, read the cached data to perform the business access processing;
the cache parameter information includes: preset data separation identification information, and/or cache condition information, and/or an expiration time;
the data separation identification information is generated by combining the interface parameters and/or by transforming the interface parameters with a preset method function.
2. The device according to claim 1, characterized in that the caching unit is further configured to: when the data of the accessed business is the data corresponding to an interface carrying the automatic load information, and the data of the accessed business is not obtained when reading the cached data, load from the database the data of the interface carrying the automatic load information that corresponds to the accessed business, and cache it.
3. The device according to claim 1, characterized in that the device further comprises an updating unit, wherein
the updating unit is configured to, when data is updated and the updated data is data on an interface carrying the automatic load information, update the data in the cache that corresponds to the updated business.
4. The device according to any one of claims 1 to 3, characterized in that the device further comprises a cache processing unit configured to:
establish a message queue containing the data state information of all the cached data;
read, according to a preset take-out strategy, the data state information of one or more pieces of data from the established message queue, and perform cache judgement processing on the data according to the data state information read;
wherein the data state information is gathered in advance and includes at least one of:
the time of the most recent request for the cached data; and/or,
the number of times the cached data has been loaded and the time taken by each load of the cached data; and/or,
the number of times the cached data has been accessed within a preset duration.
5. The device according to claim 4, characterized in that the cache processing unit is specifically configured to:
establish the message queue containing the data state information of all the cached data;
read, according to the preset take-out strategy, the data state information of one or more pieces of data from the established message queue;
when the data state information read includes the time of the most recent request for the cached data, and the difference between the current system time and the most recent request time of the cached data exceeds a preset request interval threshold, determine that the cached data is non-hot cached data, and remove the data determined to be non-hot cached data from the cache; and/or,
when the data state information read includes the number of times the cached data has been loaded and the time taken by each load of the cached data, and the load count of the cached data exceeds a preset load count threshold and/or the load time of the cached data exceeds a preset load time threshold, determine that the cached data is hot but time-costly data, and remove the data determined to be hot but time-costly from the cache; and/or,
when the data state information read includes the number of times the cached data has been accessed within the preset duration, and the access count of the cached data within the preset duration is below a preset access count threshold, determine that the data is non-hot cached data, and remove the data determined to be non-hot cached data from the cache.
6. The device according to any one of claims 1 to 3, characterized in that the caching unit is further configured to:
when the cache parameter information includes an expiration time, obtain the expiration time of the cached data and the processing duration needed to load the data from the database and cache it, and subtract the processing duration from the expiration time to obtain an advance processing time;
if no business-triggered load of the data has taken place by the obtained advance processing time, load from the database the updated data corresponding to the cached data whose expiration time is reached when that expiration time arrives, and cache it.
7. The device according to any one of claims 1 to 3, characterized in that the device further comprises a sorting unit, wherein
the sorting unit is configured to sort the cached data according to the expiration time, and/or the loading duration, and/or the request frequency of the data.
8. A method for loading cached data, characterized by comprising:
scanning the service access layer and identifying interfaces carrying preset automatic load information;
loading from the database, according to preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic load information, and caching the data;
when the data of the accessed business is the data corresponding to an interface carrying the automatic load information, reading the cached data to perform the business access processing;
wherein the cache parameter information includes: preset data separation identification information, and/or cache condition information, and/or an expiration time;
and the data separation identification information is generated by combining the interface parameters and/or by transforming the interface parameters with a preset method function.
9. The method according to claim 8, characterized in that the method further comprises:
establishing a message queue containing the data state information of all the cached data;
reading, according to a preset take-out strategy, the data state information of one or more pieces of data from the established message queue, and performing cache judgement processing on the data according to the data state information read;
wherein the data state information is gathered in advance and includes at least one of:
the time of the most recent request for the cached data; and/or,
the number of times the cached data has been loaded and the time taken by each load of the cached data; and/or,
the number of times the cached data has been accessed within a preset duration.
10. The method according to claim 9, characterized in that the cache judgement processing on the data includes:
when the data state information includes the time of the most recent request for the cached data, and the difference between the current system time and the most recent request time of the cached data exceeds a preset request interval threshold, determining that the cached data is non-hot cached data, and removing the data determined to be non-hot cached data from the cache; and/or,
when the data state information includes the number of times the cached data has been loaded and the time taken by each load of the cached data, and the load count of the cached data exceeds a preset load count threshold and/or the load time of the cached data exceeds a preset load time threshold, determining that the cached data is hot but time-costly data, and removing the data determined to be hot but time-costly from the cache; and/or,
when the data state information includes the number of times the cached data has been accessed within the preset duration, and the access count of the cached data within the preset duration is below a preset access count threshold, determining that the data is non-hot cached data, and removing the data determined to be non-hot cached data from the cache.
CN201610324104.7A 2016-05-16 2016-05-16 Method and device for loading cached data Active CN106021445B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610324104.7A CN106021445B (en) 2016-05-16 2016-05-16 Method and device for loading cached data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610324104.7A CN106021445B (en) 2016-05-16 2016-05-16 Method and device for loading cached data

Publications (2)

Publication Number Publication Date
CN106021445A true CN106021445A (en) 2016-10-12
CN106021445B CN106021445B (en) 2019-10-15

Family

ID=57097977

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610324104.7A Active CN106021445B (en) Method and device for loading cached data

Country Status (1)

Country Link
CN (1) CN106021445B (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106815287A (en) * 2016-12-06 2017-06-09 ***股份有限公司 A kind of buffer memory management method and device
CN106843769A (en) * 2017-01-23 2017-06-13 北京齐尔布莱特科技有限公司 A kind of interface data caching method, device and computing device
CN106874124A (en) * 2017-03-30 2017-06-20 光科技股份有限公司 A kind of object-oriented power information acquisition terminal based on the quick loading techniques of SQLite
CN107463598A (en) * 2017-06-09 2017-12-12 中国邮政储蓄银行股份有限公司 Distributed cache system
CN108829743A (en) * 2018-05-24 2018-11-16 平安科技(深圳)有限公司 Data cached update method, device, computer equipment and storage medium
WO2019019382A1 (en) * 2017-07-27 2019-01-31 上海壹账通金融科技有限公司 Cache handling method and device, computer device and storage medium
CN109471875A (en) * 2018-09-25 2019-03-15 网宿科技股份有限公司 Based on data cached temperature management method, server and storage medium
CN109597915A (en) * 2018-09-18 2019-04-09 北京微播视界科技有限公司 Access request treating method and apparatus
CN110109956A (en) * 2019-03-21 2019-08-09 福建天泉教育科技有限公司 A kind of method and terminal for preventing caching from penetrating
CN110555744A (en) * 2018-05-31 2019-12-10 阿里巴巴集团控股有限公司 Service data processing method and system
CN110895474A (en) * 2018-08-24 2020-03-20 深圳市鸿合创新信息技术有限责任公司 Terminal micro-service device and method and electronic equipment
CN111984889A (en) * 2020-02-21 2020-11-24 广东三维家信息科技有限公司 Caching method and system
CN112115074A (en) * 2020-09-02 2020-12-22 紫光云(南京)数字技术有限公司 Method for realizing data resident memory by using automatic loading mechanism
CN112559572A (en) * 2020-12-22 2021-03-26 上海悦易网络信息技术有限公司 Method and equipment for preheating data cache of Key-Value cache system
WO2021244067A1 (en) * 2020-06-05 2021-12-09 苏州浪潮智能科技有限公司 Method for diluting cache space, and device and medium


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102170479A (en) * 2011-05-21 2011-08-31 成都市华为赛门铁克科技有限公司 Updating method of Web buffer and updating device of Web buffer
WO2014123127A1 (en) * 2013-02-06 2014-08-14 Square Enix Holdings Co., Ltd. Image processing apparatus, method of controlling the same, program and storage medium
CN103488581A (en) * 2013-09-04 2014-01-01 用友软件股份有限公司 Data caching system and data caching method
CN105302493A (en) * 2015-11-19 2016-02-03 浪潮(北京)电子信息产业有限公司 Swap-in and swap-out control method and system for SSD cache in mixed storage array

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106815287A (en) * 2016-12-06 2017-06-09 ***股份有限公司 A kind of buffer memory management method and device
CN106843769B (en) * 2017-01-23 2019-08-02 北京齐尔布莱特科技有限公司 A kind of interface data caching method, device and calculate equipment
CN106843769A (en) * 2017-01-23 2017-06-13 北京齐尔布莱特科技有限公司 A kind of interface data caching method, device and computing device
CN106874124A (en) * 2017-03-30 2017-06-20 光科技股份有限公司 A kind of object-oriented power information acquisition terminal based on the quick loading techniques of SQLite
CN106874124B (en) * 2017-03-30 2023-04-14 光一科技股份有限公司 SQLite rapid loading technology-based object-oriented electricity utilization information acquisition terminal
CN107463598A (en) * 2017-06-09 2017-12-12 中国邮政储蓄银行股份有限公司 Distributed cache system
WO2019019382A1 (en) * 2017-07-27 2019-01-31 上海壹账通金融科技有限公司 Cache handling method and device, computer device and storage medium
WO2019223137A1 (en) * 2018-05-24 2019-11-28 平安科技(深圳)有限公司 Cache data update method and apparatus, computer device, and storage medium
CN108829743A (en) * 2018-05-24 2018-11-16 平安科技(深圳)有限公司 Data cached update method, device, computer equipment and storage medium
CN110555744A (en) * 2018-05-31 2019-12-10 阿里巴巴集团控股有限公司 Service data processing method and system
CN110895474A (en) * 2018-08-24 2020-03-20 深圳市鸿合创新信息技术有限责任公司 Terminal micro-service device and method and electronic equipment
CN109597915A (en) * 2018-09-18 2019-04-09 北京微播视界科技有限公司 Access request treating method and apparatus
CN109597915B (en) * 2018-09-18 2022-03-01 北京微播视界科技有限公司 Access request processing method and device
CN109471875A (en) * 2018-09-25 2019-03-15 网宿科技股份有限公司 Based on data cached temperature management method, server and storage medium
CN109471875B (en) * 2018-09-25 2021-08-20 网宿科技股份有限公司 Hot degree management method based on cache data, server and storage medium
CN110109956A (en) * 2019-03-21 2019-08-09 福建天泉教育科技有限公司 A kind of method and terminal for preventing caching from penetrating
CN111984889A (en) * 2020-02-21 2020-11-24 广东三维家信息科技有限公司 Caching method and system
WO2021244067A1 (en) * 2020-06-05 2021-12-09 苏州浪潮智能科技有限公司 Method for diluting cache space, and device and medium
US11687271B1 (en) 2020-06-05 2023-06-27 Inspur Suzhou Intelligent Technology Co., Ltd. Method for diluting cache space, and device and medium
CN112115074A (en) * 2020-09-02 2020-12-22 紫光云(南京)数字技术有限公司 Method for realizing data resident memory by using automatic loading mechanism
CN112559572A (en) * 2020-12-22 2021-03-26 上海悦易网络信息技术有限公司 Method and equipment for preheating data cache of Key-Value cache system

Also Published As

Publication number Publication date
CN106021445B (en) 2019-10-15

Similar Documents

Publication Publication Date Title
CN106021445A (en) Cached data loading method and apparatus
CN107391653B (en) Distributed NewSQL database system and picture data storage method
CN102667772B (en) File level hierarchical storage management system, method, and apparatus
CN102426594B (en) Method and system for operating database
CN101493826B (en) Database system based on WEB application and data management method thereof
US9304966B2 (en) Providing local access to managed content
CN107391758B (en) Database switching method, device and equipment
CN102523285B (en) Storage caching method of object-based distributed file system
CN111475757A (en) Page updating method and device
CN103390041A (en) Method and system for providing data service based on middleware
CN105912428B (en) Realize that source data is converted into the system and method for virtual machine image in real time
CN111901294A (en) Method for constructing online machine learning project and machine learning system
WO2016115957A1 (en) Method and device for accelerating computers and intelligent devices for users and applications
US20120284231A1 (en) Distributed, asynchronous and fault-tolerant storage system
WO2014155553A1 (en) Information processing method for distributed processing, information processing device and program, and distributed processing system
CN104104717A (en) Inputting channel data statistical method and device
US11507277B2 (en) Key value store using progress verification
CN104133783B (en) Method and device for processing distributed cache data
CN106021566A (en) Method, device and system for improving concurrent processing capacity of single database
CN107368608A (en) The HDFS small documents buffer memory management methods of algorithm are replaced based on ARC
CN104778271A (en) Video data caching method and device
CN110287152A (en) A kind of method and relevant apparatus of data management
CN104461929B (en) Distributed data cache method based on blocker
CN110008197A (en) A kind of data processing method, system and electronic equipment and storage medium
US11080239B2 (en) Key value store using generation markers

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant