WO2013004268A1 - Caching content - Google Patents

Caching content

Info

Publication number
WO2013004268A1
Authority
WO
WIPO (PCT)
Prior art keywords
content
cache
proactively
cached
register
Prior art date
Application number
PCT/EP2011/061116
Other languages
French (fr)
Inventor
Janne Einari TUONONEN
Ville Petteri POYHONEN
Ove Bjorn STRANDBERG
Hannu Flinck
Original Assignee
Nokia Siemens Networks Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Siemens Networks Oy filed Critical Nokia Siemens Networks Oy
Priority to PCT/EP2011/061116
Publication of WO2013004268A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/957 Browsing optimisation, e.g. caching or content distillation
    • G06F16/9574 Browsing optimisation, e.g. caching or content distillation, of access to content, e.g. by caching

Definitions

  • the present invention relates to caching content and, in particular, to proactive caching of content in network architectures.
  • Content distribution and delivery is one of the main uses and purposes of many network architectures, in particular, the Internet.
  • Most network architectures, including the Internet, are based on an "End-to-End" communication paradigm where communication is between two fixed, or near fixed, points in the network. For example, a first fixed point of a user device (e.g. a personal computer) requests content from a second fixed point of a server and receives the requested content from the server by a particular route or path through the network.
  • the present invention seeks to address, at least in part, some or all of the drawbacks and disadvantages described above. Furthermore, the present invention seeks to address, at least in part, the need to enable the implementation of information centric and/or publish/subscribe paradigms on existing network architectures that are based on end-to-end communication mechanisms.
  • a method comprising the steps of: monitoring one or more requests for content; determining whether to proactively cache the content based on one or more first policies; and transmitting an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive.
  • the method may be implemented by a computing device, a server that may be a separate entity.
  • the method may be implemented as a function or module either separate to or integrated with another system.
  • the requests for content where the content may be any content (e.g. web page, information, applications, resources, data, subscriptions, and so on) in a network, such as the Internet, are monitored.
  • the requests for content may be from user equipment, for example, mobile devices (e.g. mobile phones, tablets, laptops, etc), from network operator servers, other party servers or equipment and so on. Based on one or more policies it is determined whether to proactively cache the content in the request that is monitored. Proactively caching relates to making a proactive decision to cache content. Thus, instead of a cache having to be located on the delivery path and being reactive by taking content that is being delivered to an end user, the method decides whether to cache content based on the request for content and then instructs a cache to proactively seek and obtain the content.
  • One or more policies, where policies may include filters, rules, conditions, and so on, may be applied in order to determine whether to proactively cache the content.
  • the policy may be to cache everything.
  • Policies may be to cache everything from one content provider but not from another content provider.
  • Policies may be to cache content once there have been a predefined number of requests for the content.
  • Policies may relate to content providers (e.g. stating which content providers wish their content to be cached, where or by whom the content may be cached etc), relate to network operators or relate to users/subscribers (e.g. the identity of the user).
  • Policies may include whitelists and/or blacklists in relation to content, content providers, users/subscribers and so on.
  • the policies may also define actions to be taken if certain content is to be cached, the request comes from a certain user/subscriber, and so on.
  • As will be appreciated, any number of policies may be defined and applied to determine whether or not to proactively cache the content. The policies may also define any description of when, how, why or what to cache. If it is determined to proactively cache the content then an instruction message may be transmitted to one or more caches, as sketched below. The instruction message may indicate the content to cache. The instruction message may indicate the location (e.g. address) of the content, for example, the location may be that of the content provider or of another cache or a combination thereof. The instruction message may be sent to more than one cache to ensure that several caches cache the content and thereby enable efficient and local delivery of the content to users.
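To make this flow concrete, the following is a minimal sketch of a monitor that watches requests, applies a simple request-count threshold as its first policy, and transmits an instruction message to the known caches. The class names, the dictionary-based instruction message and the cache interface are illustrative assumptions, not part of the described method.

```python
# A minimal sketch of the first-aspect flow, assuming a simple request-count
# threshold as the "first policy". Class names, the dictionary-based
# instruction message and the cache API are illustrative, not from the patent.
from collections import defaultdict

class ProactiveCacheMonitor:
    def __init__(self, caches, threshold=3):
        self.caches = caches                    # known caches (hypothetical objects)
        self.threshold = threshold              # first policy: cache after n requests
        self.request_counts = defaultdict(int)  # content id -> number of requests seen

    def on_request(self, content_id, source_address):
        """Called for every monitored request for content."""
        self.request_counts[content_id] += 1
        if self.request_counts[content_id] == self.threshold:
            self.instruct_caches(content_id, source_address)

    def instruct_caches(self, content_id, source_address):
        # Instruction message indicating the content to cache and where to get it.
        instruction = {"content": content_id, "source": source_address}
        for cache in self.caches:
            cache.receive_instruction(instruction)  # hypothetical cache interface
```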
  • the step of monitoring one or more requests for content may further comprise the step of analysing each request for content to identify content being requested.
  • In information centric networks the method may monitor the requests by, for example, interpreting the request protocol to detect identifiers of the content.
  • In more traditional networks the monitoring may be implemented using Deep Packet Inspection (DPI) or via a Uniform Resource Locator (URL) parser mechanism to identify the content, as sketched below.
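As an illustration of the URL parser approach, the sketch below extracts a content identifier from a plain HTTP GET request line. The function name and the choice of host plus path as the identifier are assumptions made for this example only.

```python
# A sketch of content identification with a URL parser, assuming plain HTTP
# GET request lines are visible to the monitor; the identifier format
# (host + path) is an assumption for illustration.
from urllib.parse import urlsplit

def identify_content(request_line):
    """Extract a content identifier from a request line such as
    'GET http://example.com/videos/clip.mp4 HTTP/1.1'."""
    method, url, _version = request_line.split(" ", 2)
    if method != "GET":
        return None
    parts = urlsplit(url)
    # A DPI-based monitor could derive a richer identifier from packet payloads.
    return f"{parts.netloc}{parts.path}"

# identify_content("GET http://example.com/videos/clip.mp4 HTTP/1.1")
# -> "example.com/videos/clip.mp4"
```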
  • the method may further comprise the steps of receiving the one or more first policies; and storing the one or more first policies. Any party may be able to define and store a policy to be applied when determining whether to proactively cache the content. The policies that are defined may be stored and maintained.
  • the policies may be updated or changed by any party (e.g. content provider, network operator, user/subscriber, etc). Therefore, the policies may be adapted, changed or altered depending on the needs or wishes of a party.
  • the policies may be updated or changed depending on the network conditions, e.g. if the network is heavily loaded then more content may be cached to distribute the load more efficiently and effectively.
  • the method may further comprise receiving an updated policy and storing the updated policy.
  • the method may further comprise the steps of identifying an address of one or more caches based on one or more second policies; and wherein the instruction message is transmitted to the identified one or more caches. If it is determined to proactively cache the content then the method may identify the address of one or more caches to instruct to proactively cache the content.
  • the identification of the one or more caches may be based on one or more second policies.
  • the second policies may define when to instruct several caches to cache the content.
  • the second policies may relate to a correlation between the number of requests for the same content and the number of caches instructed to cache the content.
  • Policies may define actions based on the content, content provider, user/subscriber, and so on. As will be appreciated, there may be any number of second policies defining any number of actions, rules, filters and so on in order to be able to identify the caches to instruct to cache the content.
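One possible form of such a second policy is sketched below: the number of caches instructed grows with the number of requests observed for the content. The thresholds and the ordering of the candidate list are illustrative assumptions.

```python
# One illustrative "second policy": the more requests seen for the content,
# the more caches are instructed to hold it. Thresholds and the ordering of
# the candidate list are assumptions.
def select_caches(candidate_caches, request_count):
    """Return the subset of caches to which the instruction message is sent."""
    if request_count >= 100:
        n = len(candidate_caches)               # very popular: every known cache
    elif request_count >= 10:
        n = max(1, len(candidate_caches) // 2)  # moderately popular: half of them
    else:
        n = 1                                   # otherwise a single cache suffices
    # candidate_caches could already be ordered by locality to the requests.
    return candidate_caches[:n]
```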
  • the method may further comprise the steps of receiving the one or more second policies; and storing the one or more second policies.
  • the first policies and second policies may be the same policies, different policies or any combination thereof.
  • the method may further comprise the steps of determining whether the content has been proactively cached in a cache; and redirecting the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
  • Based on the request for content, the determination as to whether to proactively cache the content may be made. Also, the method may determine whether the content has been proactively cached and therefore redirect the request for the content to a cache that has proactively cached the content.
  • the method may further comprise the steps of receiving a register message from the one or more caches that have cached the content; and updating a register based on the register message.
  • a register may be maintained to indicate the content that the one or more cache has cached.
  • Each time a cache caches content the cache may register the content via a register message so that it may be determined if the content in a request for content has been proactively cached beforehand.
  • the method may further comprise storing a list of caches.
  • the list of caches may include the address of the caches.
  • the list of caches may comprise the location in the network and/or the real world.
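A minimal sketch of the register and cache list described above is given below: it records which caches have registered which content, handles register and unregister messages, and answers lookups for later requests. The interface and data layout are assumptions for illustration.

```python
# A minimal sketch of the register: it records which caches hold which
# content, handles register/unregister messages and answers lookups for
# later requests. The interface and data layout are assumptions.
class Register:
    def __init__(self):
        self.entries = {}       # content id -> set of cache addresses holding it
        self.known_caches = {}  # cache id -> address and/or location information

    def on_register_message(self, cache_address, content_id):
        self.entries.setdefault(content_id, set()).add(cache_address)

    def on_unregister_message(self, cache_address, content_id):
        holders = self.entries.get(content_id, set())
        holders.discard(cache_address)
        if not holders:
            self.entries.pop(content_id, None)

    def lookup(self, content_id):
        """Return the caches that have registered the content (may be empty)."""
        return self.entries.get(content_id, set())
```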
  • an apparatus comprising: a first processor adapted to monitor one or more requests for content; a second processor adapted to determine whether to proactively cache the content based on one or more first policies; and a first output adapted to transmit an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive.
  • an apparatus adapted to monitor one or more requests for content; determine whether to proactively cache the content based on one or more first policies; and transmit an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive.
  • the apparatus may be a computing device, a server, a module, or unit that is separate to another entity (e.g. network element) or is operatively connected to or integrated with another entity (e.g. network management system).
  • the first processor is further adapted to analyse each request for content to identify content being requested.
  • the apparatus may further comprise a third processor adapted to identify an address of one or more caches based on one or more second policies; and the output is adapted to transmit the instruction message to the identified one or more caches.
  • the apparatus may further comprise a fourth processor adapted to determine whether the content has been proactively cached in a cache; and a second output adapted to redirect the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
  • the apparatus may further comprise an input adapted to receive a register message from the one or more caches that have cached the content; and a fifth processor adapted to update a register based on the register message.
  • the apparatus may be adapted by hardware, software or any combination thereof.
  • the apparatus may comprise any means, inputs, outputs, memory, processors, etc to implement any or all of the functions in accordance with the aspects of the invention.
  • the first processor through fifth processor may be the same processor, different processors, or any combination thereof.
  • the first output through third output may be the same output, different outputs or any combination thereof.
  • a computer program product comprising computer readable executable code for: monitoring one or more requests for content; determining whether to proactively cache the content based on one or more first policies; and transmitting an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive.
  • the computer program product may further comprise computer readable executable code for performing any or all of the functions in accordance with the aspects of the invention.
  • According to a fifth aspect of the present invention there is provided a method comprising the steps of: receiving a request for content; determining whether the content has been proactively cached in a cache; and redirecting the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
  • the method may be implemented by a computing device, a server that may be a separate entity.
  • the method may be implemented as a function or module either separate to or integrated with another system.
  • the requests for content may be for any content (e.g. web page, information, applications, resources, data, subscriptions, and so on) in a network, such as the Internet.
  • the requests for content may be from user equipment, for example, mobile devices (e.g. mobile phones, tablets, laptops, etc), from network operator servers, other party servers or equipment and so on.
  • the step of determining whether the content has been proactively cached in a cache further comprises the step of checking a register, wherein the register maintains an indication of content that has been cached.
  • the method may further comprise the steps of receiving a register message from the cache that has cached the content, wherein the register message includes an identification of the cache and the content; and updating the register based on the register message.
  • the method may further comprise the steps of receiving an unregister message from the cache that has cached the content, wherein the unregister message includes an identification of the cache and of content that the cache has removed from the cache; and updating the register based on the unregister message.
  • the step of redirecting the request for content to the cache may further comprise identifying an address of the cache that has cached the content and redirecting the request for content to the identified address.
  • There may be more than one cache that has proactively cached the content. One or more policies may be applied to determine or identify one of the caches that have proactively cached the content.
  • the policy may be a simple round robin, may be to determine the closest or most local cache to the requestor, may be to determine the least loaded cache, may be to determine the least loaded route to a cache, and so on.
  • the policies may define any actions, rules, filters or conditions to enable an efficient and effective selection of a cache and to identify the address of the cache to which the request for content is to be redirected.
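The round robin policy named above could, for instance, look like the following sketch, which cycles through the caches that have registered the requested content; the other policies (closest cache, least loaded cache or route) would simply replace the selection step. All names are illustrative.

```python
# An illustrative round-robin redirect policy: requests for a content item are
# spread over the caches that have registered it. The "closest cache" or
# "least loaded cache" policies would replace the selection step.
import itertools

class RoundRobinSelector:
    def __init__(self):
        self._cursors = {}  # (content id, cache set) -> cycling iterator

    def pick(self, content_id, cache_addresses):
        key = (content_id, tuple(sorted(cache_addresses)))
        if key not in self._cursors:
            self._cursors[key] = itertools.cycle(sorted(cache_addresses))
        return next(self._cursors[key])
```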
  • the method may further comprise the steps of monitoring the request for content; determining whether to proactively cache the content based on one or more policies; and transmitting an instruction message to one or more of the caches to cache the content if the determination to proactively cache the content is positive.
  • an apparatus comprising: a first input adapted to receive a request for content; a first processor adapted to determine whether the content has been proactively cached in a cache; and a first output adapted to redirect the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
  • an apparatus adapted to: receive a request for content; determine whether the content has been proactively cached in a cache; and redirect the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
  • the apparatus may be a computing device, a server, a module, or unit that is separate to another entity (e.g. network element) or is operatively connected to or integrated with another entity (e.g. network management system).
  • the first processor may be further adapted to check a register, wherein the register maintains an indication of content that has been cached.
  • the apparatus may further comprise a second input adapted to receive a register message from the cache that has cached the content, wherein the register message includes an identification of the cache and the content; and a second processor adapted to update the register based on the register message.
  • the apparatus may further comprise a third input adapted to receive an unregister message from the cache that has cached the content, wherein the unregister message includes an identification of the cache and of content that the cache has removed from the cache; and a third processor adapted to update the register based on the unregister message.
  • the apparatus may further comprise a fourth processor adapted to identify an address of the cache that has cached the content and the first output is further adapted to redirect the request for content to the identified address.
  • the apparatus may further comprise a fifth processor adapted to monitor the request for content; a sixth processor adapted to determine whether to proactively cache the content based on one or more policies; and a second output adapted to transmit an instruction message to one or more of the caches to cache the content if the determination to proactively cache the content is positive.
  • the apparatus may be adapted by hardware, software or any combination thereof.
  • the apparatus may comprise any means, inputs, outputs, memory, processors, etc to implement any or all of the functions in accordance with the aspects of the invention.
  • the first processor through sixth processor may be the same processor, different processors, or any combination thereof.
  • the first input through third input may be the same input, different inputs or any combination thereof.
  • a computer program product comprising computer readable executable code for: receiving a request for content; determining whether the content has been proactively cached in a cache; and redirecting the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
  • the computer program product may further comprise computer readable executable code for performing any or all of the functions in accordance with the aspects of the invention.
  • According to a ninth aspect of the present invention there is provided a method comprising the steps of: receiving a first instruction message from a monitoring function wherein the first instruction message includes an instruction to proactively cache content; transmitting a request for the content based on the first instruction message; receiving the requested content; and transmitting a register message to a register function so as to register the proactively cached content with the register.
  • the method may be implemented by a computing device, a server that may be a separate entity.
  • the method may be implemented as a function or module either separate to or integrated with another system.
  • the method may be implemented in a cache as a cache client or may be implemented as a controller of one or more caches.
  • the request for the content is transmitted to a content provider or a cache that has cached the content.
  • the method may further comprise the step of transmitting a second instruction message to one or more cache to instruct the one or more cache to proactively cache the content.
  • the second instruction message may instruct one or more other caches to cache the identified content so that popular content may be cached by many caches thereby enhancing the delivery of content to the requestor of the content.
  • the second message may be sent to all other known caches, may be sent to all local caches (either or both in the real world and in the network), and so on.
  • the decision to instruct other caches to cache the content may be dependent on the number of requests for the content, the area in which the requests originate, and so on. Any number of policies may be defined to enable the selection or identification of the other caches to transmit the second instruction message to. A sketch of this cache-client behaviour is given below.
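The sketch below illustrates the cache client of this aspect: on receiving the first instruction message it fetches and stores the content, registers it, and may forward a second instruction message to peer caches. The fetch callable, the message format and the Register interface reuse the assumptions of the earlier sketches.

```python
# A sketch of the cache client of the ninth aspect: it fetches and stores the
# content named in the first instruction message, registers it, and may send a
# second instruction message to peer caches. The fetch callable, the message
# format and the Register interface reuse the assumptions of earlier sketches.
class CacheClient:
    def __init__(self, address, register, fetch):
        self.address = address
        self.register = register  # register function/entity (see Register sketch)
        self.fetch = fetch        # callable: (source, content id) -> content bytes
        self.store = {}

    def on_instruction(self, instruction, peer_caches=()):
        content_id = instruction["content"]
        source = instruction["source"]   # content provider or another cache
        self.store[content_id] = self.fetch(source, content_id)
        # Register the proactively cached content so later requests can be
        # redirected to this cache.
        self.register.on_register_message(self.address, content_id)
        # Optional second instruction message: ask peer caches to replicate
        # the content from this cache.
        for cache in peer_caches:
            cache.on_instruction({"content": content_id, "source": self.address})
```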
  • an apparatus comprising: a first input adapted to receive a first instruction message from a monitoring function wherein the first instruction message includes an instruction to proactively cache content; a first output adapted to transmit a request for the content based on the first instruction message; a second input adapted to receive the requested content; and a second output adapted to transmit a register message to a register function so as to register the proactively cached content with the register.
  • an apparatus adapted to receive a first instruction message from a monitoring function wherein the first instruction message includes an instruction to proactively cache content; transmit a request for the content based on the first instruction message; receive the requested content; and transmit a register message to a register function so as to register the proactively cached content with the register.
  • the apparatus may be a computing device, a server, a module, or unit that is separate to another entity (e.g. network element) or is operatively connected to or integrated with another entity (e.g. network management system).
  • the apparatus may be a cache or an entity operatively connected to one or more cache.
  • the first output may be further adapted to transmit the request for the content to a content provider or a cache that has cached the content.
  • the apparatus may further comprise a third output adapted to transmit a second instruction message to one or more cache to instruct the one or more cache to proactively cache the content.
  • the apparatus may be adapted by hardware, software or any combination thereof.
  • the apparatus may comprise any means, inputs, outputs, memory, processors, etc to implement any or all of the functions in accordance with the aspects of the invention.
  • the first input and second input may be the same input or different inputs.
  • the first output through third output may be the same output, different outputs or any combination thereof.
  • a computer program product comprising computer readable executable code for: receiving a first instruction message from a monitoring function wherein the first instruction message includes an instruction to proactively cache content; transmitting a request for the content based on the first instruction message; receiving the requested content; and transmitting a register message to a register function so as to register the proactively cached content with the register.
  • the computer program product may further comprise computer readable executable code for performing any or all of the functions in accordance with the aspects of the invention.
  • According to a thirteenth aspect of the present invention there is provided a method comprising the steps of: monitoring one or more requests for content; determining whether to proactively cache the content based on one or more first policies and transmitting an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive; wherein if the determination to proactively cache the content is negative then determining whether the content has been proactively cached in a cache and redirecting the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
  • an apparatus comprising: a first processor adapted to monitor one or more requests for content; a second processor adapted to determine whether to proactively cache the content based on one or more first policies and a first output adapted to transmit an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive; wherein if the determination to proactively cache the content is negative then the apparatus further comprises: a third processor adapted to determine whether the content has been proactively cached in a cache and a second output adapted to redirect the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
  • an apparatus adapted to monitor one or more requests for content; determine whether to proactively cache the content based on one or more first policies and transmit an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive; wherein if the determination to proactively cache the content is negative then determine whether the content has been proactively cached in a cache and redirect the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
  • the apparatus may be a computing device, a server, a module, or unit that is separate to another entity (e.g. network element) or is operatively connected to or integrated with another entity (e.g. network management system).
  • the apparatus may be adapted by hardware, software or any combination thereof.
  • the apparatus may comprise any means, inputs, outputs, memory, processors, etc to implement any or all of the functions in accordance with the aspects of the invention.
  • the first output and second output may be the same output or different outputs.
  • the first processor through third processor may be the same processor, different processors or any combination thereof.
  • a computer program product comprising computer readable executable code for: monitoring one or more requests for content; determining whether to proactively cache the content based on one or more first policies and transmitting an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive; wherein if the determination to proactively cache the content is negative then determining whether the content has been proactively cached in a cache and redirecting the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
  • the computer program product may further comprise computer readable executable code for performing any or all of the functions in accordance with the aspects of the invention.
  • Figure 1 shows a simplified block diagram of a system according to many of the embodiments of the present invention.
  • Figure 2 shows a simplified message flow diagram according to many of the embodiments of the present invention.
  • The current trend of end user usage models and applications is to move towards information centric networks, where the location of the information (e.g. content) and the routing of information are less and less relevant, which is in conflict with the traditional end-to-end network architecture.
  • PCM: Proactive Caching Mechanism
  • ICN: Information Centric Network
  • PCM can be implemented in current network architectures, e.g. the Internet, to enable the benefits of ICNs on current network architectures as well as provide an architecture for future network implementations.
  • Figure 1 shows a simplified block diagram of a system 101 in accordance with many of the embodiments of the present invention.
  • UE: User Equipment
  • NO: Network Operator
  • the UE 102 may be operatively connected to the one or more cache 106 and the one or more content providers 107, directly, via the NO 103 or any combination thereof.
  • the NO 103 is also operatively connected to the one or more cache 106 and the one or more content providers 107.
  • the NO 103 may include several functions and systems to enable the NO 103 to operate their network and provide services to the users 108 of the UE 102, e.g. infrastructure, network management systems, billing systems, and so on.
  • the NO 103 may also include a Monitor 104 and a Register 105 to implement at least in part the PCM of many of the embodiments of the present invention.
  • the Monitor 104 and Register 105 may be separate entities, for example, on computing devices such as a server.
  • the Monitor 104 and the Register 105 may be modules that are part of the NO's 103 systems and may be a combined Monitor and Register.
  • the one or more cache 106 may be operated by the NO 103 as part of the NO's 103 network or may be separate to the NO's 103 network. Each cache 106 may include its own Monitor 104 and Register 105, any number of cache 106 may share a Monitor 104 and/or Register 105, or any combination thereof. The Monitor 104 and/or Register 105 may be co-located with one or more cache 106, may be separate to the cache 106 and therefore operatively connected thereto, or any combination thereof.
  • the cache 106 may also include a Cache Client (CC) 109 where the CC 109 controls the functionality of the cache 106 in the PCM.
  • the CC 109 may alternatively be shared between one or more cache 106, be operatively connected to the cache 106 and may be part of the NO's 103 network.
  • the content provider 107 provides content for users 108 of UE 102.
  • the content may be provided in response to a specific request for content from a UE 102 for the content or may be provided in response to a user 108 subscribing to the content from the content provider 107.
  • the PCM of many of the embodiments of the present invention monitors, via the Monitor 104, the requests from UE 102 for content.
  • Implementation of the monitor 104 to monitor the requests from UEs 102 may be dependent on the underlying network architecture in use. If the network architecture is based on the information centric paradigm then the monitor 104 may monitor the request from the UE 102 for content. If, however, the network architecture in use is not based on the information centric paradigm, such as the current architecture of the Internet, then the monitor 104 may exploit or utilise mechanisms such as Deep Packet Inspection (DPI) or a Uniform Resource Locator (URL) parser mechanism in order to monitor the requests from UEs 102 to identify the content. Based on the monitored requests the monitor 104 may determine whether to proactively cache content that has been requested by users 108 via UEs 102.
  • DPI: Deep Packet Inspection
  • URL: Uniform Resource Locator
  • the determination of the content to proactively cache may be based on one or more policies (which may include rules, filters, conditions, and so on) that are predefined in the monitor 104.
  • the policies may be dynamic and change over time or may be defined for particular times or for particular network conditions.
  • policies may be used to specify various conditions or rules which the content, requests for content or the determination of the content to proactively cache should comply with. For example, policies may define the number of content requests that have to be monitored by the Monitor 104 before the content is determined to be proactively cached. Policies may define actions based on, for example, on different identities of the parties such as the identity of a content provider or the identity of a subscriber (user requesting the content). Policies may be a combination of whitelists and blacklists, in other words, lists explicitly defining what to cache and/or what not to cache.
  • the policy may also define which content requests or subscriptions for content are, or are not, to be monitored by the monitor 104.
  • Content providers 107 may also wish that their content is excluded from being proactively cached and distributed in this manner and so the content providers 107 may mark their published content to be included or excluded by the Monitor 104.
  • the policies may further define actions to be taken when determining whether to proactively cache content, based on, for example, the user 108, the content provider 107, and so on.
  • the ability to, via the predefined policies, identify content requests and apply different actions to the content to be cached further enhances the PCM of many of the embodiments.
  • the proactive caching may be utilised to support different business relationships by, for instance, prioritising data belonging to a specific customer.
  • different service classes or service level agreements may be implemented such that if content is categorised as a "gold" category content then the system may guarantee that the content will be proactively cached (where space may be made in the cache by removing lower category content if space is required).
  • Policies may be predefined and stored in the monitor 104, where the policies may be predefined with any condition or rule that any entity (NO 103, UE 102, content provider 107, etc) in the system may wish to be applied.
  • the policies may be dynamically defined and modified in the Monitor 104.
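Purely as an illustration of the kinds of policy described above, a predefined policy set stored in the monitor 104 might be expressed as a small configuration structure such as the one below; the field names, values and format are assumptions, not a format defined by the embodiments.

```python
# Purely illustrative: a predefined policy set of the kinds described above,
# as it might be stored in the monitor 104. Field names, values and the
# structure itself are assumptions, not a format defined by the embodiments.
CACHING_POLICIES = {
    "request_threshold": 5,                          # cache after the 5th request
    "whitelist_providers": ["provider-a.example"],   # always eligible for caching
    "blacklist_providers": ["provider-b.example"],   # provider opted out of caching
    "service_classes": {
        "gold": {"always_cache": True, "may_evict_lower_classes": True},
        "bronze": {"always_cache": False},
    },
}
```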
  • the determination to proactively cache content is made based on the content requests being monitored. If the Monitor 104 determines that the content should be proactively cached then the Monitor 104 may transmit an instruction to one or more caches 106 to proactively cache the content determined to be cached.
  • the monitor 104 may instruct a single cache 106 or multiple caches 106 based on one or more factors or policies. For example, if the NO's 103 network is substantial in size of infrastructure then a substantial number of caches 106 may be implemented.
  • the one or more factors may further include the geographical concentration of requests so that caches 106 in that geographical area may be instructed to proactively cache the content that the monitor 104 determined should be proactively cached.
  • any number of factors may be taken into account in order to determine which of the caches 106 in the network (either operated by the NO 103 or operated by a third party) are to be instructed to cache the content determined to be proactively cached in many of the embodiments.
  • the content to be proactively cached may be defined in the instruction to the caches 106, for example, as a parameter in the instruction.
  • the instruction transmitted to the one or more caches 106 may include further parameters, for example, an address or addresses from which the cache may obtain the content (e.g. the address of the content provider, the address of other caches 106 that may already have cached the content, and so on).
  • the instruction transmitted to the cache 106 may include any number of parameters necessary to instruct the cache to be able to obtain and proactively cache the required content.
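For illustration, the instruction message and its parameters described above could be represented along the lines of the following sketch; the field names and the use of a dataclass are assumptions.

```python
# An illustrative representation of the instruction message and its
# parameters; the field names and the use of a dataclass are assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CacheInstruction:
    content_id: str                                    # the content to cache
    sources: List[str] = field(default_factory=list)   # provider and/or caches
                                                        # already holding the content

# Example:
# CacheInstruction("example.com/videos/clip.mp4",
#                  sources=["provider.example", "cache-3.operator.example"])
```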
  • the cache 106 receives the instruction from the Monitor 104 to proactively cache the content.
  • the cache 106 then obtains, e.g. requests and receives, the defined content.
  • the cache 106 may obtain the required content from the content provider 107 and/or from another cache 106 in the network.
  • the cache 106 on obtaining the required content may then be considered a provider or publisher of the content.
  • the cache 106 registers the proactively cached content with the Register 105 at the NO 103.
  • any further requests for the content may be redirected to a cache 106 that is local to the UE 102 making the request for the content rather than to the original content provider 107. Therefore, the embodiments substantially improve the scalability, resource usage and performance of the network within which the PCM is implemented, since bottlenecks and heavy traffic are reduced because the popular content can be obtained from one or more caches that proactively cached the content based on the instructions received from the monitor 104.
  • a user via User Equipment (UE) 202 connects to Network Operator (NO) 203 to which the UE 202 is subscribed.
  • a request for content 208 is transmitted from the UE 202 to the NO 203.
  • the NO 203 includes a Monitor 204, a Register 205 and a cache 206.
  • the request for content is monitored by the monitor 204 and is received by the register 205.
  • the register determines whether or not the requested content has been proactively cached in the cache 206.
  • the register 205 is informed by each cache 206 each time the cache 206 proactively caches some content so that the register 205 maintains an up-to-date record or directory of the content that is proactively cached in the caches 206.
  • the register 205 may direct the request for content to the cache 206.
  • the requested content has not previously been proactively cached in the cache 206.
  • the register 205 forwards the request 209 for content to the content provider 207.
  • the content provider 207 responds with the requested content 210.
  • the monitor 204, which has been monitoring the requests from all UEs 202 associated with the NO 203, determines 211 that the content being requested by UE 202 should be proactively cached in cache 206. In this example, monitor 204 determines 211 that this is the nth request for the content, where n defines the number of requests that have to be made before the content is proactively cached and n is given in a predefined policy. As described hereinabove the monitor 204 may apply any number of predefined policies and/or filters in order to determine the content to proactively cache.
  • monitor 204 determines 211 that the requested content is to be proactively cached in cache 206.
  • Monitor 204 transmits an instruction 212 to cache 206 to proactively cache the defined content.
  • cache 206 transmits a request 213 to the content provider 207 and, in response, the cache 206 receives the requested content 214 which cache 206 caches.
  • Cache 206 may then inform the register 205 of the new content that cache 206 has proactively cached. In this regard, cache 206 transmits a register message 215 to the register 205 to register the proactively cached content.
  • a request is received by the NO 203 for the content.
  • the register 205 checks its maintained register and determines or identifies that the content has been cached in cache 206. Thus, the register 205 determines 217 that the request for content can be transmitted to the cache 206. Accordingly, the request for content is transmitted or forwarded 218 onto the cache 206 which in response to receiving the request for content responds 219 to the UE 202 with the requested content.
  • the PCM is generalised in order to show, explain and reference the invention embodied by the embodiments.
  • the monitor, register, and/or cache may be referred to, or implemented, by different entities.
  • one information centric approach is known as a rendezvous architecture approach that implements a Publish/Subscribe Internet Routing Paradigm (PSIRP).
  • PSIRP: Publish/Subscribe Internet Routing Paradigm
  • the monitor functionality and the register functionality may be implemented by, or integrated in, a rendezvous node.
  • the PSIRP rendezvous architecture may be used as an overlay on top of various routing and forwarding architectures.
  • Another example of information centric architecture, where proactive caching may be implemented is the Content Centric Networking (CCN) architecture.
  • CCN: Content Centric Networking
  • the monitoring function would monitor content interest packets and, in the case of a caching decision, it would command the cache to send its own interest packet(s) for the same content. After receiving the content, the cache would advertise the content to the CCN node(s), e.g. the register in the above embodiments, which thereafter could redirect cached content interest packets to the cache.
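The following is a conceptual sketch of that CCN mapping: the monitoring function counts Interest packets per content name and, on a caching decision, commands the cache to issue its own Interest for the same content. No real CCN/NDN library API is used here; the cache interface and the simple count-based decision are illustrative assumptions.

```python
# A conceptual sketch of that CCN mapping: the monitoring function counts
# Interest packets per content name and, on a caching decision, commands the
# cache to issue its own Interest for the same content. No real CCN/NDN
# library API is used; the cache interface and count-based policy are
# illustrative assumptions.
from collections import defaultdict

class CcnMonitor:
    def __init__(self, cache, threshold=3):
        self.cache = cache
        self.threshold = threshold
        self.interest_counts = defaultdict(int)

    def on_interest(self, content_name):
        self.interest_counts[content_name] += 1
        if self.interest_counts[content_name] == self.threshold:
            # The cache fetches the content with its own Interest and then
            # advertises it so later Interests can be redirected to the cache.
            self.cache.express_interest(content_name)  # hypothetical cache API
```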
  • the proactive caching mechanism is in contrast to traditional caching that is carried out presently in electronic devices and on networks.
  • the proactive caching mechanism does not require the caches to be on the content delivery path, as required by traditional caching techniques.
  • the monitoring function, e.g. the monitor, monitors all the content requests, but the returning content from the content provider to the user equipment requesting the content does not need to go through the monitoring function or the cache.
  • each of the caches may respond to instructions from the monitoring function(s) as needed.
  • the monitoring function could, for example, instruct caches that do not yet have the particular content to fetch the content from those caches that already have it. This lowers the transit costs and enables load balancing of the traffic patterns within a network (e.g. avoiding congested links when downloading the content).
  • Another benefit is that once the content is proactively cached, and therefore stored in a cache, the content can be replicated to a number of other caches by registering each new instance of the content with the register. Thus, the cached content may efficiently and effectively be stored closer to the demand.
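A minimal sketch of that replication step, reusing the illustrative Register and cache-client interfaces from the earlier sketches: caches that do not yet hold the content are instructed to fetch it from a cache that has already registered it rather than from the original content provider.

```python
# A sketch of that replication step, reusing the illustrative Register and
# cache-client interfaces from the earlier sketches: caches that do not yet
# hold the content are told to fetch it from a cache that already registered
# it, rather than from the original content provider.
def replicate(content_id, register, all_caches):
    holders = register.lookup(content_id)
    if not holders:
        return
    source = sorted(holders)[0]  # e.g. pick the nearest or least loaded holder
    for cache in all_caches:
        if cache.address not in holders:
            cache.on_instruction({"content": content_id, "source": source})
```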
  • a network operator may save in communication costs, because the content may be provided by a cache that is inside the network operator's network. In other words, by effectively and proactively bringing a content source closer to the demand for the content the network operator may receive efficiencies in cost, resource usage, latency and performance.
  • the proactive caching mechanism may be used as additional functionality for the existing networks, such as the Internet, and in parallel with the existing caching mechanisms.

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The present invention relates to methods and apparatus to cache content in a network. A monitor 104 monitors one or more requests for content, where the requests for content may be from user equipment 102. The monitor 104 determines whether to proactively cache said content based on one or more first policies and transmits an instruction message to one or more caches 106 to cache said content if said determination to proactively cache said content is positive.

Description

Title Caching Content
The present invention relates to caching content and, in particular, to proactive caching of content in network architectures. Content distribution and delivery is one of the main uses and purposes of many network architectures, in particular, the Internet. Presently, most network architectures, including the Internet, are based on an "End-to-End" communication paradigm where communication is between two fixed, or near fixed, points in the network. For example, a first fixed point of a user device (e.g. personal computer) requests content from a second fixed point of a server and receives the requested content from the server by a particular route or path through the network.
However, in more modern scenarios the location of the content is not as important, especially in relation to the so-called cloud computing paradigm, where the routing of requests for content is based on the content rather than on the fixed end point where the content is located. Therefore, presently the end user usage model and applications are moving towards information centric and the so-called publish/subscribe paradigms.
As such, there is developing a conflict between the application developers and/or usage models for networking and the current network architectures.
The present invention seeks to address, at least in part, some or all of the drawbacks and disadvantages described above. Furthermore, the present invention seeks to address, at least in part, the need to enable the implementation of information centric and/or publish/subscribe paradigms on existing network architectures that are based on end-to-end communication mechanisms.
According to a first aspect of the present invention there is provided a method comprising the steps of: monitoring one or more requests for content; determining whether to proactively cache the content based on one or more first policies; and transmitting an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive.
The method may be implemented by a computing device, a server that may be a separate entity. The method may be implemented as a function or module either separate to or integrated with another system.
The requests for content, where the content may be any content (e.g. web page, information, applications, resources, data, subscriptions, and so on) in a network, such as the Internet, are monitored.
The requests for content may be from user equipment, for example, mobile devices (e.g. mobile phones, tablets, laptops, etc), from network operator servers, other party servers or equipment and so on. Based on one or more policies it is determined whether to proactively cache the content in the request that is monitored. Proactively caching relates to making a proactive decision to cache content. Thus, instead of a cache having to be located on the delivery path and being reactive by taking content that is being delivered to an end user, the method decides whether to cache content based on the request for content and then instructs a cache to proactively seek and obtain the content.
One or more policies, where policies may include filters, rules, conditions, and so on, may be applied in order to determine whether to proactively cache the content. For example, the policy may be to cache everything. Policies may be to cache everything from one content provider but not from another content provider. Policies may be to cache content once there have been a predefined number of requests for the content. Policies may relate to content providers (e.g. stating which content providers wish their content to be cached, where or by whom the content may be cached etc), relate to network operators or relate to users/subscribers (e.g. the identity of the user). Policies may include whitelists and/or blacklists in relation to content, content providers, users/subscribers and so on. The policies may also define actions to be taken if certain content is to be cached, the request comes from a certain user/subscriber, and so on.
As will be appreciated, any number of policies may be defined and applied to determine whether or not to proactively cache the content. The policies may also define any description of when, how, why or what to cache. If it is determined to proactively cache the content then an instruction message may be transmitted to one or more caches. The instruction message may indicate the content to cache. The instruction message may indicate the location (e.g. address) of the content, for example, the location may be that of the content provider or of another cache or a combination thereof. The instruction message may be sent to more than one cache to ensure that several caches cache the content and thereby enable efficient and local delivery of the content to users.
The step of monitoring one or more requests for content may further comprise the step of analysing each request for content to identify content being requested. In information centric networks the method may monitor the requests by, for example, interpreting the request protocol to detect identifiers of the content. In more traditional networks the monitoring may be implemented using Deep Packet Inspection or via a Uniform Resource Locator (URL) parser mechanism to identify the content. The method may further comprise the steps of receiving the one or more first policies; and storing the one or more first policies. Any party may be able to define and store a policy to be applied when determining whether to proactively cache the content. The policies that are defined may be stored and maintained.
The policies may be updated or changed by any party (e.g. content provider, network operator, user/subscriber, etc). Therefore, the policies may be adapted, changed or altered depending on the needs or wishes of a party. The policies may be updated or changed depending on the network conditions, e.g. if the network is heavily loaded then more content may be cached to distribute the load more efficiently and effectively.
Accordingly, the method may further comprise receiving an updated policy and storing the updated policy.
The method may further comprise the steps of identifying an address of one or more caches based on one or more second policies; and wherein the instruction message is transmitted to the identified one or more caches. If it is determined to proactively cache the content then the method may identify the address of one or more caches to instruct to proactively cache the content. The identification of the one or more caches may be based on one or more second policies. For example, the second policies may define when to instruct several caches to cache the content. The second policies may relate to a correlation between the number of requests for the same content and the number of caches instructed to cache the content. Policies may define actions based on the content, content provider, user/subscriber, and so on. As will be appreciated, there may be any number of second policies defining any number of actions, rules, filters and so on in order to be able to identify the caches to instruct to cache the content.
The method may further comprise the steps of receiving the one or more second policies; and storing the one or more second policies.
The first policies and second policies may be the same policies, different policies or any combination thereof.
The method may further comprise the steps of determining whether the content has been proactively cached in a cache; and redirecting the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
Based on the request for content the determination as to whether to proactively cache the content in a request may be made. Also, the method may determine whether the content has been proactively cached and therefore redirect the request for the content to a cache that has proactively cached the content.
The method may further comprise the steps of receiving a register message from the one or more caches that have cached the content; and updating a register based on the register message. A register may be maintained to indicate the content that the one or more cache has cached. Each time a cache caches content the cache may register the content via a register message so that it may be determined if the content in a request for content has been proactively cached beforehand. The method may further comprise storing a list of caches. The list of caches may include the address of the caches. The list of caches may comprise the location in the network and/or the real world. According to a second aspect of the present invention there is provided an apparatus comprising: a first processor adapted to monitor one or more requests for content; a second processor adapted to determine whether to proactively cache the content based on one or more first policies; and a first output adapted to transmit an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive.
According to a third aspect of the present invention there is provided an apparatus adapted to monitor one or more requests for content; determine whether to proactively cache the content based on one or more first policies; and transmit an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive. The apparatus may be a computing device, a server, a module, or unit that is separate to another entity (e.g. network element) or is operatively connected to or integrated with another entity (e.g. network management system).
The first processor is further adapted to analyse each request for content to identify content being requested.
The apparatus may further comprise a third processor adapted to identify an address of one or more caches based on one or more second policies; and the output is adapted to transmit the instruction message to the identified one or more caches.
The apparatus may further comprise a fourth processor adapted to determine whether the content has been proactively cached in a cache; and a second output adapted to redirect the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
The apparatus may further comprise an input adapted to receive a register message from the one or more caches that have cached the content; and a fifth processor adapted to update a register based on the register message. The apparatus may be adapted by hardware, software or any combination thereof. The apparatus may comprise any means, inputs, outputs, memory, processors, etc to implement any or all of the functions in accordance with the aspects of the invention.
The first processor through fifth processor may be the same processor, different processors, or any combination thereof. The first output through third output may be the same output, different outputs or any combination thereof.
According to a fourth aspect of the present invention there is provided a computer program product comprising computer readable executable code for: monitoring one or more requests for content; determining whether to proactively cache the content based on one or more first policies; and transmitting an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive. The computer program product may further comprise computer readable executable code for performing any or all of the functions in accordance with the aspects of the invention. According to a fifth aspect of the present invention there is provided a method comprising the steps of: receiving a request for content; determining whether the content has been proactively cached in a cache; and redirecting the request for content to the cache if the determination that the content has been proactively cached in a cache is positive. The method may be implemented by a computing device, a server that may be a separate entity. The method may be implemented as a function or module either separate to or integrated with another system.
The requests for content may be for any content (e.g. web page, information, applications, resources, data, subscriptions, and so on) in a network, such as the Internet.
The requests for content may be from user equipment, for example, mobile devices (e.g. mobile phones, tablets, laptops, etc), from network operator servers, other party servers or equipment and so on.
A determination is made as to whether the content in the request has been proactively cached and the request for content may be redirected to a cache if it is determined that the content has been proactively cached in the cache. The step of determining whether the content has been proactively cached in a cache further comprises the step of checking a register, wherein the register maintains an indication of content that has been cached.
The method may further comprise the steps of receiving a register message from the cache that has cached the content, wherein the register message includes an identification of the cache and the content; and updating the register based on the register message.
The method may further comprise the steps of receiving an unregister message from the cache that has cached the content, wherein the unregister message includes an identification of the cache and of content that the cache has removed from the cache; and updating the register based on the unregister message.
The step of redirecting the request for content to the cache may further comprise identifying an address of the cache that has cached the content and redirecting the request for content to the identified address.
There may be more than one cache that has proactively cached the content. One or more policies may be applied to determine or identify one of the caches that have proactively cached the content. For example, the policy may be a simple round robin, may be to determine the closest or most local cache to the requestor, may be to determine the least loaded cache, may be to determine the least loaded route to a cache, and so on. As will be appreciated any number of policies may be defined where the policies may define any actions, rules, filters or conditions to enable an efficient and effective selection of a cache and to identify the address of the cache to which the request for content is to be redirected.
The method may further comprise the steps of monitoring the request for content;
determining whether to proactively cache the content based on one or more policies; and transmitting an instruction message to one or more of the caches to cache the content if the determination to proactively cache the content is positive.
According to a sixth aspect of the present invention there is provided an apparatus comprising: a first input adapted to receive a request for content; a first processor adapted to determine whether the content has been proactively cached in a cache; and a first output adapted to redirect the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
According to a seventh aspect of the present invention there is provided an apparatus adapted to: receive a request for content; determine whether the content has been proactively cached in a cache; and redirect the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
The apparatus may be a computing device, a server, a module, or unit that is separate to another entity (e.g. network element) or is operatively connected to or integrated with another entity (e.g. network management system).
The first processor may be further adapted to check a register, wherein the register maintains an indication of content that has been cached.
The apparatus may further comprise a second input adapted to receive a register message from the cache that has cached the content, wherein the register message includes an identification of the cache and the content; and a second processor adapted to update the register based on the register message.
The apparatus may further comprise a third input adapted to receive an unregister message from the cache that has cached the content, wherein the unregister message includes an identification of the cache and of the content that the cache has removed; and a third processor adapted to update the register based on the unregister message. The apparatus may further comprise a fourth processor adapted to identify an address of the cache that has cached the content, and the first output is further adapted to redirect the request for content to the identified address.
The apparatus may further comprise a fifth processor adapted to monitor the request for content; a sixth processor adapted to determine whether to proactively cache the content based on one or more policies; and a second output adapted to transmit an instruction message to one or more of the caches to cache the content if the determination to proactively cache the content is positive.
The apparatus may be adapted by hardware, software or any combination thereof. The apparatus may comprise any means, inputs, outputs, memory, processors, etc to implement any or all of the functions in accordance with the aspects of the invention.
The first processor through sixth processor may be the same processor, different processors, or any combination thereof. The first input through third input may be the same input, different inputs or any combination thereof.
According to an eighth aspect of the present invention there is provided a computer program product comprising computer readable executable code for: receiving a request for content; determining whether the content has been proactively cached in a cache; and redirecting the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
The computer program product may further comprise computer readable executable code for performing any or all of the functions in accordance with the aspects of the invention.
According to a ninth aspect of the present invention there is provided a method comprising the steps of: receiving a first instruction message from a monitoring function wherein the first instruction message includes an instruction to proactively cache content; transmitting a request for the content based on the first instruction message; receiving the requested content; and transmitting a register message to a register function so as to register the proactively cached content with the register.
The method may be implemented by a computing device or a server that may be a separate entity. The method may be implemented as a function or module either separate to or integrated with another system. The method may be implemented in a cache as a cache client or may be implemented as a controller of one or more caches.
The request for the content is transmitted to a content provider or a cache that has cached the content.
The method may further comprise the step of transmitting a second instruction message to one or more cache to instruct the one or more cache to proactively cache the content.
The second instruction message may instruct one or more other caches to cache the identified content so that popular content may be cached by many caches, thereby enhancing the delivery of content to the requestor of the content. The second instruction message may be sent to all other known caches, may be sent to all local caches (either or both in the real world and in the network), and so on. The decision to instruct other caches to cache the content may be dependent on the number of requests for the content, the area in which the requests originate, and so on. Any number of policies may be defined to enable the selection or identification of the other caches to which the second instruction message is transmitted.

According to a tenth aspect of the present invention there is provided an apparatus comprising: a first input adapted to receive a first instruction message from a monitoring function wherein the first instruction message includes an instruction to proactively cache content; a first output adapted to transmit a request for the content based on the first instruction message; a second input adapted to receive the requested content; and a second output adapted to transmit a register message to a register function so as to register the proactively cached content with the register.

According to an eleventh aspect of the present invention there is provided an apparatus adapted to: receive a first instruction message from a monitoring function wherein the first instruction message includes an instruction to proactively cache content; transmit a request for the content based on the first instruction message; receive the requested content; and transmit a register message to a register function so as to register the proactively cached content with the register.
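By way of a non-authoritative sketch, a cache client acting on the first and second instruction messages might be structured as below; the transport helpers (fetch, send_register_message, send_instruction) are placeholders for whatever signalling a deployment actually uses.

```python
# Sketch of a cache client: act on a first instruction message, obtain the
# content, register it, and optionally instruct peer caches to cache it too.

class CacheClient:
    def __init__(self, cache_id, fetch, send_register_message, send_instruction):
        self.cache_id = cache_id
        self.store = {}
        self.fetch = fetch                                # obtain content from an address
        self.send_register_message = send_register_message  # notify the register function
        self.send_instruction = send_instruction          # send a second instruction message

    def on_instruction(self, content_id, source_address, peer_caches=()):
        # Obtain the content from the content provider or another cache.
        self.store[content_id] = self.fetch(source_address, content_id)
        # Register the proactively cached content with the register function.
        self.send_register_message(self.cache_id, content_id)
        # Optionally propagate a second instruction to other caches,
        # pointing them at this cache as the source of the content.
        for peer in peer_caches:
            self.send_instruction(peer, content_id, source_address=self.cache_id)


# Example usage with stub transports.
client = CacheClient(
    "cache-1",
    fetch=lambda src, cid: f"<{cid} fetched from {src}>",
    send_register_message=lambda cache, cid: print("register", cache, cid),
    send_instruction=lambda peer, cid, source_address: print("instruct", peer, cid, source_address))
client.on_instruction("video/123", "origin.example", peer_caches=["cache-2"])
```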
The apparatus may be a computing device, a server, a module, or unit that is separate to another entity (e.g. network element) or is operatively connected to or integrated with another entity (e.g. network management system). The apparatus may be a cache or an entity operatively connected to one or more cache.
The first output may be further adapted to transmit the request for the content to a content provider or a cache that has cached the content. The apparatus may further comprise a third output adapted to transmit a second instruction message to one or more cache to instruct the one or more cache to proactively cache the content.
The apparatus may be adapted by hardware, software or any combination thereof. The apparatus may comprise any means, inputs, outputs, memory, processors, etc to implement any or all of the functions in accordance with the aspects of the invention.
The first input and second input may be the same input or different inputs. The first output through third output may be the same output, different outputs or any combination thereof.
According to a twelfth aspect of the present invention there is provided a computer program product comprising computer readable executable code for: receiving a first instruction message from a monitoring function wherein the first instruction message includes an instruction to proactively cache content; transmitting a request for the content based on the first instruction message; receiving the requested content; and transmitting a register message to a register function so as to register the proactively cached content with the register.
The computer program product may further comprise computer readable executable code for performing any or all of the functions in accordance with the aspects of the invention.

According to a thirteenth aspect of the present invention there is provided a method comprising the steps of: monitoring one or more requests for content; determining whether to proactively cache the content based on one or more first policies and transmitting an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive; wherein if the determination to proactively cache the content is negative then determining whether the content has been proactively cached in a cache and redirecting the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.

According to a fourteenth aspect of the present invention there is provided an apparatus comprising: a first processor adapted to monitor one or more requests for content; a second processor adapted to determine whether to proactively cache the content based on one or more first policies and a first output adapted to transmit an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive; wherein if the determination to proactively cache the content is negative then the apparatus further comprises: a third processor adapted to determine whether the content has been proactively cached in a cache and a second output adapted to redirect the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.
According to a fifteenth aspect of the present invention there is provided an apparatus adapted to: monitor one or more requests for content; determine whether to proactively cache the content based on one or more first policies and transmit an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive; wherein if the determination to proactively cache the content is negative then determine whether the content has been proactively cached in a cache and redirect the request for content to the cache if the determination that the content has been proactively cached in a cache is positive.

The apparatus may be a computing device, a server, a module, or unit that is separate to another entity (e.g. network element) or is operatively connected to or integrated with another entity (e.g. network management system).
The apparatus may be adapted by hardware, software or any combination thereof. The apparatus may comprise any means, inputs, outputs, memory, processors, etc to implement any or all of the functions in accordance with the aspects of the invention.
The first output and second output may be the same output or different outputs. The first processor through third processor may be the same processor, different processors or any combination thereof.
According to a sixteenth aspect of the present invention there is provided a computer program product comprising computer readable executable code for: monitoring one or more requests for content; determining whether to proactively cache the content based on one or more first policies and transmitting an instruction message to one or more caches to cache the content if the determination to proactively cache the content is positive;
wherein if the determination to proactively cache the content is negative then determining whether the content has been proactively cached in a cache and redirecting the request for content to the cache if the determination that the content has been proactively cached in a cache is positive. The computer program product may further comprise computer readable executable code for performing any or all of the functions in accordance with the aspects of the invention.
Embodiments of the present invention will now be described, by way of example only, and with reference to the accompanying drawings in which:
Figure 1 shows a simplified block diagram of a system according to many of the embodiments of the present invention.
Figure 2 shows a simplified message flow diagram according to many of the embodiments of the present invention.
As discussed hereinabove, current networks, and in particular the Internet, are decades old and based on the original end-to-end communication paradigm and architecture. However, with the substantial increase in network traffic and content requests, the traditional network architecture will become inefficient and incur resource problems.
The current trend of end user usage models and applications is to move towards information centric networks, where the location of the information, e.g. content, and the routing of information are less and less relevant. This is in conflict with the traditional end-to-end network architecture.
Thus, many of the embodiments relate to a Proactive Caching Mechanism (PCM) for Information Centric Networks (ICNs) to enable enhanced scalability, more efficient resource usage and improved performance. Moreover, the PCM can be implemented in current network architectures, e.g. the Internet, to enable the benefits of ICNs on current network architectures as well as provide an architecture for future network implementations.
Figure 1 shows a simplified block diagram of a system 101 in accordance with many of the embodiments of the present invention.
User Equipment (UE) 102 is operatively connected to a Network Operator (NO) 103, one or more cache 106, and one or more content providers 107. The UE 102 may be operatively connected to the one or more cache 106 and the one or more content providers 107 directly, via the NO 103, or any combination thereof.
The NO 103 is also operatively connected to the one or more cache 106 and the one or more content providers 107. The NO 103 may include several functions and systems to enable the NO 103 to operate its network and provide services to the users 108 of the UE 102, e.g. infrastructure, network management systems, billing systems, and so on. In many of the embodiments the NO 103 may also include a Monitor 104 and a Register 105 to implement, at least in part, the PCM of many of the embodiments of the present invention. The Monitor 104 and Register 105 may be separate entities, for example, on computing devices such as a server. The Monitor 104 and the Register 105 may be modules that are part of the NO's 103 systems and may be a combined Monitor and Register.
The one or more cache 106 may be operated by the NO 103 as part of the NO's 103 network or may be separate to the NO's 103 network. Each cache 106 may include its own Monitor 104 and Register 105, any number of caches 106 may share a Monitor 104 and/or Register 105, or any combination thereof. The Monitor 104 and/or Register 105 may be co-located with one or more cache 106, may be separate to the cache 106 and therefore operatively connected thereto, or any combination thereof.
The cache 106 may also include a Cache Client (CC) 109 where the CC 109 controls the functionality of the cache 106 in the PCM. The CC 109 may alternatively be shared between one or more cache 106, be operatively connected to the cache 106 and may be part of the NO's 103 network.
The content provider 107 provides content for users 108 of UE 102. The content may be provided in response to a specific request from a UE 102 for the content or may be provided in response to a user 108 subscribing to the content from the content provider 107.
The PCM of many of the embodiments of the present invention monitors, via the Monitor 104, the requests from UE 102 for content.
Implementation of the Monitor 104 to monitor the requests from UEs 102 may be dependent on the underlying network architecture in use. If the network architecture is based on the information centric paradigm then the Monitor 104 may monitor the request from the UE 102 for content. If, however, the network architecture in use is not based on the information centric paradigm, such as the current architecture of the Internet, then the Monitor 104 may exploit or utilise mechanisms such as Deep Packet Inspection (DPI) or a Uniform Resource Locator (URL) parser mechanism in order to monitor the requests from UEs 102 and identify the content.

Based on the monitored requests the Monitor 104 may determine whether to proactively cache content that has been requested by users 108 via UEs 102. The determination of the content to proactively cache may be based on one or more policies (which may include rules, filters, conditions, and so on) that are predefined in the Monitor 104. The policies may be dynamic and change over time or may be defined for particular times or for particular network conditions.
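As a toy illustration of the URL-based identification mentioned above (a real deployment would rely on a DPI engine or a full URL parser), the requested content identifier could be derived from an HTTP request line roughly as follows; the function name identify_content is an assumption for this example.

```python
# Toy illustration of identifying content from a monitored HTTP request.
# It only shows where a content identifier could come from; it is not a
# substitute for DPI or a proper URL parser.
from urllib.parse import urlsplit

def identify_content(request_line, host_header):
    # e.g. request_line = "GET /videos/cat.mp4 HTTP/1.1", host_header = "cdn.example"
    method, path, _version = request_line.split()
    if method != "GET":
        return None
    return "http://" + host_header + urlsplit(path).path

print(identify_content("GET /videos/cat.mp4?track=1 HTTP/1.1", "cdn.example"))
# -> http://cdn.example/videos/cat.mp4
```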
The policies may be used to specify various conditions or rules with which the content, the requests for content or the determination of the content to proactively cache should comply. For example, policies may define the number of content requests that have to be monitored by the Monitor 104 before the content is determined to be proactively cached. Policies may define actions based, for example, on different identities of the parties such as the identity of a content provider or the identity of a subscriber (the user requesting the content). Policies may be a combination of whitelists and blacklists, in other words, lists explicitly defining what to cache and/or what not to cache.
The policies may also define which content requests or subscriptions for content are, or are not, to be monitored by the Monitor 104. Content providers 107 may also wish that their content is excluded from being proactively cached and distributed in this manner, and so the content providers 107 may mark their published content to be included or excluded by the Monitor 104. The policies may further define actions to be taken when determining whether to proactively cache content based on, for example, the user 108, the content provider 107, and so on. The ability, via the predefined policies, to identify content requests and apply different actions to the content to be cached further enhances the PCM of many of the embodiments. For example, the proactive caching may be utilised to support different business relationships by, for instance, prioritising data belonging to a specific customer. In a further example, different service classes or service level agreements may be implemented such that if content is categorised as "gold" category content then the system may guarantee that the content will be proactively cached (where space may be made in the cache by removing lower category content if space is required).
As will be appreciated, any number of policies may be predefined and stored in the Monitor 104, where the policies may be predefined with any condition or rule that any entity (NO 103, UE 102, Content Provider 107, etc.) in the system may wish to be applied. The policies may be dynamically defined and modified in the Monitor 104.
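One assumed, minimal representation of such policies, combining a blacklist, a whitelist and a request-count threshold, is sketched below; the class and field names are illustrative only.

```python
# Sketch of a simple policy check in the monitor. The fields (blacklist,
# whitelist, threshold) are examples of the rules discussed above, not a
# prescribed format.

class CachingPolicy:
    def __init__(self, threshold=3, whitelist=None, blacklist=None):
        self.threshold = threshold               # nth request triggers caching
        self.whitelist = set(whitelist or [])    # always eligible for caching
        self.blacklist = set(blacklist or [])    # never proactively cached

    def should_cache(self, content_id, request_count):
        if content_id in self.blacklist:
            return False
        if content_id in self.whitelist:
            return True
        return request_count >= self.threshold


policy = CachingPolicy(threshold=3, blacklist={"premium/999"})
print(policy.should_cache("video/123", request_count=3))     # True
print(policy.should_cache("premium/999", request_count=50))  # False
```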
Thus, the determination to proactively cache content is made based on the content requests being monitored. If the Monitor 104 determines that the content should be proactively cached then the Monitor 104 may transmit an instruction to one or more caches 106 to proactively cache the content determined to be cached. The Monitor 104 may instruct a single cache 106 or multiple caches 106 based on one or more factors or policies. For example, if the NO's 103 network is substantial in size of infrastructure then a substantial number of caches 106 may be implemented. The one or more factors may further include the geographical concentration of requests, so that caches 106 in that geographical area may be instructed to proactively cache the content that the Monitor 104 determined should be proactively cached. As will be appreciated, any number of factors may be taken into account in order to determine which of the caches 106 in the network (either operated by the NO 103 or operated by a third party) are to be instructed to cache the content determined to be proactively cached in many of the embodiments. The content to be proactively cached may be defined in the instruction to the caches 106, for example, as a parameter in the instruction. The instruction transmitted to the one or more caches 106 may include further parameters, for example, an address or addresses from which the cache may obtain the content (e.g. the address of the content provider, the address of other caches 106 that may already have cached the content, and so on). As will be appreciated, the instruction transmitted to the cache 106 may include any number of parameters necessary to instruct the cache to be able to obtain and proactively cache the required content.
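Tying the monitoring and the instruction message together, a monitor might keep a per-content request counter and, once its policy is satisfied, emit an instruction carrying the content identifier and candidate source addresses. The sketch below shows one such arrangement; the dictionary layout of the instruction and the helper callables (should_cache, select_caches, send) are assumptions for illustration.

```python
# Sketch of a monitor that counts requests and emits an instruction message.
# The instruction is shown as a plain dictionary; a real deployment would
# define its own encoding and field names.
from collections import Counter

class Monitor:
    def __init__(self, should_cache, select_caches, send):
        self.should_cache = should_cache      # policy check, e.g. nth-request rule
        self.select_caches = select_caches    # returns cache addresses per policy
        self.send = send                      # transport placeholder
        self.counts = Counter()

    def on_request(self, content_id, provider_address):
        self.counts[content_id] += 1
        if self.should_cache(content_id, self.counts[content_id]):
            instruction = {
                "type": "proactive-cache",
                "content": content_id,
                "sources": [provider_address],   # could also list peer caches
            }
            for cache_address in self.select_caches(content_id):
                self.send(cache_address, instruction)


sent = []
monitor = Monitor(should_cache=lambda cid, n: n >= 2,
                  select_caches=lambda cid: ["cache-a.operator.example"],
                  send=lambda addr, msg: sent.append((addr, msg)))
monitor.on_request("video/123", "origin.provider.example")
monitor.on_request("video/123", "origin.provider.example")
print(sent)   # one instruction message, emitted after the second request
```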
The cache 106 receives the instruction from the Monitor 104 to proactively cache the content. The cache 106 then obtains, e.g. requests and receives, the defined content. As described hereinabove, the cache 106 may obtain the required content from the content provider 107 and/or from another cache 106 in the network.
The cache 106, on obtaining the required content, may then be considered a provider or publisher of the content. The cache 106 registers the proactively cached content with the Register 105 at the NO 103. Thus, any further requests for the content may be redirected to a cache 106 that is local to the UE 102 making the request for the content rather than to the original content provider 107. Therefore, the embodiments substantially improve the scalability, resource usage and performance of the network within which the PCM is implemented, as bottlenecks and substantial traffic are reduced because the popular content can be obtained from one or more caches 106 that have proactively cached the content based on the instructions received from the Monitor 104.
An example of PCM will now be described with reference to the simplified message flow diagram 201 shown in Figure 2.
A user via User Equipment (UE) 202 connects to Network Operator (NO) 203 to which the UE 202 is subscribed. A request for content 208 is transmitted from the UE 202 to the NO 203.
The NO 203 includes a Monitor 204, a Register 205 and a cache 206. The request for content is monitored by the monitor 204 and is received by the register 205.
The register 205 determines whether or not the requested content has been proactively cached in the cache 206. The register 205 is informed by each cache 206 each time the cache 206 proactively caches some content, so that the register 205 maintains an up-to-date record or directory of the content that is proactively cached in the caches 206.
Therefore, if the content is cached in the cache then the register 205 may direct the request for content to the cache 206. However, in this example, the requested content has not previously been proactively cached in the cache 206.
Therefore, the register 205 forwards the request 209 for content to the content provider 207. In response to receiving the request for content the content provider 207 responds with the requested content 210.
The monitor 204, which has been monitoring the requests from all UEs 202 associated with the NO 203, determines 211 that the content being requested by UE 202 should be proactively cached in cache 206. In this example, monitor 204 determines 211 that this is the nth request for the content, where n defines the number of requests that have to be made before the content is proactively cached and n is given in a predefined policy. As described hereinabove, the monitor 204 may apply any number of predefined policies and/or filters in order to determine the content to proactively cache.
In this example, monitor 204 determines 211 that the requested content is to be proactively cached in cache 206. Monitor 204 transmits an instruction 212 to cache 206 to proactively cache the defined content. On receiving the instruction, cache 206 transmits a request 213 to the content provider 207 and, in response, the cache 206 receives the requested content 214, which cache 206 caches.
Cache 206 may then inform the register 205 of the new content that cache 206 has proactively cached. In this regard, cache 206 transmits a register message 215 to the register 205 to register the proactively cached content.
At a future point in time, a request is received by the NO 203 for the content. The register 205 checks its maintained register and determines or identifies that the content has been cached in cache 206. Thus, the register 205 determines 217 that the request for content can be transmitted to the cache 206. Accordingly, the request for content is transmitted or forwarded 218 to the cache 206, which, in response to receiving the request for content, responds 219 to the UE 202 with the requested content.
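To make the sequence of Figure 2 concrete, the short simulation below strings together toy versions of the register, monitor and cache behaviour; every name in it is illustrative, and the threshold of two requests stands in for the policy value n.

```python
# Toy end-to-end walk-through of the Figure 2 flow. All components are
# deliberately simplified; the threshold n=2 is an arbitrary example policy.

ORIGIN = "content-provider"
register = {}          # content_id -> address of the cache holding it
cache_store = {}       # contents held by the single cache "cache-206"
request_counts = {}

def origin_fetch(content_id):
    return f"<payload of {content_id}>"

def cache_proactively(content_id):
    # Cache 206 fetches the content itself and registers it (messages 213-215).
    cache_store[content_id] = origin_fetch(content_id)
    register[content_id] = "cache-206"

def handle_request(content_id, n=2):
    # Register check (determination 217): redirect if proactively cached.
    if content_id in register:
        return register[content_id], cache_store[content_id]
    # Not cached: forward to the content provider (messages 209-210).
    payload = origin_fetch(content_id)
    # Monitor counts the request and applies the nth-request policy (211-212).
    request_counts[content_id] = request_counts.get(content_id, 0) + 1
    if request_counts[content_id] >= n:
        cache_proactively(content_id)
    return ORIGIN, payload

print(handle_request("movie-42")[0])   # content-provider (first request)
print(handle_request("movie-42")[0])   # content-provider, then proactively cached
print(handle_request("movie-42")[0])   # cache-206 (request redirected)
```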
In the above described examples, the PCM is generalised in order to show, explain and reference the invention embodied by the embodiments. When many of the embodiments are applied to existing networks and to ICNs, the monitor, register, and/or cache may be referred to, or implemented, by different entities.
For example, one information centric approach is known as a rendezvous architecture approach that implements a Publish/Subscribe Internet Routing Paradigm (PSIRP). In such a network the monitor functionality and the register functionality may be implemented by, or integrated in, a rendezvous node. The PSIRP rendezvous architecture may be used as an overlay on top of various routing and forwarding architectures. Another example of an information centric architecture where proactive caching may be implemented is the Content Centric Networking (CCN) architecture. In this case the monitoring function would monitor content interest packets and, in the case of a caching decision, it would command the cache to send its own interest packet(s) for the same content. After receiving the content, the cache would advertise the content to the CCN node(s), e.g. the register in the above embodiments, which thereafter could redirect interest packets for the cached content to the cache.
Accordingly, many of the embodiments describe a proactive caching mechanism that enhances scalability, resource usage and performance as usage of networks for content delivery and requests is substantially increasing. As will be apparent from the above described embodiments, the proactive caching mechanism is in contrast to traditional caching that is carried out presently in electronic devices and on networks. In particular, the proactive caching mechanism does not require the caches to be on the content delivery path, as required by traditional caching techniques. In the proactive caching mechanism the monitoring function, e.g. the monitor, monitors all the content requests, but the returning content from the content provider to the user equipment requesting the content does not need to go through the monitoring function or the cache.

Further benefits of the proactive caching mechanism are gained when there are multiple caches distributed in the network, where each of the caches may respond to instructions from the monitoring function(s) as needed. The monitoring function could, for example, instruct caches that do not yet have the particular content to fetch the content from those caches that already have it. This lowers the transit costs and enables load balancing of the traffic patterns within a network (e.g. avoiding congested links for the content downloading). Another benefit is that once the content is proactively cached, and therefore stored in a cache, the content can be replicated to a number of other caches by registering each new instance of the content with the register. Thus, the cached content may efficiently and effectively be stored closer to the demand.
By proactively caching content that is popular, e.g. being requested a predefined number of times, a network operator may save in communication costs, because the content may be provided by a cache that is inside the network operator's network. In other words, by effectively and proactively bringing a content source closer to the demand for the content the network operator may receive efficiencies in cost, resource usage, latency and performance.
The proactive caching mechanism may be used as additional functionality for the existing networks, such as the Internet, and in parallel with the existing caching mechanisms.
While preferred embodiments of the invention have been shown and described, it will be understood that such embodiments are described by way of example only. Numerous variations, changes and substitutions will occur to those skilled in the art without departing from the scope of the present invention as defined by the appended claims. Accordingly, it is intended that the following claims cover all such variations or equivalents as fall within the spirit and the scope of the invention.

Claims:
1. A method comprising the steps of:
monitoring one or more requests for content;
determining whether to proactively cache said content based on one or more first policies; and
transmitting an instruction message to one or more caches to cache said content if said determination to proactively cache said content is positive.
2. The method as claimed in claim 1, in which said step of monitoring one or more requests for content further comprises the step of:
analysing each request for content to identify content being requested.
3. The method as claimed in any one of the preceding claims further comprising the steps of:
identifying an address of one or more caches based on one or more second policies; and
wherein said instruction message is transmitted to said identified one or more caches.
4. The method as claimed in any one of the preceding claims further comprising the steps of:
determining whether said content has been proactively cached in a cache; and redirecting said request for content to said cache if said determination that said content has been proactively cached in a cache is positive.
5. The method as claimed in any one of the preceding claims further comprising the steps of:
receiving a register message from said one or more caches that have cached the content; and
updating a register based on said register message.
6. An apparatus comprising:
a first processor adapted to monitor one or more requests for content;
a second processor adapted to determine whether to proactively cache said content based on one or more first policies; and
a first output adapted to transmit an instruction message to one or more caches to cache said content if said determination to proactively cache said content is positive.
7. The apparatus as claimed in claim 6, in which said first processor is further adapted to analyse each request for content to identify content being requested.
8. The apparatus as claimed in any one of claims 6 or 7 further comprising:
a third processor adapted to identify an address of one or more caches based on one or more second policies; and said output is adapted to transmit said instruction message to said identified one or more caches.
9. The apparatus as claimed in any one of claims 6 to 8 further comprising:
a fourth processor adapted to determine whether said content has been proactively cached in a cache; and
a second output adapted to redirect said request for content to said cache if said determination that said content has been proactively cached in a cache is positive.
10. The apparatus as claimed in any one of claims 6 to 9 further comprising:
an input adapted to receive a register message from said one or more caches that have cached the content; and
a fifth processor adapted to update a register based on said register message.
11. A computer program product comprising computer readable executable code for: monitoring one or more requests for content;
determining whether to proactively cache said content based on one or more first policies; and
transmitting an instruction message to one or more caches to cache said content if said determination to proactively cache said content is positive.
12. A method comprising the steps of:
receiving a request for content;
determining whether said content has been proactively cached in a cache; and redirecting said request for content to said cache if said determination that said content has been proactively cached in a cache is positive.
13. The method as claimed in claim 12 in which said step of determining whether said content has been proactively cached in a cache further comprises the step of:
checking a register, wherein said register maintains an indication of content that has been cached.
14. The method as claimed in claim 13 further comprising the steps of:
receiving a register message from said cache that has cached said content, wherein said register message includes an identification of said cache and said content; and
updating said register based on said register message.
15. The method as claimed in claim 13 or 14 further comprising the steps of:
receiving an unregister message from said cache that has cached said content, wherein said unregister message includes an identification of said cache and of content that said cache has removed from said cache; and
updating said register based on said unregister message.
16. The method as claimed in any one of claims 12 to 15 in which said step of redirecting said request for content to said cache further comprises:
identifying an address of said cache that has cached said content and redirecting said request for content to said identified address.
17. The method as claimed in any one of claims 12 to 16 further comprising the steps of:
monitoring said request for content;
determining whether to proactively cache said content based on one or more policies; and
transmitting an instruction message to one or more of said caches to cache said content if said determination to proactively cache said content is positive.
18. An apparatus comprising:
a first input adapted to receive a request for content;
a first processor adapted to determine whether said content has been proactively cached in a cache; and
a first output adapted to redirect said request for content to said cache if said determination that said content has been proactively cached in a cache is positive.
19. The apparatus as claimed in claim 18 in which said first processor is further adapted to check a register, wherein said register maintains an indication of content that has been cached.
20. The apparatus as claimed in claim 19 further comprising:
a second input adapted to receive a register message from said cache that has cached said content, wherein said register message includes an identification of said cache and said content; and
a second processor adapted to update said register based on said register message.
21. The apparatus as claimed in claim 19 or 20 further comprising:
a third input adapted to receive an unregister message from said cache that has cached said content, wherein said unregister message includes an identification of said cache and of content that said cache has removed from said cache; and
a third processor adapted to update said register based on said unregister message.
22. The apparatus as claimed in any one of claims 18 to 21 further comprising:
a fourth processor adapted to identify an address of said cache that has cached said content and said first output is further adapted to redirect said request for content to said identified address.
23. The apparatus as claimed in any one of claims 18 to 22 further comprising:
a fifth processor adapted to monitor said request for content;
a sixth processor adapted to determine whether to proactively cache said content based on one or more policies; and
a second output adapted to transmit an instruction message to one or more of said caches to cache said content if said determination to proactively cache said content is positive.
24. A computer program product comprising computer readable executable code for: receiving a request for content; determining whether said content has been proactively cached in a cache; and redirecting said request for content to said cache if said determination that said content has been proactively cached in a cache is positive.
25. A method comprising the steps of:
receiving a first instruction message from a monitoring function wherein said first instruction message includes an instruction to proactively cache content;
transmitting a request for said content based on said first instruction message; receiving said requested content; and
transmitting a register message to a register function so as to register said proactively cached content with said register.
26. The method as claimed in claim 25 in which said request for said content is transmitted to a content provider or a cache that has cached said content.
27. The method as claimed in claim 25 or 26 further comprising the steps of:
transmitting a second instruction message to one or more cache to instruct said one or more cache to proactively cache said content.
28. An apparatus comprising:
a first input adapted to receive a first instruction message from a monitoring function wherein said first instruction message includes an instruction to proactively cache content;
a first output adapted to transmit a request for said content based on said first instruction message;
a second input adapted to receive said requested content; and
a second output adapted to transmit a register message to a register function so as to register said proactively cached content with said register.
29. The apparatus as claimed in claim 28 in which said first output is further adapted to transmit said request for said content to a content provider or a cache that has cached said content.
30. The apparatus as claimed in claim 28 or 29 further comprising:
a third output adapted to transmit a second instruction message to one or more cache to instruct said one or more cache to proactively cache said content.
31. A computer program product comprising computer readable executable code for: receiving a first instruction message from a monitoring function wherein said first instruction message includes an instruction to proactively cache content;
transmitting a request for said content based on said first instruction message; receiving said requested content; and
transmitting a register message to a register function so as to register said proactively cached content with said register.
32. A method comprising the steps of: monitoring one or more requests for content;
determining whether to proactively cache said content based on one or more first policies and transmitting an instruction message to one or more caches to cache said content if said determination to proactively cache said content is positive;
wherein if said determination to proactively cache said content is negative then determining whether said content has been proactively cached in a cache and redirecting said request for content to said cache if said determination that said content has been proactively cached in a cache is positive.
33. An apparatus comprising:
a first processor adapted to monitor one or more requests for content;
a second processor adapted to determine whether to proactively cache said content based on one or more first policies and a first output adapted to transmit an instruction message to one or more caches to cache said content if said determination to proactively cache said content is positive;
wherein if said determination to proactively cache said content is negative then said apparatus further comprises:
a third processor adapted to determine whether said content has been proactively cached in a cache and a second output adapted to redirect said request for content to said cache if said determination that said content has been proactively cached in a cache is positive.
34. A computer program product comprising computer readable executable code for: monitoring one or more requests for content;
determining whether to proactively cache said content based on one or more first policies and transmitting an instruction message to one or more caches to cache said content if said determination to proactively cache said content is positive;
wherein if said determination to proactively cache said content is negative then determining whether said content has been proactively cached in a cache and redirecting said request for content to said cache if said determination that said content has been proactively cached in a cache is positive.
PCT/EP2011/061116 2011-07-01 2011-07-01 Caching content WO2013004268A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2011/061116 WO2013004268A1 (en) 2011-07-01 2011-07-01 Caching content

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2011/061116 WO2013004268A1 (en) 2011-07-01 2011-07-01 Caching content

Publications (1)

Publication Number Publication Date
WO2013004268A1 true WO2013004268A1 (en) 2013-01-10

Family

ID=44584138

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2011/061116 WO2013004268A1 (en) 2011-07-01 2011-07-01 Caching content

Country Status (1)

Country Link
WO (1) WO2013004268A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103747054B (en) * 2013-12-26 2017-04-12 福建伊时代信息科技股份有限公司 Network data distribution device and system having the same
US10798167B2 (en) 2015-11-25 2020-10-06 International Business Machines Corporation Storage enhanced intelligent pre-seeding of information

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10008949A1 (en) * 2000-02-25 2001-10-04 Tellique Kommunikationstechnik Prestorage method for computer network information, involves storing computer network information in central cache memory so that information is available for call-up with computer
US20030195940A1 (en) * 2002-04-04 2003-10-16 Sujoy Basu Device and method for supervising use of shared storage by multiple caching servers



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11729115

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11729115

Country of ref document: EP

Kind code of ref document: A1