WO2018109270A1 - Apparatus, computer program and method for controlling media system of meeting space - Google Patents

Apparatus, computer program and method for controlling media system of meeting space Download PDF

Info

Publication number
WO2018109270A1
Authority
WO
WIPO (PCT)
Prior art keywords
meeting
user
meeting space
mobile apparatus
mobile
Prior art date
Application number
PCT/FI2017/050878
Other languages
French (fr)
Inventor
Jorma KIVELÄ
Original Assignee
Jutel Oy
Priority date
Filing date
Publication date
Application filed by Jutel Oy
Publication of WO2018109270A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1822Conducting the conference, e.g. admission, detection, selection or grouping of participants, correlating users to one or more conference sessions, prioritising transmission
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1818Conference organisation arrangements, e.g. handling schedules, setting up parameters needed by nodes to attend a conference, booking network resources, notifying involved parties

Definitions

  • the mobile apparatus 140, 140A, 140B comprises a radio transceiver 152 implementing the communication with the apparatus 100.
  • the radio transceiver 152 comprises a cellular radio transceiver (communicating with technologies such as GSM, GPRS, EGPRS, WCDMA, UMTS, 3GPP, IMT, LTE, LTE-A, etc.) and/or a non-cellular radio transceiver (communicating with technologies such as Bluetooth, Bluetooth Low Energy, Wi-Fi, WLAN, etc.).
  • the person 160, 160A, 160B enters the meeting space 130 in order to participate in a meeting.
  • the user 160, 160A, 160B is carrying his/her personal mobile apparatus 140, 140A, 140B, or, alternatively, the user 160, 160A, 160B is given a mobile apparatus 140, 140A, 140B for use during the meeting.
  • the one or more memories 104 and the computer program code 106 of the apparatus 100 are configured to, with the one or more processors 102 of the apparatus 100, cause the apparatus 100 at least to perform the following four-step sequence of operations:
  • the user detector 108 may be implemented with any technology suitable for detecting the user 160, 160A, 160B, and/or his/her mobile apparatus 140, 140A, 140B.
  • the user meeting rights may determine the role (attendant, speaker, listener, etc.) of the user 160, 160A, 160B in the meeting, and, consequently, allowed actions (speak, vote, etc.).
  • the meeting service request may have been generated by an interaction of the user 160, 160A, 160B with a meeting application 148 in his/her mobile apparatus 140, 140A, 140B.
  • the media system 132 of the meeting space 130 may comprise loudspeakers, displays and/or other apparatus presenting audio and/or visual information to the attendees of the meeting.
  • the apparatus 100 may interact with, or even be a part of, an Internet of Sound® system developed by the applicant, Jutel Oy, for providing IP-network-based media processing and delivery.
  • These operations 120-122-124-126 implement a method illustrated in Figure 3, which starts in 300.
  • the method ends in 318 after the processing is finished, or, it may be looped back to the beginning, and the processing of the next user may be started from the operation 120, or looped back to the operation 124, and the processing of the next meeting service request may be started.
  • Further example embodiments are illustrated in Figure 4.
  • the operations are not strictly in chronological order, and some of the operations may be performed simultaneously or in an order differing from the given ones. Other functions may also be executed between the operations or within the operations and other data exchanged between the operations. Some of the operations or part of the operations may also be left out or replaced by a corresponding operation or a part of the operation. It should be noted that no special order of operations is required, except where necessary due to the logical requirements for the processing order.
  • the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by detecting a presence 302 of the user 160, 160A, 160B in the meeting space 130.
  • computer vision may be used to detect the user 160, 160A, 160B, and identify him/her with aid of a database 114 storing user data, such as facial images of users 160, 160A, 160B.
  • the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by detecting a presence 304 of the mobile apparatus 140, 140A, 140B in the meeting space 130.
  • the presence 304 of the mobile apparatus 140, 140A, 140B in the meeting space 130 may be detected with any suitable means for locating the mobile apparatus 140, 140A, 140B, such as by location services 206 detecting that the location of the mobile apparatus 140, 140A, 140B is in the meeting space 130 or in its immediate vicinity, for example.
  • the mobile apparatus 140, 140A, 140B may also be detected by its address (such as WLAN address, or Bluetooth MAC address).
  • the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by analyzing a code 306 received from the mobile apparatus 140, 140A, 140B.
  • the mobile apparatus 140, 140A, 140B transmits the code 306 to the apparatus 100 through a wireless network 202 or a media network 204, after the user 160, 160A, 160B has obtained the code 306, for example by receiving it in an e-mail message or in a text message, or after having perceived (seen or heard) it.
  • the code 306 may be displayed in the meeting space 130, for example, and the user may read it or photograph it (such as a QR code).
  • the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by detecting an interaction of the mobile apparatus 140, 140A, 140B with a discovery protocol 308 broadcasted in the meeting space 130.
  • the discovery protocol 308 may be a part of zero-configuration networking, wherein the mobile apparatus 140, 140A, 140B is able to connect to a computer network 202, 204 without manual operator intervention or a special configuration server.
  • the discovery protocol 308 is Bonjour, which provides service discovery, address assignment, and hostname resolution.
  • the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by receiving data from the mobile apparatus 140, wherein the data somehow identifies the meeting space 130.
  • a camera 158 of the mobile apparatus 140 captures live video of a display 214 in the meeting space 130, and by analyzing the live video, the meeting space 130 is identified. The identification may be based on recognizing a certain feature from the video, such as an image or a series of images or characters shown in the display 214, a certain pattern of colours shown in the display 214, a certain pattern of flickering shown in the display 214, etc.
  • Another example embodiment provides a gesture performed with the mobile apparatus 140, which identifies the meeting space 130.
  • it is announced in the meeting space 130 (by an image or by an audio announcement, for example) that the meeting space identification is number eight, and the user 160 moves his/her mobile apparatus 140 in the air to form a figure eight, which is registered by an inertial measurement unit (including an acceleration sensor and possibly also a magnetometer and/or a gyroscope) of the mobile apparatus 140 and transmitted to the apparatus 100 as an identification from a meeting application 148 of the mobile apparatus 140.
  • the apparatus 100 is further caused to detect 314 an exit of the user 160, 160A, 160B from the meeting space 130, and transmit 316 a meeting service termination order to the mobile apparatus 140, 140A, 140B.
  • the exit of the user 160, 160A, 160B may be detected by applying similar technologies as for detecting the arrival of the user 160, 160A, 160B.
  • the apparatus 100 is further caused to receive 400 a comment request as the service request 124, transmit 402 the comment request to a moderator 220 of the meeting, receive 404 a decision regarding the comment request from the moderator 220, and transmit 406 the decision regarding the comment request to the mobile apparatus 140A.
  • the comment request may be generated by the user 160A performing an appropriate user action (such as pressing a button labelled "REQUEST COMMENT") in the meeting application 148 present in the mobile apparatus 140A. This mechanism may also be utilized for other applicable meeting services such as written comments, voting, etc.
  • the apparatus 100 is further caused to, if the decision regarding the comment request is affirmative 408 YES, transmit 410 a configuration request to the mobile apparatus 140A to configure the mobile apparatus 140A to catch speech of the user 160A with a microphone 156 coupled with the mobile apparatus 140A and transmit 412 the speech wirelessly to the media system 132 of the meeting space 130, and configure 414 the media system 132 of the meeting space 130 to receive 416 the speech and output it through at least one audio output apparatus 212 located in the meeting space 130.
  • the microphone 156 is a built-in microphone of the mobile apparatus 140A.
  • the apparatus 100 is further caused to configure 414 the media system 132 of the meeting space 130 to broadcast 418 the speech wirelessly to a plurality of mobile apparatuses 140B located in the meeting space 130.
  • This example embodiment may augment or even replace the use of general loudspeakers 212 in the meeting space 130 as each user 160B may listen to the speech in his/her own mobile apparatus 140B, from a loudspeaker of the mobile apparatus 140B or from a (wired or wireless) earpiece coupled with the mobile apparatus 140B, for example.
  • the apparatus 100 is further caused to transmit 422 a configuration request to the mobile apparatus 140A to configure the mobile apparatus 140A to catch video of the user 160A with a camera 158 coupled with the mobile apparatus 140A and transmit 424 the video wirelessly to the media system 132 of the meeting space 130, and configure 414 the media system 132 of the meeting space 130 to receive 426 the video and output it through at least one video output apparatus 214 located in the meeting space 130.
  • the camera 158 is a built-in camera of the mobile apparatus 140A.
  • the apparatus 100 is further caused to configure 414 the media system 132 of the meeting space 130 to broadcast 420 the video wirelessly to a plurality of mobile apparatuses 140B located in the meeting space 130.
  • This example embodiment may augment or even replace the use of general displays 214 in the meeting space 130 as each user 160B may see the video in his/her own mobile apparatus 140B, in a display 150 of the mobile apparatus 140B, for example.
  • the apparatus 100 is further caused to determine 122 the user meeting rights of the user 160, 160A, 160B associated with the mobile apparatus 140, 140A, 140B in a form of software configuration information 310, transmit 312 the software configuration information to the mobile apparatus 140, 140A, 140B, and receive 124 the service request from an application 148 configured with the software configuration information 310 in the mobile apparatus 140, 140A, 140B.
  • the user 160, 160A, 160B may obtain beforehand a meeting application 148, which is then configured on-the-go with the software configuration information 310 to support various meeting spaces 130 as required.
  • the meeting application 148 may be a so-called stub, i.e., basic software with a communication interface and a configurator, which enables flexible configuration of the meeting application 148.
  • the meeting application 148 may acquire various functionalities (and their help functions) through the software configuration information 310. These functionalities may already exist in the mobile apparatus 140 and/or they may be downloaded from an external source such as the apparatus 100.
  • the attendants 160A, 160B of the meeting each have an attendant user interface 230A, 230B for interacting with the apparatus 100 and the meeting services provided by the apparatus 100.
  • the chair/operator 220 of the meeting has a control user interface 226 for interacting with the apparatus 100 and the meeting services provided by the apparatus 100, and also a microphone 222 coupled with a microphone amplifier 224.
  • the technician 216 responsible for the meeting, and/or the media system 132, and/or the meeting space 130 has a technician's user interface 218 for interacting with the apparatus 100 and the meeting services provided by the apparatus 100.
  • the roles of the technician 216 and the chair/operator 220 may also be combined.
  • a data network 200 may be a wired network, utilizing copper or optical fibre, for example.
  • the wired network 200 may even supply power (Power over Ethernet, PoE).
  • a wireless network 202 may be a wireless local area network and/or a mobile network.
  • a media network 204 may exist instead of or in addition to the wireless network 202 and it may be based on Bluetooth or some other wireless technology.
  • the location services 206 may utilize various location technologies, such as satellite-based location, cellular network location, indoor radio-beacon-based location, indoor magnetic positioning, etc.
  • the location services 206 may also utilize other electronic or optical mechanisms.
  • the location services 206 may also utilize an electronic seating arrangement management system.
  • the apparatus 100 may be coupled to external cloud services 210, which may further enhance the services provided for the meeting in the meeting space 130.
  • a firewall 208 may be utilized to protect the apparatus 100 from unauthorized traffic to and from the Internet.
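The zero-configuration discovery described above (discovery protocol 308, e.g. Bonjour) is in practice provided by the operating system. As a rough stand-in for how the apparatus 100 might announce a meeting space so that a mobile apparatus can detect it, the following sketch serializes an announcement and broadcasts it as a UDP datagram. The JSON message format, the port, and all function names are assumptions for illustration only, not the actual protocol.

```python
import json
import socket

DISCOVERY_PORT = 5353  # hypothetical choice; real mDNS/Bonjour also uses UDP 5353

def build_announcement(space_id, service_host, service_port):
    """Serialize a meeting-space announcement as a UDP payload."""
    return json.dumps({
        "service": "meeting-space",
        "space_id": space_id,
        "host": service_host,
        "port": service_port,
    }).encode("utf-8")

def parse_announcement(datagram):
    """Parse a received datagram; returns None for unrelated traffic."""
    try:
        msg = json.loads(datagram.decode("utf-8"))
    except (UnicodeDecodeError, json.JSONDecodeError):
        return None
    if msg.get("service") != "meeting-space":
        return None
    return msg

def broadcast_announcement(payload, port=DISCOVERY_PORT):
    """Send the announcement to the local broadcast address."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(payload, ("255.255.255.255", port))
```

A mobile apparatus listening on the same port would call parse_announcement on each received datagram and, on a match, connect back to the advertised host and port.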

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

Apparatus, computer program and method for controlling media system of meeting space. The apparatus (100) operates as follows: detect (120) an arrival of a user (160, 160A, 160B) to a meeting space (130); determine (122) user meeting rights of the user (160, 160A, 160B) associated with a mobile apparatus (140, 140A, 140B); receive (124) a meeting service request from the mobile apparatus (140, 140A, 140B); and process (126) the received meeting service request according to the determined user meeting rights to control a media system (132) of the meeting space (130).
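The four steps summarized above (detect 120 an arrival, determine 122 user meeting rights, receive 124 a meeting service request, process 126 it according to the rights) can be sketched in Python as follows. This is a minimal illustrative sketch, not the disclosed implementation: the class, the rights database, and the apparatus identifiers are all assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class MeetingRights:
    role: str                                       # e.g. "attendant", "speaker", "listener"
    allowed_actions: set = field(default_factory=set)  # e.g. {"speak", "vote"}

class MeetingController:
    """Sketch of operations 120-122-124-126 of the apparatus 100."""

    def __init__(self, rights_db):
        # rights_db: mapping from a mobile-apparatus identifier to MeetingRights
        self.rights_db = rights_db
        self.present = {}  # apparatus id -> rights of users currently in the space

    def detect_arrival(self, apparatus_id):
        """Operation 120: register a user's arrival in the meeting space,
        and operation 122: determine his/her meeting rights."""
        rights = self.rights_db.get(apparatus_id, MeetingRights("listener"))
        self.present[apparatus_id] = rights
        return rights

    def process_request(self, apparatus_id, action):
        """Operations 124-126: accept a meeting service request and apply
        it according to the determined rights."""
        rights = self.present.get(apparatus_id)
        if rights is None:
            return "rejected: not present"
        if action not in rights.allowed_actions:
            return "rejected: not permitted"
        return f"granted: {action}"
```

For instance, a user whose rights grant only the "listener" role has an empty allowed-actions set, so any service request is rejected until the rights database grants more.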

Description

Apparatus, computer program and method for controlling media system of meeting space
Field
The invention relates to an apparatus, a computer program, and a method.
Background
Although video conferences have become commonplace, traditional meetings still have their place in training and business, for example. As participants have accustomed themselves to tools implementing video conferences, further sophistication is desired to enhance a traditional meeting.
Brief description
The present invention seeks to provide an improved apparatus, computer program and method for controlling a media system of a meeting space.
According to an aspect of the present invention, there is provided an apparatus as specified in claim 1.
According to another aspect of the present invention, there is provided a computer program as specified in claim 11.
According to another aspect of the present invention, there is provided a method as specified in claim 12.
List of drawings
Example embodiments of the present invention are described below, by way of example only, with reference to the accompanying drawings, in which
Figure 1 illustrates example embodiments of an apparatus and a mobile apparatus;
Figure 2 illustrates further example embodiments of the apparatus and its operating environment; and
Figures 3 and 4 are flow-charts illustrating example embodiments of a method. Description of embodiments
The following embodiments are only examples. Although the specification may refer to "an" embodiment in several locations, this does not necessarily mean that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments. Furthermore, the words "comprising" and "including" should be understood as not limiting the described embodiments to consist of only those features that have been mentioned; such embodiments may also contain features/structures that have not been specifically mentioned.
Figure 1 illustrates example embodiments of an apparatus 100 and a mobile apparatus 140, and Figure 2 illustrates further example embodiments of the apparatus 100 and its operating environment.
In an example embodiment, the apparatus 100 is a computing device. The computing device 100 may be portable, mobile or stationary. It may be an independent apparatus or it may be more or less integrated with another object, such as a media system 132. A non-limiting list of example embodiments of the computing device 100 comprises: a computer, a portable computer, a laptop, a mobile phone, a smartphone, a tablet computer, or any other portable/mobile/stationary computing device capable of controlling a media system 132 of a meeting space 130.
In an example embodiment, the apparatus 100 is a personal computing device operated by a chair/operator 220 of a meeting, or a technician 216 responsible for the meeting, and/or the media system 132, and/or the meeting space 130.
In an example embodiment, the apparatus 100 is a computing server. The computing server 100 may be implemented with any applicable technology. It may include one or more centralized computing apparatuses 100, or more than one distributed computing apparatus 100. It may be implemented with client-server technology, or in a cloud computing environment, or with another technology applicable to the computing server 100 capable of controlling the media system 132 of the meeting space 130.
In an example embodiment, the mobile apparatus 140, 140A, 140B is a personal computing device of a user 160, 160A, 160B. It may be portable. A non-limiting list of example embodiments of the mobile apparatus 140, 140A, 140B comprises: a computer, a portable computer, a laptop, a mobile phone, a smartphone, a tablet computer, a smartwatch, smartglasses, or any other portable/mobile computing device, which may be manipulated by the user 160, 160A, 160B.
In an example embodiment, the apparatus 100 and/or the mobile apparatus 140, 140A, 140B is a general-purpose off-the-shelf computing device, as opposed to purpose-built proprietary equipment, whereby research & development costs will be lower as only the special-purpose software (and not the hardware) needs to be designed, implemented and tested.
In an example embodiment, the apparatus 100 and/or the mobile apparatus 140, 140A, 140B employs a suitable operating system such as Windows, iOS or Android, for example.
The apparatus 100 comprises one or more processors 102, and one or more memories 104 including computer program code 106.
The mobile apparatus 140 also comprises one or more processors 142, and one or more memories 144 including computer program code 146.
The term 'processor' 102, 142 refers to a device that is capable of processing data. Depending on the processing power needed, the apparatus 100 or the mobile apparatus 140, 140A, 140B may comprise several processors 102, 142, such as parallel processors or a multicore processor. When designing the implementation of the processor 102, 142, a person skilled in the art will consider the requirements set for the size and power consumption of the apparatus 100 or the mobile apparatus 140, 140A, 140B, the necessary processing capacity, production costs, and production volumes, for example.
The term 'memory' 104, 144 refers to a device that is capable of storing data at run time (= working memory) or permanently (= non-volatile memory). The working memory and the non-volatile memory may be implemented by a random-access memory (RAM), dynamic RAM (DRAM), static RAM (SRAM), a flash memory, a solid state disk (SSD), PROM (programmable read-only memory), a suitable semiconductor, or any other means of implementing an electrical computer memory.
The processor 102, 142 and the memory 104, 144 may be implemented by an electronic circuitry. A non-exhaustive list of implementation techniques for the processor 102, 142 and the memory 104, 144 includes, but is not limited to: logic components, standard integrated circuits, application-specific integrated circuits (ASIC), system-on-a-chip (SoC), application-specific standard products (ASSP), microprocessors, microcontrollers, digital signal processors, special-purpose computer chips, field-programmable gate arrays (FPGA), and other suitable electronics structures.
The computer program code 106, 146 may be implemented by software and/or hardware. In an example embodiment, the software may be written in a suitable programming language (a high-level programming language, such as C, C++, or Java, or a low-level programming language, such as a machine language or an assembler, for example), and the resulting executable code 106, 146 may be stored in the memory 104, 144 and run by the processor 102, 142. In an alternative example embodiment, the functionality of the hardware may be designed in a suitable hardware description language (such as Verilog or VHDL), and transformed into a gate-level netlist (describing standard cells and the electrical connections between them), and after further phases the chip implementing the processor 102, 142, memory 104, 144 and the code 106, 146 of the apparatus 100 or the mobile apparatus 140, 140A, 140B may be fabricated with photo masks describing the circuitry.
An example embodiment provides a computer-readable medium 112 comprising the computer program code 106, which, when loaded into the apparatus 100 and executed by the apparatus 100, causes the apparatus 100 to perform processing of the example embodiments. An example embodiment provides a computer-readable medium 154 comprising the computer program code 146, which, when loaded into the mobile apparatus 140, 140A, 140B and executed by the mobile apparatus 140, 140A, 140B, causes the mobile apparatus 140, 140A, 140B to perform processing of the example embodiments.
In an example embodiment, the operations of the computer program code 106, 146 may be divided into functional modules, sub-routines, methods, classes, objects, applets, macros, etc., depending on the software design methodology and the programming language used. In modern programming environments, there are software libraries, i.e. compilations of ready-made functions, which may be utilized by the computer program code 106, 146 for performing a wide variety of standard operations. In an example embodiment, the computer program code 106, 146 may be in source code form, object code form, executable file, or in some intermediate form. The computer-readable medium 112, 154 may comprise at least the following: any entity or device capable of carrying the computer program code 106, 146 to the apparatus 100 or to the mobile apparatus 140, 140A, 140B, a record medium, a computer memory, a read-only memory, an electrical carrier signal, a telecommunications signal, and a software distribution medium. In some jurisdictions, depending on the legislation and the patent practice, the computer-readable medium 112, 154 may not be the telecommunications signal. In an example embodiment, the computer-readable medium 112, 154 may be a non-transitory computer-readable storage medium.
In an example embodiment, the mobile apparatus 140, 140A, 140B comprises a user interface 150 implementing the exchange 164 of graphical, textual and/or auditory information with the user 160, 160A, 160B. The user interface 150 may be used to perform required user actions in relation to controlling the media system 132 of the meeting space 130. The user interface 150 may be realized with various techniques, such as a (multi-touch) display, means for producing sound, a keyboard, and/or a keypad, for example. The means for producing sound may be a loudspeaker or a simpler means for producing beeps or other sound signals. The keyboard/keypad may comprise a complete (QWERTY) keyboard, a mere numeric keypad or only a few push buttons and/or rotary buttons. In addition, or alternatively, the user interface may comprise other user interface components, for example various means for focusing a cursor (mouse, track ball, arrow keys, touch sensitive area etc.) or elements enabling audio control.
In an example embodiment, the mobile apparatus 140, 140A, 140B comprises a radio transceiver 152 implementing the communication with the apparatus 100. In an example embodiment, the radio transceiver 152 comprises a cellular radio transceiver (communicating with technologies such as GSM, GPRS, EGPRS, WCDMA, UMTS, 3GPP, IMT, LTE, LTE-A, etc.) and/or a non-cellular radio transceiver (communicating with technologies such as Bluetooth, Bluetooth Low Energy, Wi-Fi, WLAN, etc.).
Now that the basic equipment, the apparatus 100 and the mobile apparatus 140, 140A, 140B, has been described, let us proceed to describe an example embodiment of a typical use case.
The person 160, 160A, 160B enters the meeting space 130 in order to participate in a meeting. The user 160, 160A, 160B is carrying his/her personal mobile apparatus 140, 140A, 140B, or, alternatively, the user 160, 160A, 160B is given a mobile apparatus 140, 140A, 140B for use during the meeting.
The one or more memories 104 and the computer program code 106 of the apparatus 100 are configured to, with the one or more processors 102 of the apparatus 100, cause the apparatus 100 at least to perform the following four-step sequence of operations:
120) Detect an arrival 170/172 of the user 160, 160A, 160B to the meeting space 130. The user detector 108 may be implemented with any technology suitable for detecting the user 160, 160A, 160B, and/or his/her mobile apparatus 140, 140A, 140B.
122) Determine user meeting rights of the user 160, 160A, 160B associated 162 with the mobile apparatus 140, 140A, 140B. The user meeting rights may determine the role (attendant, speaker, listener, etc.) of the user 160, 160A, 160B in the meeting, and, consequently, allowed actions (speak, vote, etc.).

124) Receive a meeting service request 174 from the mobile apparatus 140, 140A, 140B. The meeting service request may have been generated by an interaction of the user 160, 160A, 160B with a meeting application 148 in his/her mobile apparatus 140, 140A, 140B.
126) Process the received meeting service request according to the determined user meeting rights to control 176 the media system 132 of the meeting space 130. The media system 132 of the meeting space 130 may comprise loudspeakers, displays and/or other apparatus presenting audio and/or visual information to the attendees of the meeting.
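The four-step sequence 120-126 can be sketched as follows. This is a hedged illustration only: the class, method, and message names are assumptions introduced for clarity, not an implementation disclosed in the patent.

```python
# Illustrative sketch of operations 120-126: detect arrival, determine
# rights, receive a service request, and process it against those rights.
# All identifiers here are hypothetical, not from the patent text.
from dataclasses import dataclass, field


@dataclass
class MeetingRights:
    role: str                                   # e.g. "attendant", "speaker", "listener"
    allowed: set = field(default_factory=set)   # allowed actions, e.g. {"speak", "vote"}


class MeetingController:
    def __init__(self, rights_db):
        self.rights_db = rights_db   # maps user id -> MeetingRights
        self.present = {}            # users currently detected in the meeting space

    def detect_arrival(self, user_id):                      # operation 120
        # Unknown users default to a listener role with no allowed actions.
        self.present[user_id] = self.rights_db.get(user_id, MeetingRights("listener"))

    def determine_rights(self, user_id):                    # operation 122
        return self.present[user_id]

    def process_request(self, user_id, action):             # operations 124-126
        rights = self.determine_rights(user_id)
        if action in rights.allowed:
            return f"media-system: grant {action} to {user_id}"
        return f"media-system: deny {action} to {user_id}"
```

A controller configured this way would, for example, grant a speak request from a registered speaker while denying the same request from a plain listener.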
In an example embodiment, the apparatus 100 may interact with, or even be a part of, an Internet of Sound® system developed by the applicant, Jutel Oy, providing IP-network-based media processing and delivery.
These operations 120-122-124-126 implement a method illustrated in Figure 3, which starts in 300. The method ends in 318 after the processing is finished, or, it may be looped back to the beginning, and the processing of the next user may be started from the operation 120, or looped back to the operation 124, and the processing of the next meeting service request may be started. Further example embodiments are illustrated in Figure 4. The operations are not strictly in chronological order, and some of the operations may be performed simultaneously or in an order differing from the given ones. Other functions may also be executed between the operations or within the operations and other data exchanged between the operations. Some of the operations or part of the operations may also be left out or replaced by a corresponding operation or a part of the operation. It should be noted that no special order of operations is required, except where necessary due to the logical requirements for the processing order.
In an example embodiment, the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by detecting a presence 302 of the user 160, 160A, 160B in the meeting space 130. In an example embodiment, computer vision may be used to detect the user 160, 160A, 160B, and identify him/her with aid of a database 114 storing user data, such as facial images of users 160, 160A, 160B.
In an example embodiment, the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by detecting a presence 304 of the mobile apparatus 140, 140A, 140B in the meeting space 130. The presence 304 of the mobile apparatus 140, 140A, 140B in the meeting space 130 may be detected with any suitable means for locating the mobile apparatus 140, 140A, 140B, such as location services 206 detecting that the location of the mobile apparatus 140, 140A, 140B is in the meeting space 130 or in its immediate vicinity, for example. The mobile apparatus 140, 140A, 140B may also be detected by its address (such as a WLAN address or a Bluetooth MAC address).
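Address-based presence detection could be sketched as below. The device registry and the address normalization scheme are assumptions for illustration; the patent does not prescribe how observed addresses are matched to users.

```python
# Hypothetical sketch: match observed radio addresses (e.g. WLAN or
# Bluetooth MAC addresses) against a registry of users' registered devices.

def normalize_mac(addr: str) -> str:
    """Canonicalize a MAC address to lowercase colon-separated form."""
    digits = addr.replace("-", "").replace(":", "").lower()
    return ":".join(digits[i:i + 2] for i in range(0, 12, 2))


def detect_present_users(observed_addrs, registry):
    """Return the user ids whose registered device address was observed."""
    seen = {normalize_mac(a) for a in observed_addrs}
    return {user for user, addr in registry.items()
            if normalize_mac(addr) in seen}
```

Normalization lets the same device match whether the scanner reports `AA:BB:...` or `aa-bb-...` notation.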
In an example embodiment, the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by analyzing a code 306 received from the mobile apparatus 140, 140A, 140B. In this example embodiment, the mobile apparatus 140, 140A, 140B transmits the code 306 to the apparatus 100 through a wireless network 202 or a media network 204 after the user 160, 160A, 160B has obtained the code by receiving it in an e-mail message or a text message, or after having perceived (seen or heard) the code 306. The code 306 may be displayed in the meeting space 130, for example, and the user may read it or photograph it (such as a QR code).
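One possible way the apparatus could issue and verify such a code is an HMAC-derived short code; this scheme and the key handling are purely assumed for illustration and are not part of the patent text.

```python
# Hedged sketch: derive a short per-meeting code (e.g. for display as a QR
# code) and verify a code received back from a mobile apparatus.
import hashlib
import hmac

SECRET = b"per-deployment secret"   # assumed; would come from configuration


def issue_code(meeting_id: str) -> str:
    """Derive a short code bound to a specific meeting."""
    mac = hmac.new(SECRET, meeting_id.encode(), hashlib.sha256)
    return mac.hexdigest()[:8]


def verify_code(meeting_id: str, code: str) -> bool:
    """Check a received code using a constant-time comparison."""
    return hmac.compare_digest(issue_code(meeting_id), code)
```

Binding the code to the meeting identifier means a code obtained for one meeting space does not grant entry to another.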
In an example embodiment, the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by detecting an interaction of the mobile apparatus 140, 140A, 140B with a discovery protocol 308 broadcasted in the meeting space 130. The discovery protocol 308 may be a part of a zero-configuration networking, wherein the mobile apparatus 140, 140A, 140B is able to connect to a computer network 202, 204 without a manual operator intervention or a special configuration server. In an example embodiment, the discovery protocol 308 is Bonjour, which provides service discovery, address assignment, and hostname resolution.

In an example embodiment, the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by receiving data from the mobile apparatus 140, wherein the data identifies the meeting space 130. For example: a camera 158 of the mobile apparatus 140 captures live video of a display 214 in the meeting space 130, and by analyzing the live video, the meeting space 130 is identified. The identification may be based on recognizing a certain feature from the video, such as an image or a series of images or characters shown in the display 214, a certain pattern of colours shown in the display 214, a certain pattern of flickering shown in the display 214, etc.

Another example embodiment provides a gesture performed with the mobile apparatus 140, which identifies the meeting space 130.
For example: in the meeting space 130 is announced (by an image or by an announcement, for example) that the meeting space identification is number eight, and the user 160 moves his/her mobile apparatus 140 in air to form a figure eight, which is noted with an inertial measurement unit (including an acceleration sensor and possibly also a magnetometer and/or a gyroscope) of the mobile apparatus 140, and transmitted to the apparatus 100 as an identification from a meeting application 148 of the mobile apparatus 140.
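The broadcast discovery mechanism can be illustrated with a simplified stand-in. The patent names Bonjour; the plain key-value beacon format below is an assumed simplification used only to show the announce/parse round trip.

```python
# Assumed, simplified discovery-beacon format: the apparatus 100 would
# broadcast such a payload in the meeting space, and the meeting
# application 148 would parse it to find the service.

def build_beacon(space_id: str, host: str, port: int) -> bytes:
    """Serialize the announcement broadcast in the meeting space."""
    return f"MEETSVC|space={space_id}|host={host}|port={port}".encode()


def parse_beacon(payload: bytes):
    """Parse a beacon back into a dict, or return None if it is foreign."""
    parts = payload.decode().split("|")
    if parts[0] != "MEETSVC":
        return None
    fields = dict(p.split("=", 1) for p in parts[1:])
    fields["port"] = int(fields["port"])
    return fields
```

A real deployment would instead rely on an established protocol such as mDNS/DNS-SD, which additionally handles address assignment and hostname resolution.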
In an example embodiment, the apparatus 100 is further caused to detect 314 an exit of the user 160, 160A, 160B from the meeting space 130, and transmit 316 a meeting service termination order to the mobile apparatus 140, 140A, 140B. The exit of the user 160, 160A, 160B may be detected by applying technologies similar to those used for detecting the arrival of the user 160, 160A, 160B.
In an example embodiment, the apparatus 100 is further caused to receive 400 a comment request as the service request 124, transmit 402 the comment request to a moderator 220 of the meeting, receive 404 a decision regarding the comment request from the moderator 220, and transmit 406 the decision regarding the comment request to the mobile apparatus 140A. The comment request may be generated by the user 160A performing an appropriate user action (such as pressing a button labelled "REQUEST COMMENT") in the meeting application 148 present in the mobile apparatus 140A. This mechanism may also be utilized for other applicable meeting services such as written comments, voting, etc.
In an example embodiment, the apparatus 100 is further caused to, if the decision regarding the comment request is affirmative 408 YES, transmit 410 a configuration request to the mobile apparatus 140A to configure the mobile apparatus 140A to catch speech of the user 160A with a microphone 156 coupled with the mobile apparatus 140A and transmit 412 the speech wirelessly to the media system 132 of the meeting space 130, and configure 414 the media system 132 of the meeting space 130 to receive 416 the speech and output it through at least one audio output apparatus 212 located in the meeting space 130. In an example embodiment, the microphone 156 is a built-in microphone of the mobile apparatus 140A.
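The comment-request flow 400-416 described above can be sketched as follows. The callback-based moderator decision and the logged message strings are assumptions chosen to make the control flow visible; they are not the patent's actual signalling.

```python
# Hypothetical sketch of operations 400-416: forward a comment request to
# the moderator, relay the decision, and on an affirmative decision
# configure the mobile apparatus's microphone and the media system.

class CommentFlow:
    def __init__(self, moderator_decides):
        self.moderator_decides = moderator_decides  # callback: user_id -> bool
        self.log = []                               # messages sent, for inspection

    def handle_comment_request(self, user_id):
        self.log.append(f"to-moderator: comment request from {user_id}")   # 402
        decision = self.moderator_decides(user_id)                         # 404
        self.log.append(f"to-{user_id}: decision={decision}")              # 406
        if decision:                                                       # 408 YES
            self.log.append(f"to-{user_id}: configure microphone 156")     # 410
            self.log.append("media-system: route speech to output 212")    # 414
        return decision
```

The same request/decision/configure pattern would carry over to the other meeting services mentioned, such as written comments or voting.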
In an example embodiment, the apparatus 100 is further caused to configure 414 the media system 132 of the meeting space 130 to broadcast 418 the speech wirelessly to a plurality of mobile apparatuses 140B located in the meeting space 130. This example embodiment may augment or even replace the use of general loudspeakers 212 in the meeting space 130 as each user 160B may listen to the speech in his/her own mobile apparatus 140B, from a loudspeaker of the mobile apparatus 140B or from a (wired or wireless) earpiece coupled with the mobile apparatus 140B, for example.
In an example embodiment, the apparatus 100 is further caused to transmit 422 a configuration request to the mobile apparatus 140A to configure the mobile apparatus 140A to catch video of the user 160A with a camera 158 coupled with the mobile apparatus 140 and transmit 424 the video wirelessly to the media system 132 of the meeting space 130, and configure 414 the media system 132 of the meeting space 130 to receive 426 the video and output it through at least one video output apparatus 214 located in the meeting space 130. In an example embodiment, the camera 158 is a built-in camera of the mobile apparatus 140A.
In an example embodiment, the apparatus 100 is further caused to configure 414 the media system 132 of the meeting space 130 to broadcast 420 the video wirelessly to a plurality of mobile apparatuses 140B located in the meeting space 130. This example embodiment may augment or even replace the use of general displays 214 in the meeting space 130 as each user 160B may see the video in his/her own mobile apparatus 140B, in a display 150 of the mobile apparatus 140B, for example.
In an example embodiment, the apparatus 100 is further caused to determine 122 the user meeting rights of the user 160, 160A, 160B associated with the mobile apparatus 140, 140A, 140B in the form of software configuration information 310, transmit 312 the software configuration information to the mobile apparatus 140, 140A, 140B, and receive 124 the service request from an application 148 configured with the software configuration information 310 in the mobile apparatus 140, 140A, 140B. With this example embodiment, the user 160, 160A, 160B may obtain the meeting application 148 beforehand, and it is then configured on-the-go with the software configuration information 310 to support various meeting spaces 130 as required. The meeting application 148 may be a so-called stub, i.e., basic software with a communication interface and a configurator, which enables flexible configuration of the meeting application 148. The meeting application 148 may gain various functionalities (and their help functions) through the software configuration information 310. These functionalities may already exist in the mobile apparatus 140 and/or they may be downloaded from an external source such as the apparatus 100.
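A stub application configured on the fly could look roughly like the sketch below. The dict-based configuration format and the feature catalog are assumptions; the patent does not specify the shape of the software configuration information 310.

```python
# Illustrative sketch of a "stub" meeting application 148 whose
# functionalities are enabled by received software configuration
# information 310. All names and the config format are hypothetical.

class StubMeetingApp:
    def __init__(self):
        self.features = {}   # functionality name -> handler

    def configure(self, config_info):
        """Enable only the functionalities the user's meeting rights allow."""
        catalog = {
            "speak": lambda: "request to speak sent",
            "vote": lambda: "vote cast",
        }
        for name in config_info.get("enabled", []):
            if name in catalog:
                self.features[name] = catalog[name]

    def invoke(self, name):
        handler = self.features.get(name)
        return handler() if handler else "not available for this user"
```

The same stub binary thus presents a listener with no active controls and a speaker with the full set, purely as a function of the configuration it receives.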
Finally, let us study Figure 2 illustrating further example embodiments of the apparatus 100 and its operating environment.
The attendants 160A, 160B of the meeting each have an attendant user interface 230A, 230B for interacting with the apparatus 100 and the meeting services provided by the apparatus 100. The chair/operator 220 of the meeting has a control user interface 226 for interacting with the apparatus 100 and the meeting services provided by the apparatus 100, and also a microphone 222 coupled with a microphone amplifier 224. The technician 216 responsible for the meeting, and/or the media system 132, and/or the meeting space 130 has a technician's user interface 218 for interacting with the apparatus 100 and the meeting services provided by the apparatus 100. The roles of the technician 216 and the chair/operator 220 may also be combined.
A data network 200 may be a wired network, utilizing copper or optical fibre, for example. The wired network 200 may even supply power (Power over Ethernet, PoE).
A wireless network 202 may be a wireless local area network and/or a mobile network.
A media network 204 may exist instead of or in addition to the wireless network 202 and it may be based on Bluetooth or some other wireless technology.
The location services 206 may utilize various location technologies, such as satellite-based location, cellular network location, indoor radio-beacon-based location, indoor magnetic positioning, etc. The location services 206 may also utilize other electronic or optical mechanisms. The location services 206 may also utilize an electronic seating arrangement management system.
The apparatus 100 may be coupled to external cloud services 210, which may further enhance the services provided for the meeting in the meeting space 130.
A firewall 208 may be utilized to protect the apparatus 100 from unauthorized traffic from and to the Internet.
It will be obvious to a person skilled in the art that, as technology advances, the inventive concept can be implemented in various ways. The invention and its embodiments are not limited to the example embodiments described above but may vary within the scope of the claims.

Claims

1. An apparatus (100) comprising:
one or more processors (102); and
one or more memories (104) including computer program code (106), the one or more memories (104) and the computer program code
(106) configured to, with the one or more processors (102), cause the apparatus (100) at least to:
detect (120) an arrival of a user (160, 160A, 160B) to a meeting space
(130);
determine (122) user meeting rights of the user (160, 160A, 160B) associated with a mobile apparatus (140, 140A, 140B);
receive (124) a meeting service request from the mobile apparatus (140, 140A, 140B);
process (126) the received meeting service request according to the determined user meeting rights to control a media system (132) of the meeting space (130);
receive (400) a comment request as the service request (124);
transmit (402) the comment request to a moderator (220) of the meeting;
receive (404) a decision regarding the comment request from the moderator (220);
transmit (406) the decision regarding the comment request to the mobile apparatus (140A); and
if the decision regarding the comment request is affirmative (408 YES), transmit (410) a configuration request to the mobile apparatus (140A) to configure the mobile apparatus (140A) to catch speech of the user (160A) with a microphone (156) coupled with the mobile apparatus (140A) and transmit (412) the speech wirelessly to the media system (132) of the meeting space (130), and configure (414) the media system (132) of the meeting space (130) to receive (416) the speech and output it through at least one audio output apparatus (212) located in the meeting space (130).
2. The apparatus of claim 1, further caused to:
detect (120) the arrival of the user (160, 160A, 160B) to the meeting space (130) by detecting a presence (302) of the user (160, 160A, 160B) in the meeting space (130).
3. The apparatus of any preceding claim, further caused to: detect (120) the arrival of the user (160, 160A, 160B) to the meeting space (130) by detecting a presence (304) of the mobile apparatus (140, 140A, 140B) in the meeting space (130).
4. The apparatus of any preceding claim, further caused to: detect (120) the arrival of the user (160, 160A, 160B) to the meeting space (130) by analyzing a code (306) received from the mobile apparatus (140, 140A, 140B).
5. The apparatus of any preceding claim, further caused to: detect (120) the arrival of the user (160, 160A, 160B) to the meeting space (130) by detecting an interaction of the mobile apparatus (140, 140A, 140B) with a discovery protocol (308) broadcasted in the meeting space (130).
6. The apparatus of any preceding claim, further caused to: detect (314) an exit of the user (160, 160A, 160B) from the meeting space (130); and
transmit (316) a meeting service termination order to the mobile apparatus (140, 140A, 140B).
7. The apparatus of any preceding claim, further caused to: configure (414) the media system (132) of the meeting space (130) to broadcast (418) the speech wirelessly to a plurality of mobile apparatuses (140B) located in the meeting space (130).
8. The apparatus of any preceding claim, further caused to: transmit (422) a configuration request to the mobile apparatus (140A) to configure the mobile apparatus (140A) to catch video of the user (160A) with a camera (158) coupled with the mobile apparatus (140) and transmit (424) the video wirelessly to the media system (132) of the meeting space (130), and configure (414) the media system (132) of the meeting space (130) to receive (426) the video and output it through at least one video output apparatus (214) located in the meeting space (130).
9. The apparatus of claim 8, further caused to:
configure (414) the media system (132) of the meeting space (130) to broadcast (420) the video wirelessly to a plurality of mobile apparatuses (140B) located in the meeting space (130).
10. The apparatus of any preceding claim, further caused to: determine (122) the user meeting rights of the user (160, 160A, 160B) associated with the mobile apparatus (140, 140A, 140B) in a form of software configuration information (310);
transmit (312) the software configuration information to the mobile apparatus (140, 140A, 140B); and
receive (124) the service request from an application (148) configured with the software configuration information (310) in the mobile apparatus (140, 140A, 140B).
11. A computer-readable storage medium (112) comprising the computer program code (106) of any preceding claim 1 to 10, which, when loaded into the apparatus (100), causes the apparatus (100) to perform the described processing.
12. A method comprising:
detecting (120) an arrival of a user to a meeting space;
determining (122) user meeting rights of the user associated with a mobile apparatus;
receiving (124) a meeting service request from the mobile apparatus; processing (126) the received meeting service request according to the determined user meeting rights to control a media system of the meeting space;
receiving (400) a comment request as the service request; transmitting (402) the comment request to a moderator of the meeting; receiving (404) a decision regarding the comment request from the moderator;
transmitting (406) the decision regarding the comment request to the mobile apparatus; and
if the decision regarding the comment request is affirmative (408
YES), transmitting (410) a configuration request to the mobile apparatus to configure the mobile apparatus to catch speech of the user with a microphone coupled with the mobile apparatus, and transmitting (412) the speech wirelessly to the media system of the meeting space, and configuring (414) the media system of the meeting space to receive (416) the speech and output it through at least one audio output apparatus located in the meeting space.
PCT/FI2017/050878 2016-12-13 2017-12-12 Apparatus, computer program and method for controlling media system of meeting space WO2018109270A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20165958 2016-12-13

Publications (1)

Publication Number Publication Date
WO2018109270A1 2018-06-21

Family

ID=60943044


Country Status (1)

Country Link
WO (1) WO2018109270A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1085774A2 (en) * 1999-09-16 2001-03-21 AT&T Corp. H.323 Mobility architecture for terminal user and service mobility
US20140267559A1 (en) * 2013-03-14 2014-09-18 Microsoft Corporation Smart Device Pairing and Configuration for Meeting Spaces



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17826260; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17826260; Country of ref document: EP; Kind code of ref document: A1)