WO2020011694A1 - Determining light effects to be rendered simultaneously with a content item - Google Patents

Info

Publication number: WO2020011694A1
Authority: WO (WIPO, PCT)
Application number: PCT/EP2019/068208
Other languages: French (fr)
Prior art keywords: profile, light, user, determining, input
Legal status: Ceased (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Inventors: Shu JIAN, Weixi ZHOU, Bingzhou Chen, Dzmitry Viktorovich Aliakseyeu
Current assignee: Signify Holding BV (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Signify Holding BV
Application filed by Signify Holding BV
Publication of WO2020011694A1

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • H05B47/115Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/12Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by detecting audible sound
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/20Controlling the colour of the light
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the invention relates to a system for determining light effects.
  • the invention further relates to a method of determining light effects.
  • the invention also relates to a computer program product enabling a computer system to perform such a method.
  • Philips Hue is a consumer connected lighting solution.
  • the system consists of a central controller, named a bridge, wirelessly controllable light endpoints and user interfaces in various forms (switches, sensors and mobile apps).
  • the bridge is connected to the router of the user and communicates with the light points. It is capable of running schedules and home automation rules. In this way it acts as the intelligence of the system. All user interfaces connect to the bridge in order to actuate the lights.
  • Controlling connected lights via a touchscreen of a mobile device has become a very popular way of controlling connected lights.
  • with the introduction of smart speakers such as the Amazon Echo and the Google Home, and of the possibility of using their digital assistants to control connected lights via speech recognition, this way of controlling connected lights is also gaining popularity.
  • Smart speakers can not only be used to control connected lights; playing back music is one of the most popular applications of smart speakers.
  • US 2017/0265270 discloses controlling an operating parameter, e.g. color, of a light source as a function of operating data included in a retrieved operating data file.
  • the operating data comprises at least one lighting sequence and a time code data set coupled with the sequence(s).
  • the operating data file may be associated with music.
  • US 2017/0265270 further discloses taking into account possible preferences or needs of the end user.
  • the system comprises at least one processor and is configured to obtain an input representing a context, said input comprising one or more parameter values, determine an extent to which said input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, determine a profile based on said determined extent, determine said one or more light effects based on said profile and said content item, and cause one or more light sources to render said one or more light effects simultaneously with said content item.
  • the system may comprise one or more electronic devices.
  • a processor of a mobile device may be configured to obtain an input representing a context, said input comprising one or more parameter values, determine an extent to which said input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, and determine a profile based on said determined extent.
  • a processor of a light device may be configured to determine said one or more light effects based on said profile and said content item and to cause one or more light sources to render said one or more light effects simultaneously with said content item.
  • when said content item is music, it may determine changes of color and/or brightness of said one or more light effects, while said profile may determine the range of color and/or brightness and the range of the level of dynamicity.
  • the inventors have recognized that which light effects a user wants to be rendered simultaneously with a content item can be determined from context and that this context is usually not just one variable, but a combination of multiple variables.
  • the inventors have further recognized that the light effects should not or cannot be determined from the entire context. By determining and taking into account an extent to which the different input parameter values representing this context should be used to determine these light effects, the provided content experience can be customized in a flexible manner.
  • Said one or more parameter values may indicate a type of music currently being rendered, a type of activity currently being carried out by a user, the fullness of a user’s schedule, the urgency of one or more pending tasks, a user’s light effect preference, and/or a current setting for rendering said content item, for example.
  • Said profile may identify a dynamicity range to be used for said one or more light effects and/or a color range to be used for said one or more light effects, for example.
  • Said at least one processor may be configured to determine a current location of said user and determine said type of activity currently being carried out by said user based on said current location of said user.
  • the type of activity of the user, e.g. relaxing, working or cooking, regularly impacts which light effects are appreciated by the user.
  • Said at least one processor may be configured to allow a person (e.g. a user or administrator) to specify a degree of importance of each of said one or more parameter values and determine said extent based on said one or more degrees of importance. This gives control over and insight into the definition of a user’s preferences and allows the user’s preferences to be tuned to get the best content experience.
  • Said at least one processor may be configured to allow a person to specify an input relation between said one or more parameter values and each of a plurality of predefined profiles, determine said extent based on said plurality of input relations, and determine said profile further based on one or more of said plurality of predefined profiles. This provides an intuitive user interface for indicating priorities of different preferences.
  • Said at least one processor may be configured to determine a further profile based on said determined extent, determine one or more further light effects based on said further profile, and cause one or more further light sources to render said further light effects simultaneously with said content item. This allows multiple (groups of) light sources to be controlled differently (i.e. render different light effects) based on the same determined extent.
  • Said at least one processor may be configured to allow a person to specify a light relation between said one or more light sources and each of a plurality of predefined profiles, allow a person to specify a further light relation between said one or more further light sources and each of said plurality of predefined profiles, determine said profile further based on said light relation and one or more of said plurality of predefined profiles, and determine said further profile further based on said further light relation and one or more of said plurality of predefined profiles.
  • the predefined profiles do not comprise effect ranges per light source, but only effect ranges for all light sources (e.g. a color range for all light sources)
  • this intuitive user interface allows a person to specify a map indicating how a general profile for all light sources should result in a profile per light source, i.e. be distributed over the individual light sources.
  • the method comprises obtaining an input representing a context, said input comprising one or more parameter values, determining an extent to which said input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, determining a profile based on said determined extent, determining said one or more light effects based on said profile and said content item, and causing one or more light sources to render said one or more light effects simultaneously with said content item.
  • Said method may be performed by software running on a programmable device. This software may be provided as a computer program product.
  • Said one or more parameter values may indicate a type of music currently being rendered, a type of activity currently being carried out by a user, the fullness of a user’s schedule, the urgency of one or more pending tasks, a user’s light effect preference, and/or a current setting for rendering said content item.
  • Said profile may identify a dynamicity range to be used for said one or more light effects and/or a color range to be used for said one or more light effects.
  • Said method may further comprise determining a current location of said user and determining said type of activity currently being carried out by said user based on said current location of said user.
  • Said method may further comprise allowing a person to specify a degree of importance of each of said one or more parameter values, wherein said extent is determined based on said one or more degrees of importance.
  • Said method may further comprise allowing a person to specify an input relation between said one or more parameter values and each of a plurality of predefined profiles, wherein said extent is determined based on said plurality of input relations and said profile is further determined based on one or more of said plurality of predefined profiles.
  • a computer program for carrying out the methods described herein, as well as a non-transitory computer readable storage-medium storing the computer program are provided.
  • a computer program may, for example, be downloaded by or uploaded to an existing device or be stored upon manufacturing of these systems.
  • a non-transitory computer-readable storage medium stores at least one software code portion, the software code portion, when executed or processed by a computer, being configured to perform executable operations comprising: obtaining an input representing a context, said input comprising one or more parameter values, determining an extent to which said input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, determining a profile based on said determined extent, determining said one or more light effects based on said profile and said content item, and causing one or more light sources to render said one or more light effects simultaneously with said content item.
  • aspects of the present invention may be embodied as a device, a method or a computer program product.
  • aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit”, “module” or “system”.
  • Functions described in this disclosure may be implemented as an algorithm executed by a processor/microprocessor of a computer.
  • aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied, e.g., stored, thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • examples of a computer readable storage medium include, but are not limited to, the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store, a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk or C++, and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • These computer program instructions may be provided to a processor, in particular a microprocessor or a central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • Fig. 1 shows an example of an environment in which the invention may be used;
  • Fig. 2 is a block diagram of an embodiment of the system of the invention;
  • Fig. 3 is a flow diagram of a first embodiment of the method of the invention;
  • Fig. 4 is a flow diagram of a second embodiment of the method of the invention;
  • Fig. 5 shows a first part of an example of a user interface for defining an effect range in a predefined profile;
  • Fig. 6 shows a second part of the example of the user interface of Fig. 5;
  • Fig. 7 shows an example of input relations between potential parameter values and predefined profiles;
  • Fig. 8 shows an example of a basic distribution map for determining a profile per light;
  • Fig. 9 shows an example of input relations between actual parameter values and predefined profiles;
  • Fig. 10 shows an example of a current distribution map for determining a profile per light;
  • Fig. 11 is a flow diagram of a third embodiment of the method of the invention; and
  • Fig. 12 is a block diagram of an exemplary data processing system for performing the method of the invention.
  • Fig.1 shows a first example of an environment in which the invention may be used: a home 11 with a hall 13, a kitchen 14 and a living room 15.
  • the kitchen 14 comprises a light 29.
  • the living room 15 comprises two lights: a light 27 to the left of a television and a light 28 to the right of the television.
  • a person 19 is standing in the living room 15 holding mobile device 1.
  • a bridge 25, e.g. a Philips Hue bridge, is connected to a wireless LAN access point 17, e.g. via Ethernet.
  • the bridge 25 communicates with the lights 27-29 wirelessly, e.g. using Zigbee technology.
  • the lights 27-29 may be Philips Hue lights, for example.
  • a smart speaker 21 is present as well.
  • the smart speaker 21 and the mobile device 1 are wirelessly connected to the wireless LAN access point 17 as well, e.g. via Wi-Fi (IEEE 802.11).
  • the mobile device 1 is able to control lights 27-29 via the wireless LAN access point 17 and the bridge 25.
  • the smart speaker 21 is able to control the lights 27-29 by transmitting signals to the bridge 25.
  • the mobile device 1 and the smart speaker 21 are able to control lights 27-29 without the use of a bridge.
  • the mobile device 1 comprises a transceiver 3, a processor 5, memory 7 and a display 9, see Fig.2.
  • the processor 5 is configured to obtain an input representing a context.
  • the input comprises one or more parameter values.
  • the processor 5 is further configured to determine an extent to which the input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item.
  • the processor 5 is further configured to determine a profile based on the determined extent, determine the one or more light effects based on the profile and the content item, and cause one or more light sources to render the one or more light effects simultaneously with the content item.
  • the one or more parameter values may indicate a type of music currently being rendered, a type of activity currently being carried out by a user, the fullness of a user’s schedule, the urgency of one or more pending tasks, a user’s light effect preference, and/or a current setting for rendering the content item, for example.
  • the processor 5 may be configured to determine a current location of the user and determine the type of activity currently being carried out by the user based on the current location of the user (e.g. a user sitting on a sofa is more likely to be relaxing than a user sitting at a desk).
  • the current location may be determined by analyzing echoes, for example.
  • the mobile device 1 can play a high-frequency sound that cannot be perceived by humans; a microphone of the mobile device 1 can then be used to analyze the echo spectrum footprint of the sound, and the spectrum footprint can then be mapped to a location based on previous measurements.
  • the user first performs a calibration to obtain these previous measurements in his home and in the locations of the different places he might visit. This calibration can be either done with a specific calibration function or may be performed automatically while using other functions of the mobile device 1 or of the light control application running on the mobile device 1.
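A minimal sketch of this footprint matching, assuming the calibration measurements are stored as labelled spectrum vectors and a simple nearest-neighbour comparison suffices; the patent does not specify the matching algorithm or the footprint format, so both are assumptions here.

```python
import math

# Calibration measurements: room label -> list of spectrum footprints
# (e.g. magnitudes of a few frequency bins of the recorded echo).
CALIBRATION = {
    "living room": [[0.9, 0.4, 0.1, 0.3], [0.8, 0.5, 0.2, 0.3]],
    "kitchen":     [[0.2, 0.7, 0.6, 0.1]],
    "hall":        [[0.1, 0.2, 0.8, 0.7]],
}

def locate(footprint):
    """Return the room whose calibration footprint is closest (1-nearest-neighbour)."""
    best_room, best_dist = None, math.inf
    for room, samples in CALIBRATION.items():
        for sample in samples:
            dist = math.dist(footprint, sample)  # Euclidean distance (Python 3.8+)
            if dist < best_dist:
                best_room, best_dist = room, dist
    return best_room

print(locate([0.85, 0.45, 0.15, 0.3]))  # -> "living room"
```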
  • the current location may be determined using beacons, e.g. Bluetooth beacons.
  • the current location may be combined with other information from the mobile device 1 or from a connected wearable device, e.g. information whether the user is standing or exercising based on information from an accelerometer. This helps determine (estimate) the type of activity currently being carried out. As a first example, more dynamic light effects with a higher energizing effect may be rendered when the user is exercising. As a second example, if the user is sitting at a desk, he is more likely to be working and may need a more static light effect.
  • the fullness of a user’s schedule is often a good indication of how tired the user is and/or how much energy the user has.
  • the user’s schedule may be obtained by synchronizing with an online agenda, e.g. Google Calendar or Microsoft Exchange Server.
  • the system may assign different values to accepted and tentative (pending) appointments in the agenda and use the number of hours that have elapsed since an appointment to weigh these values and calculate the fullness value of the day. Relaxing colors (warm white) may be beneficial when a user has had a full day.
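A sketch of this fullness calculation under stated assumptions: the base values for accepted versus tentative appointments and the decay with elapsed hours are illustrative choices, since the text only names the ingredients of the heuristic.

```python
from datetime import datetime

def schedule_fullness(appointments, now):
    """appointments: list of (end_time, status), status 'accepted' or 'tentative'."""
    base = {"accepted": 1.0, "tentative": 0.5}  # assumed base values
    total = 0.0
    for end_time, status in appointments:
        hours_ago = max((now - end_time).total_seconds() / 3600.0, 0.0)
        # Appointments further in the past contribute less to current tiredness.
        weight = 1.0 / (1.0 + hours_ago)
        total += base[status] * weight
    return total

now = datetime(2019, 7, 8, 18, 0)
day = [(datetime(2019, 7, 8, 10, 0), "accepted"),
       (datetime(2019, 7, 8, 16, 0), "tentative")]
print(schedule_fullness(day, now))  # higher value -> fuller day -> relaxing colors
```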
  • the urgency of pending tasks is often a good indication of whether a user needs to be energized.
  • Information on pending tasks may be obtained by synchronizing with a project management tool, for example.
  • a value may be given to each pending task based on how urgent the task is (using the difference between the day and time of the task’s deadline and the current day and time) and the energy need may be calculated as the sum of these values.
  • Energized colors may be beneficial if the user has many urgent tasks and/or very urgent tasks.
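A sketch of this energy-need calculation; the inverse-hours urgency formula is an assumption, as the text only states that the deadline difference is used and the per-task values are summed.

```python
from datetime import datetime

def energy_need(deadlines, now):
    """Sum per-task urgency values; a nearer deadline yields a larger value."""
    total = 0.0
    for deadline in deadlines:
        hours_left = max((deadline - now).total_seconds() / 3600.0, 1.0)
        total += 1.0 / hours_left  # assumed urgency formula
    return total

now = datetime(2019, 7, 8, 9, 0)
tasks = [datetime(2019, 7, 8, 12, 0),   # due in 3 hours: very urgent
         datetime(2019, 7, 12, 9, 0)]   # due in 4 days: barely urgent
print(energy_need(tasks, now))  # large value -> energizing colors are beneficial
```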
  • Type of activity: the user may appreciate more interaction between light and music, i.e. more dynamicity, when having a party, while he may prefer slower, subtle light changes when relaxing.
  • User preference: a first user may really like the color blue, because he thinks that makes the room look elegant, while another user might dislike this color, because he likes to have a warm and cozy home instead.
  • Type of music: for example, for rock songs with fast beats, people might want more dynamic light effects with large color differences to create a lively scene, but for light music, people might want more static light effects to create a simple natural scene like a forest, the sea or the sky in order to relax.
  • although the user may be able to manually specify the extent to which the input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, it is beneficial to do this at least partly automatically, e.g. by learning from the user’s behavior and/or by allowing re-use of configurations and algorithms of other users and developers. As a result, the performance of the system increases over time.
  • the mobile device 1 comprises one processor 5.
  • the mobile device 1 comprises multiple processors.
  • the processor 5 of the mobile device 1 may be a general-purpose processor, e.g. from ARM or Qualcomm, or an application-specific processor.
  • the processor 5 of the mobile device 1 may run an iOS, Windows or Android operating system for example.
  • the transceiver 3 may use one or more wireless communication technologies to communicate with the wireless LAN access point 17, for example.
  • multiple transceivers are used instead of a single transceiver.
  • a receiver and a transmitter have been combined into a transceiver 3.
  • one or more separate receiver components and one or more separate transmitter components are used.
  • the memory 7 may comprise one or more memory units.
  • the memory 7 may comprise solid state memory, for example.
  • the memory 7 may be used to store apps and data, for example.
  • the display 9 may comprise an LCD or OLED display panel, for example.
  • the display 9 may be a touch screen, for example.
  • the processor 5 may use this touch screen to provide a user interface, for example.
  • the mobile device 1 may comprise other components typical for a mobile device such as a battery, a speaker, a microphone and a power connector.
  • the invention may be implemented using a computer program running on one or more processors.
  • the system of the invention is a mobile device.
  • the system of the invention is another type of device, e.g. a bridge, a lighting device, an Internet server or a smart speaker/controller, or comprises a plurality of electronic devices.
  • a first embodiment of the method of the invention is shown in Fig.3.
  • a step 101 comprises obtaining an input representing a context.
  • the input comprises one or more parameter values.
  • a step 103 comprises determining an extent to which the input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, i.e. determining a context effect.
  • the extent to which the input should be used to determine a profile may be specified by the user as percentages or weights of one or more parameter values. For example, the user may input 40% for the user’s light effect preference, 30% for the type of activity currently being carried out, and 30% for the current location of the user. A random item may be provided as well: for example, the user may select 40% for the type of music currently being rendered, 30% for the fullness of the user’s schedule, and the remaining 30% for the random item, which is determined by the processor randomly.
  • a user may specify a single weight for all types of activity that might be carried out, a single weight for all different locations the user might be in, a single weight for all types of music that might be rendered and/or a single weight for all degrees of fullness of a user’s agenda.
  • the determined one or more light effects may still depend on exactly which type of activity is currently being carried out, which location the user is currently in, which type of music is currently being rendered, and/or how full the user’s schedule currently is.
  • jazz music may be associated with a different light effect range than pop music.
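A minimal sketch of combining such weights into a context effect, assuming each parameter value suggests a numeric dynamicity level that is blended linearly; the weight names, the per-parameter suggestions and the linear blend are all illustrative assumptions, not the patent's prescribed method.

```python
import random

# User-specified weights, following the 40/30/30 example with a random item.
weights = {"music_type": 0.4, "schedule_fullness": 0.3, "random": 0.3}

# Hypothetical dynamicity suggestion per parameter (0 = static, 1 = very dynamic).
suggested_dynamicity = {"music_type": 0.9,         # e.g. rock with fast beats
                        "schedule_fullness": 0.2}  # full day -> calmer effects

def blended_dynamicity():
    """Weighted blend of the per-parameter suggestions plus the random item."""
    value = sum(weights[p] * suggested_dynamicity[p] for p in suggested_dynamicity)
    value += weights["random"] * random.random()  # the 'random item'
    return value

print(blended_dynamicity())  # used to pick a dynamicity range in the profile
```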
  • a step 105 comprises determining a profile based on the determined extent, i.e. the context effect.
  • a step 107 comprises determining the one or more light effects based on the profile and the content item.
  • Step 107 comprises analyzing the content item.
  • step 107 comprises receiving features extracted from the content item from another device.
  • a step 109 comprises causing one or more light sources to render the one or more light effects simultaneously with the content item by transmitting one or more light commands.
  • step 101 comprises a sub step 111 of determining a current location of the user and a sub step 113 of determining the type of activity currently being carried out by the user based on the current location of the user.
  • the type of activity currently being carried out by the user is not determined or determined in a different manner.
  • a further step 115 comprises allowing a person to specify a degree of importance of each of the one or more parameter values, i.e. to specify priorities, and step 103 comprises a sub step 117 of determining the extent based on the one or more degrees of importance.
  • these one or more degrees of importance are not specified by a person, but learned from user behavior, for example.
  • Step 120 comprises creating a plurality of predefined profiles.
  • the names are assigned in step 120 (e.g. “Effect Range 1” and “Effect Range 2”) and the content of the predefined profiles is defined in step 129.
  • Figs.5 and 6 show an example of a user interface for defining predefined profiles in step 129, and in particular for defining an effect range of a predefined profile for three lights that have been identified previously, e.g. lights 27-29 of Fig.1.
  • a first part of the example of the user interface for defining an effect range is shown in Fig.5.
  • Screen 51 allows the user (or an administrator) to define a color range per light by placing a shape in a color circle 53.
  • the user or the administrator
  • each light is associated with a single shape, but in an alternative embodiment, a light can be associated with multiple shapes.
  • circle 55 is associated with light 27 of Fig.1 and identifier “1” and covers green and yellow tones;
  • circle 57 is associated with light 28 of Fig.1 and identifier “2” and covers red and purple tones;
  • circle 59 is associated with light 29 of Fig.1 and identifier “3” and covers blue tones.
  • A second part of the example of the user interface for defining an effect range is shown in Fig.6.
  • Screen 61 allows the user (or the administrator) to define a dynamic range per light by placing a rectangle on a dynamic range bar 63.
  • the user or the administrator can specify the height of the rectangle and the location of the rectangle, e.g. by touching two positions on the bar 63 using a touch sensing portion of the display 9.
  • each light is associated with a single rectangle, but in an alternative embodiment, a light can be associated with multiple rectangles.
  • rectangle 65 is associated with light 27 of Fig.1 and identifier “1” and specifies the use of very dynamic light effects;
  • rectangle 67 is associated with light 28 of Fig.1 and identifier “2” and specifies the use of light effects that are not too dynamic and not too static;
  • rectangle 69 is associated with light 29 of Fig.1 and identifier “3” and specifies the use of light effects that are only slightly dynamic, but not completely static.
  • the effect range is defined per light.
  • an effect range may be defined for a group of lights, e.g. for all the lights in a room or building.
  • Step 115 of Fig.3 comprises a sub step 123 in Fig.4.
  • Step 123 comprises allowing a person to specify input relations between potential parameter values and two predefined profiles whose effect ranges were defined with the user interface of Figs.5 and 6.
  • An example of a user interface for specifying these input relations is shown in Fig.7.
  • the two predefined profiles each comprise the effect ranges for all three lights.
  • One such predefined profile has been defined in the example of Figs.5 and 6.
  • the first predefined profile is represented by effect range circle 73 in Fig.7.
  • the second predefined profile is represented by effect range circle 74 in Fig.7.
  • a parameter value circle 75 represents a parameter value indicating a full agenda
  • a parameter value circle 76 represents a parameter value indicating that jazz music is currently being played
  • a parameter value circle 77 represents a parameter value indicating an empty agenda
  • a parameter value circle 79 represents a parameter value indicating that pop music is currently being played.
  • a parameter value circle 78 represents the user’s preferences.
  • the user interface of Fig.7 is also used to define the extent to which the user’s explicit preferences are taken into account. For example, the user may specify a dislike for purple colors as user preference, but this preference is not always met: the extent to which the user’s preferences are taken into account is specified with the user interface of Fig.7. If the user wants to make sure that no purple colors are used, he is able to alter the color effect ranges using the user interface of Fig.5.
  • Fig.7 depicts circles in a first coordinate system.
  • the size and location of the circles do not say anything about the content of the parameter values or of the predefined profiles, but are only used to determine the above- mentioned overlap.
  • the circles may be stored in a memory as tuples in which each tuple represents a vector from the origin of the first coordinate system to the center of the circle and a radius or diameter of the circle.
  • other shapes are used instead of or in addition to circles.
  • the location and size of the shapes are determined fully or partly automatically.
  • Step 121 shown in Fig.4 is new compared to the embodiment of Fig.3.
  • Step 121 comprises allowing a person to specify light relations between the lights 27-29 of Fig.1 (identified by numbers 1-3) and the predefined profiles of Fig.7. These light relations can be represented by a basic distribution map, which is used later (in step 131).
  • An example of a user interface for specifying this basic distribution map is shown in Fig.8.
  • the first predefined profile of Fig.7 is represented by effect range rectangles 82 and 83.
  • the second predefined profile of Fig.7 is represented by effect range rectangle 84.
  • the lights 27-29 of Fig.1 are represented by circles 86-88, respectively.
  • Fig.8 depicts a second coordinate system.
  • the size and location of the rectangles do not say anything about the content of the predefined profiles and the size and locations of the circles do not say anything about the physical size and geographical location of the lights.
  • other shapes are used instead of or in addition to rectangles and circles.
  • the location and size of the shapes are determined fully or partly automatically.
  • based on which effect range rectangles cover the circle representing a light, the effect ranges to be used for that light can be determined.
  • the distribution map that is used to determine this coverage is not the basic distribution map defined in Fig.8, but a distribution map derived from this basic distribution map in step 131. This derived distribution map is derived from the basic distribution map using the input relations specified in Fig.7 and the input.
  • Step 125 shown in Fig.4 is also new compared to the embodiment of Fig.3.
  • Step 125 comprises a person selecting one or more of the parameter values represented in Fig.7 and showing a corresponding derived distribution map (derived using the selected parameter values as simulated input). If the user (or administrator) does not like the result, step 123 and/or step 121 may be repeated.
  • Step 103 of Fig.3 comprises a sub step 127 in the embodiment of Fig.4.
  • Step 127 comprises determining the context effect based on the defined priorities (i.e. the defined input relations) and the context (input).
  • the extent to which the input should be used to determine the profile is determined in this step 127.
  • the overlap between the effect range circles 73 and 74 and the parameter values present in the input is determined.
  • parameter value circle 76 represents a parameter value which indicates that jazz music is being played
  • parameter value circle 78 represents a parameter value identifying the user’s preferences.
  • the overlap area 92 between effect range circle 73 and parameter value circle 76 is determined as a percentage of the circle 73 and the overlap area 93 between effect range circle 74 and parameter value circle 78 is determined as a percentage of the circle 74.
  • the overlap area 92 is 30% of the circle 73 and the overlap area 93 is 5% of the circle 74.
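A sketch of this overlap computation using the standard two-circle intersection formula; the tuple representation (centre plus radius) follows the text, while the example coordinates are assumptions chosen so the overlap comes out at the 30% of the Fig.9 example.

```python
import math

def overlap_fraction(effect, param):
    """Intersection area of two circles as a fraction of the effect range circle."""
    (x1, y1, r1), (x2, y2, r2) = effect, param
    d = math.hypot(x2 - x1, y2 - y1)
    if d >= r1 + r2:                       # disjoint circles
        inter = 0.0
    elif d <= abs(r1 - r2):                # one circle inside the other
        inter = math.pi * min(r1, r2) ** 2
    else:                                  # partial overlap (lens area)
        a1 = r1**2 * math.acos((d**2 + r1**2 - r2**2) / (2 * d * r1))
        a2 = r2**2 * math.acos((d**2 + r2**2 - r1**2) / (2 * d * r2))
        corr = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                               * (d - r1 + r2) * (d + r1 + r2))
        inter = a1 + a2 - corr
    return inter / (math.pi * r1**2)       # percentage of the effect range circle

effect_range_73 = (0.0, 0.0, 1.0)   # illustrative coordinates
jazz_circle_76 = (1.17, 0.0, 1.0)
print(f"{overlap_fraction(effect_range_73, jazz_circle_76):.0%}")  # -> 30%
```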
  • the basic distribution map is not used in step 127, but used in the next step: step 131.
  • one or more profiles are determined for the light sources to be controlled. In the embodiment of Fig.4, these profiles are derived from the predefined profiles.
  • a profile used in step 105 may be a selection of one of these predefined profiles or may be a combination of one or more of these predefined profiles.
  • step 105 of Fig.3 comprises sub steps 131, 133, 135 and 137.
  • in step 131 a current distribution map is derived from the basic distribution map defined in step 121 using the sizes of the overlap areas determined in step 127, which represent the context effect. This is illustrated with the help of Fig.10. A rectangle representing a predefined profile is enlarged or shrunk depending on the amount of overlap that step 127 determined for that predefined profile. This is done for each rectangle, as sketched below.
  • for example, a rectangle is enlarged if the amount of overlap is more than 15% and shrunk if the overlap is less than 15%.
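A sketch of this derivation, assuming each rectangle is scaled linearly around its centre; only the 15% threshold comes from the text, so the linear rule and the gain factor are assumptions.

```python
def scale_rectangle(rect, overlap, threshold=0.15, gain=2.0):
    """rect: (cx, cy, width, height); overlap: fraction determined in step 127."""
    cx, cy, w, h = rect
    factor = max(1.0 + gain * (overlap - threshold), 0.0)  # >1 enlarges, <1 shrinks
    return (cx, cy, w * factor, h * factor)

# Basic distribution map: predefined profile -> rectangle (illustrative values).
basic_map = {"profile_1": (0.3, 0.5, 0.4, 0.6), "profile_2": (0.7, 0.5, 0.3, 0.4)}
overlaps = {"profile_1": 0.30, "profile_2": 0.05}  # from the Fig.9 example

current_map = {p: scale_rectangle(r, overlaps[p]) for p, r in basic_map.items()}
print(current_map)  # profile_1 grows (30% > 15%), profile_2 shrinks (5% < 15%)
```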
  • Step 133 comprises determining a profile for light 27 of Fig.1 based on the current (derived) distribution map and the predefined profiles defined in step 129.
  • Light 27 is represented as circle 86 in Fig.10. Since circle 86 is only covered by effect range rectangle 96 corresponding to the first predefined profile, the profile for light 27 is only determined based on this first predefined profile and is equal to this first predefined profile.
  • Step 135 comprises determining a profile for light 28 of Fig.1 based on the current (derived) distribution map and the predefined profiles defined in step 129.
  • Light 28 is represented as circle 87 in Fig.10. Since circle 87 is covered by both effect range rectangle 96 corresponding to the first predefined profile and the effect range rectangle 97 corresponding to the second predefined profile, the profile for light 28 is determined based on a combination of the first predefined profile and the second predefined profile. There are several manners in which this combination may be made.
  • the determined profile may comprise the union of the color ranges represented by circles 55 and 57 of Fig.5 and the union of the dynamic ranges represented by rectangles 65 and 67 of Fig.6.
  • Step 137 comprises determining a profile for light 29 of Fig.1 based on the current (derived) distribution map and the predefined profiles defined in step 129.
  • Light 29 is represented as circle 88 in Fig.10. Since circle 88 is not covered by any of the effect rectangles, the profile for light 29 indicates that it should not be used for determining light effects in step 107.
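A sketch of steps 133-137 under simplifying assumptions: a light is treated as covered when the centre of its circle lies inside a rectangle, and a color range is modelled as a set of named tones; the text itself leaves the coverage test and the combination method open (the union shown here is one of the described options).

```python
def covering_profiles(light_center, rect_map):
    """Return the profiles whose rectangle contains the light's centre point."""
    lx, ly = light_center
    return [p for p, (cx, cy, w, h) in rect_map.items()
            if abs(lx - cx) <= w / 2 and abs(ly - cy) <= h / 2]

def light_profile(light_center, rect_map, predefined):
    hits = covering_profiles(light_center, rect_map)
    if not hits:
        return None  # light is not used for determining light effects in step 107
    # Combine covering profiles by taking the union of their color ranges.
    return {"colors": set().union(*(predefined[p]["colors"] for p in hits))}

predefined = {"profile_1": {"colors": {"green", "yellow"}},
              "profile_2": {"colors": {"red", "purple"}}}
current_map = {"profile_1": (0.3, 0.5, 0.52, 0.78),   # scaled rectangles
               "profile_2": (0.7, 0.5, 0.50, 0.40)}

print(light_profile((0.3, 0.5), current_map, predefined))   # only profile_1
print(light_profile((0.55, 0.5), current_map, predefined))  # union of both profiles
print(light_profile((0.2, 0.95), current_map, predefined))  # None: light excluded
```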
  • Step 107 comprises analyzing the content item.
  • step 107 comprises receiving features extracted from the content item from another device.
  • Step 107 further comprises determining light effects for light 27 based on the profile determined for light 27 in step 133 and this analysis, determining light effects for light 28 based on the profile determined for light 28 in step 135 and this analysis, and determining light effects for light 29 based on the profile determined for light 29 in step 137 and this analysis.
  • the chromaticity setting and/or the brightness setting of a light may change each beat or each section of a song or may follow the rhythm of the song.
  • the light effects are not only determined based on a song or an audio portion of a music video, but also on a video portion of a music video. For example, the light effects could be made to mimic the light effects people see in the music video to help them get a more immersive experience.
  • step 109 comprises causing light 27 of Fig.1 to render the light effects determined for light 27 in step 107, causing light 28 of Fig.1 to render the light effects determined for light 28 in step 107, and causing light 29 of Fig.1 to render the light effects determined for light 29 in step 107 by transmitting light commands.
  • the light commands are timed such that the light effects are rendered simultaneously with the content item, e.g. using current song position information obtained from a music player or music service.
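A sketch of such beat-synchronized rendering, assuming the player reports the current song position and a fixed tempo; send_command is a hypothetical stand-in for transmitting a light command to the bridge, not a real lighting API.

```python
import random
import time

def send_command(light_id, color):
    """Hypothetical stand-in for transmitting a light command to the bridge."""
    print(f"light {light_id} -> {color}")

def render_to_beat(light_id, colors, bpm, song_position, duration):
    """Change the light's color on every beat, aligned to the player's position."""
    beat = 60.0 / bpm
    time.sleep(beat - (song_position % beat))  # wait for the next beat boundary
    for _ in range(int(duration / beat)):
        send_command(light_id, random.choice(colors))  # stay within the profile
        time.sleep(beat)

render_to_beat(27, ["green", "yellow"], bpm=120, song_position=12.3, duration=2.0)
```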
  • A third embodiment of the method of the invention is shown in Fig.11.
  • This third embodiment is somewhat simpler than the second embodiment of Fig.4.
  • no distribution map is used and the profile determined for each light is based on, e.g. equal to, exactly one predefined profile. For this reason, steps 121, 125 and 131 of Fig.4 are not present in the third embodiment of Fig.11.
  • the predefined profile on which the profiles for the individual lights are based is still selected based on the priorities (input relations) defined in step 123.
  • the overlap areas 92 and 93 of Fig.9 are determined as described in relation to the second embodiment, but they are used in a different way. A choice is made between the two predefined profiles based on the determined overlap. The predefined profile with the largest overlap area is chosen. In the example of Fig.9, the overlap area 92 is 30% and the overlap area 93 is 5% and therefore the first predefined profile corresponding to the effect range circle 73 is chosen.
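A sketch of this choice and of steps 163-167: the profile with the largest overlap area is selected, and each light then simply receives the effect ranges that profile defines for it. The per-light effect ranges shown are illustrative assumptions.

```python
overlaps = {"profile_1": 0.30, "profile_2": 0.05}  # from the Fig.9 example
chosen = max(overlaps, key=overlaps.get)           # -> "profile_1"

# Predefined profiles may specify different settings per light (values made up).
predefined = {
    "profile_1": {27: {"colors": "green/yellow", "dynamicity": (0.7, 1.0)},
                  28: {"colors": "red/purple",   "dynamicity": (0.4, 0.6)},
                  29: {"colors": "blue",         "dynamicity": (0.1, 0.3)}},
    "profile_2": {27: {"colors": "warm white",   "dynamicity": (0.0, 0.2)},
                  28: {"colors": "warm white",   "dynamicity": (0.0, 0.2)},
                  29: {"colors": "warm white",   "dynamicity": (0.0, 0.2)}},
}

for light in (27, 28, 29):  # steps 163, 165 and 167
    print(light, predefined[chosen][light])
```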
  • Steps 163, 165 and 167 comprise determining profiles for the three lights. Each profile is determined based on the same predefined profile, but a predefined profile may specify different settings for different lights.
  • Step 163 comprises determining a profile for light 27 of Fig.1 based on the effect ranges defined for light 27 in the first predefined profile.
  • Step 165 comprises determining a profile for light 28 of Fig.1 based on the effect ranges defined for light 28 in the first predefined profile.
  • Step 167 comprises determining a profile for light 29 of Fig.1 based on the effect ranges defined for light 29 in the first predefined profile.
  • steps 107 and 109 are performed in the manner described in relation to the second embodiment.
  • a user’s color preferences can be taken into account in at least two ways: the user adapts the color ranges in the predefined profiles or the user specifies his color preferences as context. If the user specifies his color preferences as context, they may not (always) be met. The user may use voice commands to indicate his preferences, for instance. As a first example, the user may say “a bit more reddish”. This user preference (like) is taken into account to a certain extent, i.e. there is a higher chance of more reddish (warm) colors being used compared to the default setting.
  • as a second example, the user may say “I do not like purple”. This user preference (dislike) is taken into account to a certain extent, i.e. there is a lower chance that the color purple or a color close to the color purple is used.
  • the input relations shown in Fig.7 may, for example, be automatically adjusted to decrease the chance that a predefined profile is used that has the color purple in its color range.
  • determining the extent to which the input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item may simply comprise adjusting the probabilities that certain colors or certain levels of dynamicity and/or brightness are used.
  • color ranges and input relations do not need to be defined.
  • a system may use an initial set of probabilities indicating a probability for each chromaticity setting, dynamicity setting and brightness setting that is used and randomly choose settings using these probabilities. If a user dislikes the color purple, the system may decrease the probability of using purple by 90% and the probability of using a color close to purple (e.g. in CIE color space, a distance of 0.01 or less) by 80%.
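A sketch of this probability adjustment; the 90% and 80% reductions and the 0.01 CIE-distance threshold follow the example above, while the candidate settings, their xy coordinates and the renormalization step are assumptions for illustration.

```python
import math
import random

# Candidate chromaticity settings as CIE xy coordinates (illustrative values).
settings = {"purple": (0.25, 0.10), "near_purple": (0.255, 0.105),
            "blue": (0.15, 0.06), "red": (0.64, 0.33), "green": (0.30, 0.60)}
probs = {name: 1.0 / len(settings) for name in settings}  # initial probabilities

def dislike(color, probs, settings):
    """Damp the disliked color by 90% and colors within 0.01 of it by 80%."""
    cx, cy = settings[color]
    for name, (x, y) in settings.items():
        if name == color:
            probs[name] *= 0.10                      # reduce by 90%
        elif math.hypot(x - cx, y - cy) <= 0.01:
            probs[name] *= 0.20                      # reduce by 80%
    total = sum(probs.values())
    return {n: p / total for n, p in probs.items()}  # renormalize

probs = dislike("purple", probs, settings)
choice = random.choices(list(probs), weights=probs.values())[0]
print(probs, choice)  # purple and near-purple are now rarely chosen
```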
  • User preferences may also be determined in a different manner.
  • a wearable device may measure a biological response to rendered light effects and determine preferences based on this response, for example. For instance, if a light is flashing too much (too dynamic) and this results in a change in blood pressure or even in a change in heart rate, this could be an indicator that someone strongly dislikes this light effect.
  • Fig.12 depicts a block diagram illustrating an exemplary data processing system that may perform the method as described with reference to Figs. 3, 4 and 11.
  • the data processing system 300 may include at least one processor 302 coupled to memory elements 304 through a system bus 306. As such, the data processing system may store program code within memory elements 304. Further, the processor 302 may execute the program code accessed from the memory elements 304 via a system bus 306. In one aspect, the data processing system may be implemented as a computer that is suitable for storing and/or executing program code. It should be appreciated, however, that the data processing system 300 may be implemented in the form of any system including a processor and a memory that is capable of performing the functions described within this specification.
  • the memory elements 304 may include one or more physical memory devices such as, for example, local memory 308 and one or more bulk storage devices 310.
  • the local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code.
  • a bulk storage device may be implemented as a hard drive or other persistent data storage device.
  • the processing system 300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the quantity of times program code must be retrieved from the bulk storage device 310 during execution.
  • the processing system 300 may also be able to use memory elements of another processing system, e.g. if the processing system 300 is part of a cloud-computing platform.
  • I/O devices depicted as an input device 312 and an output device 314 optionally can be coupled to the data processing system.
  • input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, a microphone (e.g. for voice and/or speech recognition), or the like.
  • output devices may include, but are not limited to, a monitor or a display, speakers, or the like. Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.
  • the input and the output devices may be implemented as a combined input/output device (illustrated in Fig.12 with a dashed line surrounding the input device 312 and the output device 314).
  • a combined device is a touch sensitive display, also sometimes referred to as a“touch screen display” or simply“touch screen”.
  • input to the device may be provided by a movement of a physical object, such as e.g. a stylus or a finger of a user, on or near the touch screen display.
  • a network adapter 316 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks.
  • the network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 300, and a data transmitter for transmitting data from the data processing system 300 to said systems, devices and/or networks.
  • Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 300.
  • the memory elements 304 may store an application 318.
  • the application 318 may be stored in the local memory 308, the one or more bulk storage devices 310, or separate from the local memory and the bulk storage devices.
  • the data processing system 300 may further execute an operating system (not shown in Fig.12) that can facilitate execution of the application 318.
  • the application 318 being implemented in the form of executable program code, can be executed by the data processing system 300, e.g., by the processor 302. Responsive to executing the application, the data processing system 300 may be configured to perform one or more operations or method steps described herein.
  • Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein).
  • the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression “non-transitory computer readable storage media” comprises all computer-readable media, with the sole exception being a transitory, propagating signal.
  • the program(s) can be contained on a variety of transitory computer-readable storage media.
  • Illustrative computer-readable storage media include, but are not limited to: (i) non- writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored.
  • the computer program may be run on the processor 302 described herein.

Landscapes

  • Circuit Arrangement For Electric Light Sources In General (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method comprises obtaining (101) an input representing a context. The input comprises one or more parameter values. The method further comprises determining (103) an extent to which the input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, i.e. determining the effect of the context. The method further comprises determining (105) a profile based on the determined extent, determining (107) the one or more light effects based on the profile and the content item, and causing (109) one or more light sources to render the one or more light effects simultaneously with the content item.

Description

Determining light effects to be rendered simultaneously with a content item
FIELD OF THE INVENTION
The invention relates to a system for determining light effects.
The invention further relates to a method of determining light effects.
The invention also relates to a computer program product enabling a computer system to perform such a method.
BACKGROUND OF THE INVENTION
Philips Hue is a consumer connected lighting solution. The system consists of a central controller, named a bridge, wirelessly controllable light endpoints and user interfaces in various forms (switches, sensors and mobile apps). The bridge is connected to the router of the user and communicates with the light points. It is capable of running schedules and home automation rules. In this way it acts as the intelligence of the system. All user interfaces connect to the bridge in order to actuate the lights.
Controlling connected lights via a touchscreen of a mobile device has become very popular. With the introduction of smart speakers such as the Amazon Echo and the Google Home, and of the possibility of using the digital assistant of these smart speakers to control connected lights (via speech recognition), voice control of connected lights is also gaining popularity. Smart speakers can not only be used to control connected lights; playing back music is one of the most popular applications of smart speakers.
The combination of light control and music playback is known. For example, US 2017/0265270 discloses controlling an operating parameter, e.g. color, of a light source as a function of operating data included in a retrieved operating data file. The operating data comprises at least one lighting sequence and a time code data set coupled with the sequence(s). The operating data file may be associated with music. US 2017/0265270 further discloses taking into account possible preferences or needs of the end user.
It is a drawback of the method disclosed in US 2017/0265270 that the provided content experience only takes into account preferences and/or needs of the end user in a limited manner. The rendered light effects need to stay close to what has been specified in the operating data file and only certain, static needs and preferences, such as mesopic/scotopic sensitivity of the human eyes, are taken into account.
SUMMARY OF THE INVENTION
It is a first object of the invention to provide a system for determining light effects, which can be used to customize a content experience with lighting in a flexible manner.
It is a second object of the invention to provide a method of determining light effects, which can be used to customize a content experience with lighting in a flexible manner.
In a first aspect of the invention, the system comprises at least one processor and is configured to obtain an input representing a context, said input comprising one or more parameter values, determine an extent to which said input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, determine a profile based on said determined extent, determine said one or more light effects based on said profile and said content item, and cause one or more light sources to render said one or more light effects simultaneously with said content item.
The system may comprise one or more electronic devices. For example, a processor of a mobile device may be configured to obtain an input representing a context, said input comprising one or more parameter values, determine an extent to which said input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, and determine a profile based on said determined extent, and a processor of a light device may be configured to determine said one or more light effects based on said profile and said content item and to cause one or more light sources to render said one or more light effects simultaneously with said content item. When said content item is music, it may determine changes of color and/or brightness of said one or more light effects, while said profile may determine the range of color and/or brightness and the range of the level of dynamicity.
The inventors have recognized that which light effects a user wants to be rendered simultaneously with a content item can be determined from context and that this context is usually not just one variable, but a combination of multiple variables. The inventors have further recognized that the light effects should not or cannot be determined from the entire context. By determining and taking into account an extent to which the different input parameter values representing this context should be used to determine these light effects, the provided content experience can be customized in a flexible manner.
Said one or more parameter values may indicate a type of music currently being rendered, a type of activity currently being carried out by a user, the fullness of a user’s schedule, the urgency of one or more pending tasks, a user’s light effect preference, and/or a current setting for rendering said content item, for example. Said profile may identify a dynamicity range to be used for said one or more light effects and/or a color range to be used for said one or more light effects, for example.
Said at least one processor may be configured to determine a current location of said user and determine said type of activity currently being carried out by said user based on said current location of said user. The type of activity of the user, e.g. relaxing, working, cooking, regularly impacts which light effects are appreciated by the user.
Said at least one processor may be configured to allow a person (e.g. a user or administrator) to specify a degree of importance of each of said one or more parameter values and determine said extent based on said one or more degrees of importance. This gives control over and insight into the definition of a user’s preferences and allows the user’s preferences to be tuned to get the best content experience.
Said at least one processor may be configured to allow a person to specify an input relation between said one or more parameter values and each of a plurality of predefined profiles, determine said extent based on said plurality of input relations, and determine said profile further based on one or more of said plurality of predefined profiles. This provides an intuitive user interface for indicating priorities of different preferences.
Said at least one processor may be configured to determine a further profile based on said determined extent, determine one or more further light effects based on said further profile, and cause one or more further light sources to render said further light effects simultaneously with said content item. This allows multiple (groups of) light sources to be controlled differently (i.e. render different light effects) based on the same determined extent.
Said at least one processor may be configured to allow a person to specify a light relation between said one or more light sources and each of a plurality of predefined profiles, allow a person to specify a further light relation between said one or more further light sources and each of said plurality of predefined profiles, determine said profile further based on said light relation and one or more of said plurality of predefined profiles, and determine said further profile further based on said further light relation and one or more of said plurality of predefined profiles. When the predefined profiles do not comprise effect ranges per light source, but only effect ranges for all light sources (e.g. a color range for all light sources), then this intuitive user interface allows a person to specify a map indicating how a general profile for all light sources should result in a profile per light source, i.e. be distributed over the individual light sources.
In a second aspect of the invention, the method comprises obtaining an input representing a context, said input comprising one or more parameter values, determining an extent to which said input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, determining a profile based on said determined extent, determining said one or more light effects based on said profile and said content item, and causing one or more light sources to render said one or more light effects simultaneously with said content item. Said method may be performed by software running on a programmable device. This software may be provided as a computer program product.
Said one or more parameter values may indicate a type of music currently being rendered, a type of activity currently being carried out by a user, the fullness of a user’s schedule, the urgency of one or more pending tasks, a user’s light effect preference, and/or a current setting for rendering said content item. Said profile may identify a dynamicity range to be used for said one or more light effects and/or a color range to be used for said one or more light effects. Said method may further comprise determining a current location of said user and determining said type of activity currently being carried out by said user based on said current location of said user.
Said method may further comprise allowing a person to specify a degree of importance of each of said one or more parameter values, wherein said extent is determined based on said one or more degrees of importance. Said method may further comprise allowing a person to specify an input relation between said one or more parameter values and each of a plurality of predefined profiles, wherein said extent is determined based on said plurality of input relations and said profile is further determined based on one or more of said plurality of predefined profiles.
Moreover, a computer program for carrying out the methods described herein, as well as a non-transitory computer readable storage-medium storing the computer program are provided. A computer program may, for example, be downloaded by or uploaded to an existing device or be stored upon manufacturing of these systems.
A non-transitory computer-readable storage medium stores at least one software code portion, the software code portion, when executed or processed by a computer, being configured to perform executable operations comprising: obtaining an input representing a context, said input comprising one or more parameter values, determining an extent to which said input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, determining a profile based on said determined extent, determining said one or more light effects based on said profile and said content item, and causing one or more light sources to render said one or more light effects simultaneously with said content item.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a device, a method or a computer program product.
Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro- code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit", "module" or "system." Functions described in this disclosure may be implemented as an algorithm executed by a processor/microprocessor of a computer. Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied, e.g., stored, thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a computer readable storage medium may include, but are not limited to, the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of the present invention, a computer readable storage medium may be any tangible medium that can contain, or store, a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java(TM), Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the present invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor, in particular a microprocessor or a central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of devices, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention are apparent from and will be further elucidated, by way of example, with reference to the drawings, in which:
Fig. 1 shows an example of an environment in which the invention may be used;
Fig. 2 is a block diagram of an embodiment of the system of the invention;
Fig. 3 is a flow diagram of a first embodiment of the method of the invention;
Fig. 4 is a flow diagram of a second embodiment of the method of the invention;
Fig. 5 shows a first part of an example of a user interface for defining an effect range in a predefined profile;
Fig. 6 shows a second part of the example of the user interface of Fig. 5;
Fig. 7 shows an example of input relations between potential parameter values and predefined profiles;
Fig. 8 shows an example of a basic distribution map for determining a profile per light;
Fig. 9 shows an example of input relations between actual parameter values and predefined profiles;
Fig. 10 shows an example of a current distribution map for determining a profile per light;
Fig. 11 is a flow diagram of a third embodiment of the method of the invention; and
Fig. 12 is a block diagram of an exemplary data processing system for performing the method of the invention.
Corresponding elements in the drawings are denoted by the same reference numeral.
DETAILED DESCRIPTION OF THE EMBODIMENTS
Fig. 1 shows a first example of an environment in which the invention may be used: a home 11 with a hall 13, a kitchen 14 and a living room 15. The kitchen 14 comprises a light 29. The living room 15 comprises two lights: a light 27 to the left of a television and a light 28 to the right of the television. A person 19 is standing in the living room 15 holding mobile device 1. A bridge 25, e.g. a Philips Hue bridge, is connected to a wireless LAN access point 17, e.g. via Ethernet. The bridge 25 communicates with the lights 27-29 wirelessly, e.g. using Zigbee technology. The lights 27-29 may be Philips Hue lights, for example.
A smart speaker 21 is present as well. The smart speaker 21 and the mobile device 1 are also wirelessly connected to the wireless LAN access point 17, e.g. via Wi-Fi (IEEE 802.11). The mobile device 1 is able to control lights 27-29 via the wireless LAN access point 17 and the bridge 25. The smart speaker 21 is able to control the lights 27-29 by transmitting signals to the bridge 25. In an alternative embodiment, the mobile device 1 and the smart speaker 21 are able to control lights 27-29 without the use of a bridge.
The mobile device 1 comprises a transceiver 3, a processor 5, memory 7 and a display 9, see Fig.2. The processor 5 is configured to obtain an input representing a context. The input comprises one or more parameter values. The processor 5 is further configured to determine an extent to which the input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item. The processor 5 is further configured to determine a profile based on the determined extent, determine the one or more light effects based on the profile and the content item, and cause one or more light sources to render the one or more light effects simultaneously with the content item.
The one or more parameter values may indicate a type of music currently being rendered, a type of activity currently being carried out by a user, the fullness of a user’s schedule, the urgency of one or more pending tasks, a user’s light effect preference, and/or a current setting for rendering the content item, for example. The processor 5 may be configured to determine a current location of the user and determine the type of activity currently being carried out by the user based on the current location of the user (e.g. a user sitting on a sofa is more likely to be relaxing than a user sitting at a desk).
The current location may be determined by analyzing echoes, for example. For instance, the mobile device 1 can play a high frequency sound that cannot be perceived by humans, a microphone of the mobile device 1 can then be used to analyze the echo spectrum footprint of the sound, and the spectrum footprint can then be mapped to a location based on previous measurements. The user first performs a calibration to obtain these previous measurements in his home and in the locations of the different places he might visit. This calibration can be either done with a specific calibration function or may be performed automatically while using other functions of the mobile device 1 or of the light control application running on the mobile device 1. Alternatively, the current location may be determined using beacons, e.g. Bluetooth beacons.
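By way of illustration only, the following Python sketch shows how such a mapping from a measured echo spectrum footprint to a calibrated location might be implemented as a nearest-neighbour comparison; the footprint vectors, location names and distance metric are assumptions made for the example and not part of the disclosed method.

```python
# Minimal sketch: map a measured echo spectrum footprint to a calibrated
# location with a nearest-neighbour comparison over spectrum bins.
def nearest_location(footprint, calibration):
    """Return the calibrated location whose stored footprint is closest
    to the measured one (Euclidean distance over spectrum bins)."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(calibration, key=lambda loc: distance(footprint, calibration[loc]))

# Hypothetical calibration measurements, one averaged footprint per place.
calibration = {
    "sofa": [0.82, 0.41, 0.15, 0.07],
    "desk": [0.33, 0.58, 0.44, 0.21],
    "kitchen": [0.12, 0.25, 0.61, 0.55],
}
measured = [0.80, 0.45, 0.18, 0.05]  # footprint of the just-played chirp
print(nearest_location(measured, calibration))  # -> "sofa"
```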
The current location may be combined with other information from the mobile device 1 or from a connected wearable device, e.g. information whether the user is standing or exercising based on information from an accelerometer. This helps determine (estimate) the type of activity currently being carried out. As a first example, more dynamic light effects with a higher energizing effect may be rendered when the user is exercising. As a second example, if the user is sitting at a desk, he is more likely to be working and may need a more static light effect.
The fullness of a user’s schedule is often a good indication of how tired the user is and/or how much energy the user has. The user’s schedule may be obtained by synchronizing with an online agenda, e.g. Google Calendar or Microsoft Exchange Server. The system may assign different values to accepted and tentative (pending) appointments in the agenda and use the number of hours that have elapsed since an appointment to weigh these values and calculate the fullness value of the day. Relaxing colors (warm white) may be beneficial when a user has had a full day.
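A minimal sketch of such a fullness calculation is given below; the appointment values and the decay of the hour-based weighting are assumptions chosen for illustration.

```python
from datetime import datetime

# Accepted appointments count more heavily than tentative ones (assumed values).
APPOINTMENT_VALUE = {"accepted": 1.0, "tentative": 0.5}

def schedule_fullness(appointments, now):
    """Sum appointment values, weighted down by the hours elapsed since
    each appointment ended."""
    fullness = 0.0
    for end_time, status in appointments:
        hours_ago = max((now - end_time).total_seconds() / 3600.0, 0.0)
        weight = 1.0 / (1.0 + hours_ago)  # older appointments weigh less
        fullness += APPOINTMENT_VALUE[status] * weight
    return fullness

now = datetime(2019, 7, 8, 18, 0)
appointments = [
    (datetime(2019, 7, 8, 10, 0), "accepted"),    # ended 8 hours ago
    (datetime(2019, 7, 8, 16, 30), "tentative"),  # ended 1.5 hours ago
]
print(schedule_fullness(appointments, now))
```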
The urgency of pending tasks is often a good indication of whether a user needs to be energized. Information on pending tasks may be obtained by synchronizing with a project management tool, for example. A value may be given to each pending task based on how urgent the task is (using the difference between the day and time of the task’s deadline and the current day and time) and the energy need may be calculated as the sum of these values. Energized colors (cool white) may be beneficial if the user has many urgent tasks and/or very urgent tasks.
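A corresponding sketch of the energy-need calculation could look as follows; the inverse-hours scoring is an assumption, the principle being only that more urgent tasks contribute larger values.

```python
from datetime import datetime

def energy_need(deadlines, now):
    """Sum per-task values that grow as a deadline approaches."""
    need = 0.0
    for deadline in deadlines:
        hours_left = max((deadline - now).total_seconds() / 3600.0, 1.0)
        need += 1.0 / hours_left  # more urgent tasks contribute more
    return need

now = datetime(2019, 7, 8, 18, 0)
deadlines = [datetime(2019, 7, 8, 20, 0), datetime(2019, 7, 12, 9, 0)]
print(energy_need(deadlines, now))  # high values suggest energizing cool white
```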
The following are examples of scenarios in which it is beneficial to determine light effects for a music entertainment experience from context:
Type of activity: the user may appreciate more interaction between light and music, i.e. more dynamicity, when having a party, while he may prefer slower, subtle light changes when relaxing.
Personal preference: for example, a first user may really like the color blue, because he thinks that makes the room look elegant while another user might dislike this color, because he likes to have a warm and cozy home instead.
Type of music: for example, for rock songs with fast beats, people might want more dynamic light effects with large color differences to create a lively scene, but for light music, people might want more static light effects to create a simple natural scene like forest, sea or sky in order to relax.
Not all lights need to render the same light effects. It may be beneficial to use different light effects for lights that are often in the user’s line of sight than for other lights. For example, a ceiling light is often perceived as functional lighting and people do not like it when it changes color too much. On the other hand, cove lighting is often deemed to create ambience at home and people like it if it has different colors and creates different scenes.
Although the user (or an administrator) may be able to specify manually the extent to which the input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, it is beneficial to do this at least partly automatically by learning from the user’s behavior and/or by allowing re-use of e.g. configurations and algorithms of other users and developers. As a result, the performance of the system increases over time.
In the embodiment shown in Fig.2, the mobile device 1 comprises one processor 5. In an alternative embodiment, the mobile device 1 comprises multiple processors. The processor 5 of the mobile device 1 may be a general-purpose processor, e.g. from ARM or Qualcomm, or an application-specific processor. The processor 5 of the mobile device 1 may run an iOS, Windows or Android operating system for example.
The transceiver 3 may use one or more wireless communication technologies to communicate with the wireless LAN access point 17, for example. In an alternative embodiment, multiple transceivers are used instead of a single transceiver. In the embodiment shown in Fig.2, a receiver and a transmitter have been combined into a transceiver 3. In an alternative embodiment, one or more separate receiver components and one or more separate transmitter components are used.
The memory 7 may comprise one or more memory units. The memory 7 may comprise solid state memory, for example. The memory 7 may be used to store apps and data, for example. The display 9 may comprise an LCD or OLED display panel, for example. The display 9 may be a touch screen, for example. The processor 5 may use this touch screen to provide a user interface, for example. The mobile device 1 may comprise other components typical for a mobile device such as a battery, a speaker, a microphone and a power connector. The invention may be implemented using a computer program running on one or more processors.
In the embodiment of Figs. 1 and 2, the system of the invention is a mobile device. In an alternative embodiment, the system of the invention is another type of device, e.g. a bridge, a lighting device, an Internet server or a smart speaker/controller, or comprises a plurality of electronic devices.
A first embodiment of the method of the invention is shown in Fig.3. A step 101 comprises obtaining an input representing a context. The input comprises one or more parameter values. A step 103 comprises determining an extent to which the input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item, i.e. determining a context effect.
The extent to which the input should be used to determine a profile may be specified as a percentage or weight for each of the one or more parameter values, input or selected by the user. For example, the user may input 40% for the user’s light effect preference, 30% for the type of activity currently being carried out, and 30% for the current location of the user. A random item may be provided as well. For example, the user may select 40% for the type of music currently being rendered, 30% for the fullness of a user’s schedule, and the remaining 30% for the random item, which is then determined by the processor randomly. A user may specify a single weight for all types of activity that might be carried out, a single weight for all different locations the user might be in, a single weight for all types of music that might be rendered and/or a single weight for all degrees of fullness of a user’s agenda. However, in this case, the determined one or more light effects may still depend on exactly which type of activity is currently being carried out, which location the user is currently in, which type of music is currently being rendered, and/or how full the user’s schedule currently is. For example, jazz music may be associated with a different light effect range than pop music.
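The following sketch illustrates one way such percentages or weights might be combined into per-parameter extents, including a random item; the function name and the redistribution of the random share are assumptions for the example.

```python
import random

def determine_extent(weights, random_share=0.0):
    """Return per-parameter extents summing to 1; any 'random' share is
    redistributed over the parameters in randomly drawn proportions."""
    extents = dict(weights)
    if random_share > 0.0:
        draws = {name: random.random() for name in extents}
        total = sum(draws.values())
        for name in extents:
            extents[name] += random_share * draws[name] / total
    return extents

# "40% for the type of music, 30% for schedule fullness, 30% random":
print(determine_extent({"music_type": 0.40, "schedule_fullness": 0.30},
                       random_share=0.30))
```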
A step 105 comprises determining a profile based on the determined extent, i.e. the context effect. A step 107 comprises determining the one or more light effects based on the profile and the content item. Step 107 comprises analyzing the content item. In an alternative embodiment, step 107 comprises receiving features extracted from the content item from another device. A step 109 comprises causing one or more light sources to render the one or more light effects simultaneously with the content item by transmitting one or more light commands.
In the embodiment of Fig.3 step 101 comprises a sub step 111 of determining a current location of the user and a sub step 113 of determining the type of activity currently being carried out by the user based on the current location of the user. In an alternative embodiment, the type of activity currently being carried out by the user is not determined or determined in a different manner.
In the embodiment of Fig.3 a further step 115 comprises allowing a person to specify a degree of importance of each of the one or more parameter values, i.e. to specify priorities, and step 103 comprises a sub step 117 of determining the extent based on the one or more degrees of importance. In an alternative embodiment, these one or more degrees of importance are not specified by a person, but learned from user behavior, for example.
A second embodiment of the method of the invention is shown in Fig.4. In this embodiment, a step 120 is performed as first step. Step 120 comprises creating a plurality of predefined profiles. In this embodiment, only the names are assigned in step 120 (e.g. “Effect Range 1” and “Effect Range 2”) and the content of the predefined profiles is defined in step 129.
Figs.5 and 6 show an example of a user interface for defining predefined profiles in step 129, and in particular for defining an effect range of a predefined profile for three lights that have been identified previously, e.g. lights 27-29 of Fig.1. A first part of the example of the user interface for defining an effect range is shown in Fig.5. Screen 51 allows the user (or an administrator) to define a color range per light by placing a shape in a color circle 53. The user (or the administrator) can specify characteristics of the shape, e.g. the size and/or form of the shape (in this example circles), and the location of the shape, e.g. by dragging the shape using a touch sensing portion of the display 9. In the example of Fig.5, each light is associated with a single shape, but in an alternative embodiment, a light can be associated with multiple shapes.
The area covered by a shape determines which chromaticity settings can be used in light effects rendered on a light associated with the shape. In this example, circle 55 is associated with light 27 of Fig. 1 and identifier “1” and covers green and yellow tones, circle 57 is associated with light 28 of Fig. 1 and identifier “2” and covers red and purple tones, and circle 59 is associated with light 29 of Fig. 1 and identifier “3” and covers blue tones.
A second part of the example of the user interface for defining an effect range is shown in Fig.6. Screen 61 allows the user (or the administrator) to define a dynamic range per light by placing a rectangle on a dynamic range bar 63. The user (or the administrator) can specify the height of the rectangle and the location of the rectangle, e.g. by touching two positions on the bar 63 using a touch sensing portion of the display 9. In the example of Fig.6, each light is associated with a single rectangle, but in an alternative embodiment, a light can be associated with multiple rectangles.
The part of the bar 63 covered by the rectangle determines how dynamic the light effects rendered on a light associated with the rectangle can be. In this example, rectangle 65 is associated with light 27 of Fig. 1 and identifier “1” and specifies the use of very dynamic light effects, rectangle 67 is associated with light 28 of Fig. 1 and identifier “2” and specifies the use of light effects that are not too dynamic and not too static, and rectangle 69 is associated with light 29 of Fig. 1 and identifier “3” and specifies the use of light effects that are only slightly dynamic, but not completely static.
In the examples of Figs.5 and 6, the effect range is defined per light. In an alternative embodiment, an effect range may be defined for a group of lights, e.g. for all the lights in a room or building.
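By way of illustration, a predefined profile defined with the user interface of Figs. 5 and 6 might be stored per light as a colour circle (centre and radius in the colour plane) plus a dynamicity interval, as in the following sketch; the field names and coordinate values are assumptions for the example.

```python
# One predefined profile, e.g. "Effect Range 1", stored per light.
effect_range_1 = {
    27: {"color_circle": ((0.30, 0.60), 0.08),  # green/yellow tones
         "dynamic_range": (0.7, 1.0)},          # very dynamic
    28: {"color_circle": ((0.45, 0.25), 0.08),  # red/purple tones
         "dynamic_range": (0.35, 0.65)},        # moderately dynamic
    29: {"color_circle": ((0.18, 0.12), 0.08),  # blue tones
         "dynamic_range": (0.1, 0.3)},          # slightly dynamic
}

def chromaticity_allowed(profile, light_id, xy):
    """True if chromaticity xy lies inside the light's colour circle."""
    (cx, cy), radius = profile[light_id]["color_circle"]
    return ((xy[0] - cx) ** 2 + (xy[1] - cy) ** 2) ** 0.5 <= radius

print(chromaticity_allowed(effect_range_1, 29, (0.20, 0.10)))  # -> True
```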
Step 115 of Fig.3 comprises a sub step 123 in Fig.4. Step 123 comprises allowing a person to specify input relations between potential parameter values and two predefined profiles whose effect ranges have been defined with the user interface of Figs. 5 and 6. An example of a user interface for specifying these input relations is shown in Fig.7. The two predefined profiles each comprise the effect ranges for all three lights. One such predefined profile has been defined in the example of Figs. 5 and 6. The first predefined profile is represented by effect range circle 73 in Fig.7. The second predefined profile is represented by effect range circle 74 in Fig.7.
The extent to which a parameter value should be used to determine a profile depends on the amount of overlap between the circles representing input parameter values and the circles representing predefined profiles. Only a subset of the potential parameter values will be present in the input. In Fig.7, a parameter value circle 75 represents a parameter value indicating a full agenda, a parameter value circle 76 represents a parameter value indicating that jazz music is currently being played, a parameter value circle 77 represents a parameter value indicating an empty agenda, and a parameter value circle 79 represents a parameter value indicating that pop music is currently being played.
Furthermore, in Fig.7, a parameter value circle 78 represents the user’s preferences. Thus, the user interface of Fig.7 is also used to define the extent to which the user’s explicit preferences are taken into account. For example, the user may specify a dislike for purple colors as user preference, but this preference is not always met: the extent to which the user’s preferences are taken into account is specified with the user interface of Fig.7. If the user wants to make sure that no purple colors are used, he is able to alter the color effect ranges using the user interface of Fig.5.
The larger the overlap between a parameter value circle and an effect range circle, the likelier it is that the presence of the parameter value corresponding to this parameter value circle in the input results in the predefined profile corresponding to this effect range circle being used. Fig.7 depicts circles in a first coordinate system. In this embodiment, the size and location of the circles do not say anything about the content of the parameter values or of the predefined profiles, but are only used to determine the above-mentioned overlap. The circles may be stored in a memory as tuples in which each tuple represents a vector from the origin of the first coordinate system to the center of the circle and a radius or diameter of the circle. In an alternative embodiment, other shapes are used instead of or in addition to circles. In an alternative embodiment, the location and size of the shapes are determined fully or partly automatically.
Step 121 shown in Fig.4 is new compared to the embodiment of Fig.3. Step 121 comprises allowing a person to specify light relations between the lights 27-29 of Fig. 1 (identified by numbers 1-3) and the predefined profiles of Fig.7. These light relations can be represented by a basic distribution map, which is used later (in step 131). An example of a user interface for specifying this basic distribution map is shown in Fig.8. The first predefined profile of Fig.7 is represented by effect range rectangles 82 and 83. The second predefined profile of Fig.7 is represented by effect range rectangle 84. The lights 27-29 of Fig. 1 are represented by circles 86-88, respectively.
Fig.8 depicts a second coordinate system. In this embodiment, the size and location of the rectangles do not say anything about the content of the predefined profiles and the size and locations of the circles do not say anything about the physical size and geographical location of the lights. In an alternative embodiment, other shapes are used instead of or in addition to rectangles and circles. In an alternative embodiment, the location and size of the shapes are determined fully or partly automatically.
By determining which effect range rectangles cover a light, the effect ranges to be used for that light can be determined. However, the distribution map that is used to determine this coverage is not the basic distribution map defined in Fig.8, but a distribution map derived from this basic distribution map in step 131. This derived distribution map is derived from the basic distribution map using the input relations specified in Fig.7 and the input.
Step 125 shown in Fig.4 is also new compared to the embodiment of Fig.3. Step 125 comprises allowing a person to select one or more of the parameter values represented in Fig.7 and showing the corresponding derived distribution map (derived using the selected parameter values as simulated input). If the user (or administrator) does not like the result, step 123 and/or step 121 may be repeated.
Step 103 of Fig.3 comprises a sub step 127 in the embodiment of Fig.4. Step 127 comprises determining the context effect based on the defined priorities (i.e. the defined input relations) and the context (input). Thus, the extent to which the input should be used to determine the profile is determined in this step 127. In particular, the overlap between the effect range circles 73 and 74 and the parameter values present in the input is determined.
This is illustrated with the help of Fig.9. In the example of Fig.9, the input comprises two parameter values: parameter value circle 76 represents a parameter value which indicates that jazz music is being played and parameter value circle 78 represents a parameter value identifying the user’s preferences.
The overlap area 92 between effect range circle 73 and parameter value circle 76 is determined as a percentage of the circle 73 and the overlap area 93 between effect range circle 74 and parameter value circle 78 is determined as a percentage of the circle 74. In the example of Fig.9, the overlap area 92 is 30% of the circle 73 and the overlap area 93 is 5% of the circle 74. The basic distribution map is not used in step 127, but used in the next step: step 131.
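The following sketch illustrates one way the overlap between two circles stored as (centre, radius) tuples might be computed as a fraction of the effect range circle, using the standard circle-intersection (lens) area; the coordinates in the example are assumptions chosen to yield an overlap of roughly 30%.

```python
import math

def overlap_fraction(effect, param):
    """Overlap area of two circles (cx, cy, r), returned as a fraction of
    the effect range circle's area."""
    (x1, y1, r1), (x2, y2, r2) = effect, param
    d = math.hypot(x2 - x1, y2 - y1)
    if d >= r1 + r2:
        area = 0.0                         # disjoint circles
    elif d <= abs(r1 - r2):
        area = math.pi * min(r1, r2) ** 2  # one circle inside the other
    else:                                  # standard lens area
        a1 = r1 * r1 * math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1))
        a2 = r2 * r2 * math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2))
        corner = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                                 * (d - r1 + r2) * (d + r1 + r2))
        area = a1 + a2 - corner
    return area / (math.pi * r1 * r1)

# Roughly 29% of the effect range circle's area is overlapped here:
print(round(overlap_fraction((0.0, 0.0, 1.0), (0.95, 0.0, 0.8)), 2))
```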
In step 105, one or more profiles are determined for the light sources to be controlled. In the embodiment of Fig.4, these profiles are derived from the predefined profiles. A profile used in step 105 may be a selection of one of these predefined profiles or may be a combination of one or more of these predefined profiles.
In the embodiment of Fig.4, step 105 of Fig.3 comprises sub steps 131, 133, 135 and 137. In step 131, a current distribution map is derived from the basic distribution map defined in step 121 using the sizes of the overlap areas determined in step 127, which represent the context effect. This is illustrated with the help of Fig.10. A rectangle representing a predefined profile is enlarged or reduced depending on the amount of overlap determined for the predefined profile in step 127. This is done for each rectangle.
In the example of Fig.10, a rectangle is enlarged if the amount of overlap is more than 15% and reduced if the overlap is less than 15%. In this example, the perimeters of effect range rectangles 82 and 83 are increased by 60% (= ABS(15% - 30%) * 4), thereby resulting in effect range rectangles 96 and 97, respectively, and the perimeter of effect rectangle 84 is decreased by 40% (= ABS(15% - 5%) * 4), thereby resulting in effect range rectangle 98.
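A sketch of this derivation of the current distribution map is given below; the rectangle representation (centre, width, height) is an assumption, while the 15% threshold and the ABS(15% - overlap) * 4 scaling are taken from the example above.

```python
THRESHOLD = 0.15  # overlap fraction at which a rectangle keeps its size

def resize_rectangle(rect, overlap):
    """Grow or shrink a rectangle (cx, cy, width, height) around its centre
    so that its perimeter changes by ABS(15% - overlap) * 4."""
    cx, cy, w, h = rect
    delta = abs(THRESHOLD - overlap) * 4.0
    # Scaling both sides by the same factor scales the perimeter by it too.
    factor = 1.0 + delta if overlap > THRESHOLD else 1.0 - delta
    return (cx, cy, w * factor, h * factor)

# Rectangles 82/83 with 30% overlap grow by 60%, rectangle 84 with 5%
# overlap shrinks by 40%:
print(resize_rectangle((0.5, 0.5, 0.2, 0.1), 0.30))  # sides scaled by 1.6
print(resize_rectangle((0.8, 0.3, 0.2, 0.1), 0.05))  # sides scaled by 0.6
```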
Step 133 comprises determining a profile for light 27 of Fig.1 based on the current (derived) distribution map and the predefined profiles defined in step 129. Light 27 is represented as circle 86 in Fig.10. Since circle 86 is only covered by effect range rectangle 96 corresponding to the first predefined profile, the profile for light 27 is only determined based on this first predefined profile and is equal to this first predefined profile.
Step 135 comprises determining a profile for light 28 of Fig.1 based on the current (derived) distribution map and the predefined profiles defined in step 129. Light 28 is represented as circle 87 in Fig.10. Since circle 87 is covered by both effect range rectangle 96 corresponding to the first predefined profile and the effect range rectangle 97 corresponding to the second predefined profile, the profile for light 28 is determined based on a combination of the first predefined profile and the second predefined profile. There are several manners in which this combination may be made. For example, the determined profile may comprise the union of the color ranges represented by circles 55 and 57 of Fig.5 and the union of the dynamic ranges represented by rectangles 65 and 67 of Fig.6.
Step 137 comprises determining a profile for light 29 of Fig.1 based on the current (derived) distribution map and the predefined profiles defined in step 129. Light 29 is represented as circle 88 in Fig.10. Since circle 88 is not covered by any of the effect rectangles, the profile for light 29 indicates that it should not be used for determining light effects in step 107.
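The coverage test of steps 133-137 might be sketched as follows; the point-in-rectangle test against the light's circle centre and the example coordinates are simplifying assumptions.

```python
def covering_profiles(light_pos, rectangles):
    """rectangles: list of (profile_name, (cx, cy, width, height)) from the
    current distribution map. Returns the names of the profiles whose
    rectangle contains the light's position."""
    names = []
    for name, (cx, cy, w, h) in rectangles:
        if abs(light_pos[0] - cx) <= w / 2 and abs(light_pos[1] - cy) <= h / 2:
            names.append(name)
    return names

rectangles = [("profile_1", (0.4, 0.5, 0.6, 0.4)),
              ("profile_2", (0.7, 0.5, 0.3, 0.4))]

print(covering_profiles((0.3, 0.5), rectangles))   # like light 27: one profile
print(covering_profiles((0.6, 0.5), rectangles))   # like light 28: both -> union of ranges
print(covering_profiles((0.95, 0.9), rectangles))  # like light 29: none -> light unused
```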
Next, the light effects are determined in step 107. Step 107 comprises analyzing the content item. In an alternative embodiment, step 107 comprises receiving features extracted from the content item from another device. Step 107 further comprises determining light effects for light 27 based on the profile determined for light 27 in step 133 and this analysis, determining light effects for light 28 based on the profile determined for light 28 in step 135 and this analysis, and determining light effects for light 29 based on the profile determined for light 29 in step 137 and this analysis. For example, the chromaticity setting and/or the brightness setting of a light may change each beat or each section of a song or may follow the rhythm of the song.
How often these settings change depends on the dynamic range specified in the profiles. Which chromaticity settings are used depends on the color range specified in the profiles. Which chromaticity settings are used within the specified color range may depend on the type of music, e.g. based on valence, and/or on other properties of the music. In an alternative embodiment, the light effects are not only determined based on a song or an audio portion of a music video, but also on a video portion of a music video. For example, the light effects could be made to mimic the light effects people see in the music video to help them get a more immersive experience.
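By way of illustration, determining the light effects for one light from detected beats and a profile might look as follows; interpreting the upper bound of the dynamicity range as a maximum number of colour changes per second is an assumption made for this example.

```python
import math
import random

def light_effects(beats, color_circle, dynamic_range):
    """For each beat, draw a chromaticity from the profile's colour circle;
    the dynamicity range caps how often the colour may actually change."""
    (cx, cy), radius = color_circle
    min_interval = 1.0 / dynamic_range[1]  # seconds between colour changes
    effects, last_change = [], None
    for t in beats:
        if last_change is not None and t - last_change < min_interval:
            continue  # changing now would exceed the dynamicity range
        angle = random.uniform(0.0, 2.0 * math.pi)
        dist = radius * math.sqrt(random.random())  # uniform over the disc
        xy = (cx + dist * math.cos(angle), cy + dist * math.sin(angle))
        effects.append((t, xy))  # (time in seconds, chromaticity) pair
        last_change = t
    return effects

beats = [0.0, 0.5, 1.0, 1.5, 2.0]  # beat times of a 120 BPM song
print(light_effects(beats, ((0.30, 0.60), 0.08), (0.7, 1.0)))
```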
Finally, step 109 comprises causing light 27 of Fig. 1 to render the light effects determined for light 27 in step 107, causing light 28 of Fig. 1 to render the light effects determined for light 28 in step 107, and causing light 29 of Fig. 1 to render the light effects determined for light 29 in step 107 by transmitting light commands. The light commands are timed such that the light effects are rendered simultaneously with the content item, e.g. using current song position information obtained from a music player or music service.
A third embodiment of the method of the invention is shown in Fig. 11. This third embodiment is somewhat simpler than the second embodiment of Fig.4. In this third embodiment, no distribution map is used and the profile determined for each light is based on, e.g. equal to, exactly one predefined profile. For this reason, steps 121, 125 and 131 of Fig.4 are not present in the third embodiment of Fig. 11.
However, the predefined profile on which the profiles for the individual lights are based is still selected based on the priorities (input relations) defined in step 123. In the embodiment of Fig. 11, the overlap areas 92 and 93 of Fig.9 are determined as described in relation to the second embodiment, but they are used in a different way. A choice is made between the two predefined profiles based on the determined overlap: the predefined profile with the largest overlap area is chosen. In the example of Fig.9, the overlap area 92 is 30% and the overlap area 93 is 5%, and therefore the first predefined profile corresponding to the effect range circle 73 is chosen.
Steps 163, 165 and 167 comprise determining profiles for the three lights. Each profile is determined based on the same predefined profile, but a predefined profile may specify different settings for different lights. Step 163 comprises determining a profile for light 27 of Fig. 1 based on the effect ranges defined for light 27 in the first predefined profile. Step 165 comprises determining a profile for light 28 of Fig. 1 based on the effect ranges defined for light 28 in the first predefined profile. Step 167 comprises determining a profile for light 29 of Fig. 1 based on the effect ranges defined for light 29 in the first predefined profile. Next, steps 107 and 109 are performed in the manner described in relation to the second embodiment.
As previously described, a user’s color preferences can be taken into account in at least two ways: the user adapts the color ranges in the predefined profiles or the user specifies his color preferences as context. If the user specifies his color preferences as context, they may not (always) be met. The user may use voice commands to indicate his preferences, for instance. As a first example, the user may say “a bit more reddish”. This user preference (like) is taken into account to a certain extent, i.e. there is a higher chance of more reddish (warm) colors being used compared to the default setting.
As a second example, the user may say “I do not like purple”. This user preference (dislike) is taken into account to a certain extent, i.e. there is a lower chance that the color purple or a color close to the color purple is used. The input relations shown in Fig. 7 may be, for example, automatically adjusted to decrease the chance that a predefined profile is used that has the color purple in its color range.
In an alternative embodiment, determining the extent to which the input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with a content item may simply comprise adjusting the probabilities that certain colors or certain levels of dynamicity and/or brightness are used. In this embodiment, color ranges and input relations do not need to be defined. For example, a system may use an initial set of probabilities indicating a probability for each chromaticity setting, dynamicity setting and brightness setting that is used and randomly choose settings using these probabilities. If a user dislikes the color purple, the system may decrease the probability of using purple by 90% and the probability of using a color close to purple (e.g. in CIE color space, a distance of 0.01 or less) by 80%.
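A sketch of this probability adjustment is given below; the initial distribution and the renormalisation step are assumptions for the example, while the 90%/80% reductions and the 0.01 distance are taken from the text above.

```python
import math

def apply_dislike(probabilities, disliked_xy, near=0.01):
    """Cut the probability of the disliked chromaticity by 90% and of
    chromaticities within `near` of it by 80%, then renormalise."""
    adjusted = {}
    for xy, p in probabilities.items():
        d = math.hypot(xy[0] - disliked_xy[0], xy[1] - disliked_xy[1])
        if d == 0.0:
            p *= 0.10  # the disliked colour itself: -90%
        elif d <= near:
            p *= 0.20  # colours close to it: -80%
        adjusted[xy] = p
    total = sum(adjusted.values())
    return {xy: p / total for xy, p in adjusted.items()}

purple = (0.25, 0.10)
initial = {purple: 0.25, (0.255, 0.105): 0.25, (0.30, 0.60): 0.50}
print(apply_dislike(initial, purple))
```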
User preferences may also be determined in a different manner. A wearable device may measure a biological response to rendered light effects and determine preferences based on this response, for example. For instance, if a light is flashing too much (too dynamic) and this results in a change in blood pressure or even in a change in heart rate, this could be an indicator that someone strongly dislikes this light effect.
Fig.12 depicts a block diagram illustrating an exemplary data processing system that may perform the method as described with reference to Figs. 3, 4 and 11.
As shown in Fig.12, the data processing system 300 may include at least one processor 302 coupled to memory elements 304 through a system bus 306. As such, the data processing system may store program code within memory elements 304. Further, the processor 302 may execute the program code accessed from the memory elements 304 via a system bus 306. In one aspect, the data processing system may be implemented as a computer that is suitable for storing and/or executing program code. It should be appreciated, however, that the data processing system 300 may be implemented in the form of any system including a processor and a memory that is capable of performing the functions described within this specification.
The memory elements 304 may include one or more physical memory devices such as, for example, local memory 308 and one or more bulk storage devices 310. The local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code. A bulk storage device may be implemented as a hard drive or other persistent data storage device. The processing system 300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the quantity of times program code must be retrieved from the bulk storage device 310 during execution. The processing system 300 may also be able to use memory elements of another processing system, e.g. if the processing system 300 is part of a cloud-computing platform.
Input/output (I/O) devices depicted as an input device 312 and an output device 314 optionally can be coupled to the data processing system. Examples of input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, a microphone (e.g. for voice and/or speech recognition), or the like. Examples of output devices may include, but are not limited to, a monitor or a display, speakers, or the like. Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.
In an embodiment, the input and the output devices may be implemented as a combined input/output device (illustrated in Fig.12 with a dashed line surrounding the input device 312 and the output device 314). An example of such a combined device is a touch sensitive display, also sometimes referred to as a “touch screen display” or simply “touch screen”. In such an embodiment, input to the device may be provided by a movement of a physical object, such as e.g. a stylus or a finger of a user, on or near the touch screen display.
A network adapter 316 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks. The network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 300, and a data transmitter for transmitting data from the data processing system 300 to said systems, devices and/or networks. Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 300.
As pictured in Fig.12, the memory elements 304 may store an application 318. In various embodiments, the application 318 may be stored in the local memory 308, the one or more bulk storage devices 310, or separate from the local memory and the bulk storage devices. It should be appreciated that the data processing system 300 may further execute an operating system (not shown in Fig.12) that can facilitate execution of the application 318. The application 318, being implemented in the form of executable program code, can be executed by the data processing system 300, e.g., by the processor 302. Responsive to executing the application, the data processing system 300 may be configured to perform one or more operations or method steps described herein.
Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein). In one embodiment, the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression “non-transitory computer readable storage media” comprises all computer-readable media, with the sole exception being a transitory, propagating signal. In another embodiment, the program(s) can be contained on a variety of transitory computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored. The computer program may be run on the processor 302 described herein.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of embodiments of the present invention has been presented for purposes of illustration, but is not intended to be exhaustive or limited to the implementations in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the present invention. The embodiments were chosen and described in order to best explain the principles and some practical applications of the present invention, and to enable others of ordinary skill in the art to understand the present invention for various embodiments with various modifications as are suited to the particular use contemplated.


CLAIMS:
1. A system (1) of determining light effects for a music entertainment experience from a context, comprising at least one processor (5) configured to:
- obtain an input representing said context, said input comprising one or more parameter values, said parameter values indicating a type of music currently being rendered, a type of activity currently being carried out by a user, the fullness of a user’s schedule, the urgency of one or more pending tasks, a user’s light effect preference, and a current setting for rendering a content item of music,
- determine an extent to which said input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with the content item of music, said determining an extent to which said input should be used to determine a profile including an input or selection of percentage or weight of said one or more parameter values by the user,
- determine the profile based on said determined extent, said profile identifying a dynamicity range to be used for the one or more light effects and a color and/or brightness range to be used for said one or more light effects,
- determine said one or more light effects based on said profile and said content item of music, said content item of music determining change of color and/or brightness of said one or more light effects, and said profile determining the range of color and/or brightness and the range of the level of dynamicity, and
- cause one or more light sources to render said one or more light effects simultaneously with said content item.
2. A system (1) as claimed in claim 1, wherein said at least one processor (5) is configured to:
- determine a current location of said user, and
- determine said type of activity currently being carried out by said user based on said current location of said user.
3. A system (1) as claimed in claim 1, wherein said at least one processor (5) is configured to:
- allow a person to specify a degree of importance of each of said one or more parameter values, and
- determine said extent based on said one or more degrees of importance.
4. A system (1) as claimed in claim 1, wherein said at least one processor (5) is configured to:
- allow a person to specify an input relation between said one or more parameter values and each of a plurality of predefined profiles,
- determine said extent based on said plurality of input relations, and
- determine said profile further based on one or more of said plurality of predefined profiles.
5. A system (1) as claimed in claim 1, wherein said at least one processor (5) is configured to:
- determine a further profile based on said determined extent,
- determine one or more further light effects based on said further profile and said content item of music, and
- cause one or more further light sources to render said further light effects simultaneously with said content item of music.
6. A system (1) as claimed in claim 5, wherein said at least one processor (5) is configured to:
- allow a person to specify a light relation between said one or more light sources and each of a plurality of predefined profiles,
- allow a person to specify a further light relation between said one or more further light sources and each of said plurality of predefined profiles,
- determine said profile further based on said light relation and one or more of said plurality of predefined profiles, and
- determine said further profile further based on said further light relation and one or more of said plurality of predefined profiles.
7. A method of determining light effects for a music entertainment experience from a context, comprising:
- obtaining (101) an input representing said context, said input comprising one or more parameter values, said parameter values indicating a type of music currently being rendered, a type of activity currently being carried out by a user, the fullness of a user’s schedule, the urgency of one or more pending tasks, a user’s light effect preference, and a current setting for rendering a content item of music;
- determining (103) an extent to which said input should be used to determine a profile for determining one or more light effects to be rendered simultaneously with the content item of music, said extent to which said input should be used to determine a profile including an input or selection of percentage or weight of said one or more parameter values by the user;
- determining (105) the profile based on said determined extent, said profile identifying a dynamicity range to be used for said one or more light effects and a color and/or brightness range to be used for said one or more light effects;
- determining (107) said one or more light effects based on said profile and said content item of music, said content item of music determining change of color and/or brightness of said one or more light effects, and said profile determining the range of color and/or brightness and the range of the level of dynamicity; and
- causing (109) one or more light sources to render said one or more of light effects simultaneously with said content item.
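To show how the numbered steps of claim 7 chain together, here is a hedged end-to-end sketch; every helper body below is a placeholder assumption for the corresponding step, and the print call stands in for a real lighting-control protocol:

```python
def obtain_input(context: dict) -> dict:  # step (101)
    keys = ("music_type", "activity", "schedule_fullness")
    return {k: context.get(k) for k in keys}


def determine_extent(inputs: dict) -> dict:  # step (103): user weights, assumed equal here
    return {name: 1.0 / len(inputs) for name in inputs}


def determine_profile(inputs: dict, extent: dict) -> dict:  # step (105)
    # Placeholder: a real system would derive these ranges from inputs and extent.
    return {"brightness_range": (0.2, 0.9), "dynamicity_range": (0.3, 0.8)}


def determine_effects(profile: dict, beats: list[float]) -> list[dict]:  # step (107)
    lo, hi = profile["brightness_range"]
    return [{"brightness": lo + b * (hi - lo)} for b in beats]


def render(light_sources: list[str], effects: list[dict]) -> None:  # step (109)
    for effect in effects:
        for source in light_sources:
            print(source, effect)  # stand-in for an actual lighting command


inputs = obtain_input({"music_type": "jazz", "activity": "relaxing"})
profile = determine_profile(inputs, determine_extent(inputs))
render(["lamp_left", "lamp_right"], determine_effects(profile, beats=[0.2, 0.9, 0.5]))
```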
8. A method as claimed in claim 7, further comprising:
- determining (111) a current location of said user; and
- determining (113) said type of activity currently being carried out by said user based on said current location of said user.
9. A method as claimed in claim 7, further comprising allowing (123) a person to specify a degree of importance of each of said one or more parameter values, wherein said extent is determined based on said one or more degrees of importance.
10. A method as claimed in claim 7, further comprising allowing (123) a person to specify an input relation between said one or more parameter values and each of a plurality of predefined profiles, wherein said extent is determined based on said plurality of input relations and said profile is further determined based on one or more of said plurality of predefined profiles.
11. A computer program or suite of computer programs comprising at least one software code portion or a computer program product storing at least one software code portion, the software code portion, when run on a computer system, being configured for enabling the method of any one of claims 7 to 10 to be performed.
PCT/EP2019/068208 2018-07-09 2019-07-08 Determining light effects to be rendered simultaneously with a content item Ceased WO2020011694A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN2018095035 2018-07-09
CNPCT/CN2018/095035 2018-07-09
EP18191599.2 2018-08-30
EP18191599 2018-08-30

Publications (1)

Publication Number Publication Date
WO2020011694A1 (en) 2020-01-16

Family

ID=67220811

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/068208 Ceased WO2020011694A1 (en) 2018-07-09 2019-07-08 Determining light effects to be rendered simultaneously with a content item

Country Status (1)

Country Link
WO (1) WO2020011694A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008053409A1 (en) * 2006-10-31 2008-05-08 Koninklijke Philips Electronics N.V. Control of light in response to an audio signal
WO2009090600A1 (en) * 2008-01-16 2009-07-23 Koninklijke Philips Electronics N.V. System and method for automatically creating an atmosphere suited to social setting and mood in an environment
US20170265270A1 (en) 2016-03-09 2017-09-14 Osram Gmbh Method of controlling lighting sources, corresponding system and computer program product

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021160552A1 (en) * 2020-02-13 2021-08-19 Signify Holding B.V. Associating another control action with a physical control if an entertainment mode is active
EP4179848B1 (en) * 2020-07-13 2024-02-21 Signify Holding B.V. Selecting lighting devices for rendering entertainment lighting based on relative distance information
WO2025040509A1 (en) * 2023-08-22 2025-02-27 Signify Holding B.V. Generating light settings based on additional textual description generated for light scene
CN119562402A (en) * 2024-11-13 2025-03-04 广西美高实业有限公司 A control method and system for RGB lamp

Similar Documents

Publication Publication Date Title
CN111869330B (en) Rendering dynamic light scenes based on one or more light settings
WO2020011694A1 (en) Determining light effects to be rendered simultaneously with a content item
EP3152981B1 (en) Light scene creation or modification by means of lighting device usage data
CN108886863B (en) Computer-implemented method for creating dynamic light effects and controlling lighting devices in dependence of dynamic light effects
KR20170099721A (en) Server and controlling user environment method of electronic device using electronic device and at least one smart device
US10356870B2 (en) Controller for controlling a light source and method thereof
CN105785784B (en) Intelligent household scene visualization method and device
US20190102947A1 (en) Electronic device determining setting value of device based on at least one of device information or environment information and controlling method thereof
US20250168953A1 (en) Determining global and local light effect parameter values
US20170285594A1 (en) Systems and methods for control of output from light output apparatus
CN107490972A Method and device for setting up smart home devices in hotel guest rooms
KR102282704B1 (en) Electronic device and method for playing image data
EP3928594B1 (en) Enhancing a user's recognition of a light scene
EP4205512B1 (en) A controller for mapping a light scene onto a plurality of lighting units and a method thereof
US20250301556A1 (en) Techniques for illuminating a physical space
CN102474951B (en) A lighting system and a method for determining the energy consumption of lighting scenes of the lighting system
WO2021160552A1 (en) Associating another control action with a physical control if an entertainment mode is active
JP2024502843A (en) Selecting a better input modality for user commands for light control
CN110392475A (en) Method and device for adjusting areas corresponding to smart devices based on Internet of Things
US20220377868A1 (en) Configuring a bridge with groups after addition of said bridge to a lighting system
WO2020007596A1 (en) Activating one or more light settings associated with an automatically determined name
KR20170059346A (en) Server, electronic device and method for performing internet of thing in electronic device
CN119631573A Selecting lighting fixtures based on an indicated light effect and the distance between available lighting fixtures
CN119183687A Controlling light output levels differently in different operating modes based on measured light
CN117492562A Exhibition hall control method, system and storage medium combined with smart wearables

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 19737523; country of ref document: EP; kind code of ref document: A1)
NENP Non-entry into the national phase (ref country code: DE)
122 EP: PCT application non-entry in European phase (ref document number: 19737523; country of ref document: EP; kind code of ref document: A1)