US20150253974A1 - Control of large screen display using wireless portable computer interfacing with display controller - Google Patents
- Publication number
- US20150253974A1 (application US 14/271,156)
- Authority
- US
- United States
- Prior art keywords
- control device
- display device
- display
- controller
- video
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
- H04N21/439—Processing of audio elementary streams
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
- A63F13/26—Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
- A63F13/27—Output arrangements for video game devices characterised by a large display in a public venue, e.g. in a movie theatre, stadium or game arena
- A63F13/92—Video game devices specially adapted to be hand-held while playing
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/0486—Drag-and-drop
- G06F3/04886—Interaction techniques using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- G06F3/1454—Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
- G09G5/377—Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L67/025—Protocols based on web technology, e.g. hypertext transfer protocol [HTTP], for remote control or remote monitoring of applications
- H04L67/125—Protocols specially adapted for proprietary or special-purpose networking environments involving control of end-device applications over a network
- H04N21/41265—The peripheral being portable, e.g. PDAs or mobile phones, having a remote control device for bidirectional communication between the remote control device and client device
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/4222—Remote control device emulator integrated into a non-television apparatus, e.g. a PDA, media center or smart toy
- H04N21/42224—Touch pad or touch panel provided on the remote control
- H04N21/43637—Adapting the video stream to a specific local network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
- H04N21/4852—End-user interface for client configuration for modifying audio parameters, e.g. switching between mono and stereo
- H04N5/4403—
- G06F2203/0383—Remote input, i.e. interface arrangements in which the signals generated by a pointing device are transmitted to a PC at a remote location, e.g. to a PC in a LAN
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
- G06F2203/04805—Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
- G06F3/1423—Digital output to display device controlling a plurality of local displays, e.g. CRT and flat panel display
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
- G09G2370/20—Details of the management of multiple sources of image data
- H04N2005/4408—
- H04N21/42208—Display device provided on the remote control
Definitions
- the application relates generally to controlling a large screen display using a wireless portable computer such as a tablet or laptop computer interfacing with a display controller such as a game console.
- A computer ecosystem, or digital ecosystem, is an adaptive and distributed socio-technical system characterized by its sustainability, self-organization, and scalability.
- Like environmental ecosystems, which consist of biotic and abiotic components that interact through nutrient cycles and energy flows, complete computer ecosystems consist of hardware, software, and services that in some cases may be provided by one company, such as Sony.
- The goal of each computer ecosystem is to provide consumers with everything that may be desired, at least in part via services and/or software that may be exchanged via the Internet.
- Moreover, interconnectedness and sharing among elements of an ecosystem, such as applications within a computing cloud, provide consumers with increased capability to organize and access data, and point toward the future of efficient integrative ecosystems.
- An example ecosystem that is pertinent here is an entertainment ecosystem in the home or in a luxury suite at a stadium that includes a large screen high definition display controlled by a controller such as a personal computer (PC) or game console which receives commands from a portable control device such as a tablet computer.
- In a first aspect, a control device includes at least one computer readable storage medium bearing instructions executable by a processor, and at least one processor configured for accessing the computer readable storage medium to execute the instructions to configure the processor for presenting, on a display of the control device, a user interface (UI) having plural selectors arranged in a layout. Each selector is established by a respective video feed.
- the instructions when executed by the processor configure the processor for receiving a first user input corresponding to a first command, and wirelessly transmitting the first command to a controller of the display device to command the controller to present plural video feeds on the display device.
- the video feeds on the control device are of the same content as respective video feeds on the display device.
- the instructions when executed by the processor configure the processor for receiving a second user input corresponding to a second command, and for wirelessly transmitting the second command to the controller to command the controller to present a single video feed full screen on the display device with the layout of the selectors on the control device remaining unchanged.
- the first user input is established by a user touch on any portion of a display of the control device, and the second user input can be established only by a user touch on a selector corresponding to the single video feed.
- the instructions when executed by the processor may configure the processor for transmitting the first command to the controller of the display device to command the controller to present the plural video feeds on the display device in the same layout as the video feeds establishing the selectors are arranged on the display of the control device.
- the instructions when executed by the processor configure the processor for receiving a third user input corresponding to a third command.
- the third user input can be a drag of an image associated with a content selector and a drop of the image onto a portion of the control device for commanding the controller to cause the display device to present a video corresponding to the content selector in a portion of the display device that corresponds to the portion of the control device onto which the image was dropped.
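The drag-and-drop aspect above implies a mapping from the drop location on the control device's screen to the corresponding portion of the large display. A minimal sketch of that mapping follows; the grid size, pixel dimensions, and row-major indexing are illustrative assumptions, not details taken from the patent:

```python
def drop_to_display_region(x, y, screen_w, screen_h, cols=2, rows=2):
    """Map a drop point on the control device (pixel coordinates) to a
    window index on the display device laid out as a cols x rows grid.

    Because the selector layout on the control device mirrors the window
    layout on the display device, the same grid-cell index identifies the
    target window on both screens.
    """
    # Normalize the drop point to [0, 1) on each axis, clamping the edge.
    nx = min(x / screen_w, 0.999)
    ny = min(y / screen_h, 0.999)
    col = int(nx * cols)
    row = int(ny * rows)
    return row * cols + col  # window index, row-major order
```

For example, dropping a content icon near the lower-right corner of a 1280x800 tablet screen with a 2x2 layout yields window index 3, and the controller would then route that content's video feed to the matching quadrant of the display device.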
- the control device can be established by a portable computer.
- the instructions when executed by the processor may if desired configure the processor for providing a visible indicator on the control device that the display device has been commanded to present the single video feed full screen on the display device.
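The two commands in this aspect — show all feeds in a grid, or show one feed full screen — can be sketched as a small command layer. The message names, JSON encoding, and `transport` callable below are hypothetical illustrations rather than the patent's actual protocol; the behavior the sketch captures is that switching the display device to full screen leaves the selector layout on the control device unchanged:

```python
import json

class ControlDevice:
    """Sketch of the control-device side: builds commands for the display
    controller while its own selector layout stays fixed."""

    def __init__(self, feed_ids, transport):
        self.feed_ids = list(feed_ids)   # selector layout; never reordered
        self.transport = transport       # callable that wirelessly sends bytes

    def touch_anywhere(self):
        # First user input: a touch anywhere commands the controller to
        # present all plural video feeds on the display device.
        self._send({"cmd": "SHOW_ALL", "layout": self.feed_ids})

    def touch_selector(self, feed_id):
        # Second user input: a touch on one selector commands the controller
        # to present that single feed full screen.
        if feed_id not in self.feed_ids:
            raise ValueError("unknown selector")
        self._send({"cmd": "FULL_SCREEN", "feed": feed_id})
        # self.feed_ids is untouched: the control-device UI keeps showing
        # every selector even while one feed is full screen on the display.

    def _send(self, msg):
        self.transport(json.dumps(msg).encode("utf-8"))
```

A visible indicator of the full-screen state (per the aspect above) could then be drawn from the last command sent, e.g. highlighting the selector whose feed is currently full screen.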
- In another aspect, a system includes at least one display device configured for presenting plural video contents in respective windows of the display device, at least one controller configured for controlling the display device, and at least one control device configured for communicating commands to the controller to control presentation on the display device.
- the control device is configured to present a user interface (UI) having plural selectors arranged in a layout on a screen of the control device. Each selector is established by a respective video feed having a first resolution.
- the control device is configured for causing the display device to present plural video feeds on the display device, with the video feeds on the control device being of the same content as respective video feeds on the display device.
- the respective video feeds on the display device have a second resolution higher than the first resolution.
- the control device is configured for causing a single video feed full screen to be presented on the display device with the layout of the selectors on the control device remaining unchanged.
- In another aspect, a method includes sending plural video feeds having a first resolution to a control device over a computer network. The method also includes sending plural video feeds having a second resolution to a display device over the computer network, with the second resolution being higher than the first.
- the video feeds of the first resolution are the same content as the video feeds of the second resolution.
- Video presentation on the display device is controlled at least in part based on user touches of the video feeds presented on the control device.
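The method above amounts to resolution-aware routing: the same content goes to both endpoints, but the control device receives low-resolution thumbnail streams while the display device receives high-resolution streams. A hedged sketch of that selection logic (the specific resolution values and destination labels are illustrative assumptions):

```python
# Illustrative resolutions: thumbnails for the tablet UI, UHD for the display.
RESOLUTIONS = {"control": (320, 180), "display": (3840, 2160)}

def feeds_for(content_ids, destination):
    """Return (content_id, width, height) tuples for every feed, at the
    resolution appropriate to the destination device.

    The content list is identical for both destinations; only the
    resolution differs, mirroring the method described above.
    """
    if destination not in RESOLUTIONS:
        raise ValueError("destination must be 'control' or 'display'")
    w, h = RESOLUTIONS[destination]
    return [(cid, w, h) for cid in content_ids]
```

This keeps the wireless link to the portable control device lightly loaded while the display controller still receives full-quality video.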
- FIG. 1 is a block diagram of an example system including an example in accordance with present principles
- FIG. 2 is a partially schematic view of a specific example system with two UHD displays mounted on a wall side by side;
- FIG. 3 is a schematic diagram illustrating control of video presentation on a display using a movable window on a control device.
- FIGS. 4-7 are each a series of screen shots illustrating various aspects of example embodiments.
- a system herein may include server and client components, connected over a network such that data may be exchanged between the client and server components.
- the client components may include one or more computing devices including portable televisions (e.g. smart TVs, Internet-enabled TVs), portable computers such as laptops and tablet computers, and other mobile devices including smart phones and additional examples discussed below.
- These client devices may operate with a variety of operating environments.
- some of the client computers may employ, as examples, operating systems from Microsoft, or a Unix operating system, or operating systems produced by Apple Computer or Google.
- These operating environments may be used to execute one or more browsing programs, such as a browser made by Microsoft or Google or Mozilla or other browser program that can access web applications hosted by the Internet servers discussed below.
- Servers may include one or more processors executing instructions that configure the servers to receive and transmit data over a network such as the Internet.
- a client and server can be connected over a local intranet or a virtual private network.
- a server or controller may be instantiated by a game console such as a Sony Playstation (trademarked), a personal computer, etc.
- servers and/or clients can include firewalls, load balancers, temporary storages, and proxies, and other network infrastructure for reliability and security.
- servers may form an apparatus that implement methods of providing a secure community such as an online social website to network members.
- instructions refer to computer-implemented steps for processing information in the system. Instructions can be implemented in software, firmware or hardware and include any type of programmed step undertaken by components of the system.
- a processor may be any conventional general purpose single- or multi-chip processor that can execute logic by means of various lines such as address lines, data lines, and control lines and registers and shift registers.
- Software modules described by way of the flow charts and user interfaces herein can include various sub-routines, procedures, etc. Without limiting the disclosure, logic stated to be executed by a particular module can be redistributed to other software modules and/or combined together in a single module and/or made available in a shareable library.
- logical blocks, modules, and circuits described below can be implemented or performed with a general purpose processor, a digital signal processor (DSP), a field programmable gate array (FPGA) or other programmable logic device such as an application specific integrated circuit (ASIC), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
- a processor can be implemented by a controller or state machine or a combination of computing devices.
- connection may establish a computer-readable medium.
- Such connections can include, as examples, hard-wired cables including fiber optics and coaxial wires and digital subscriber line (DSL) and twisted pair wires.
- Such connections may include wireless communication connections including infrared and radio.
- a system having at least one of A, B, and C includes systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.
- an example ecosystem 10 is shown, which may include one or more of the example devices mentioned above and described further below in accordance with present principles.
- the first of the example devices included in the system 10 is an example primary display device, and in the embodiment shown is an audio video display device (AVDD) 12 such as but not limited to an Internet-enabled TV.
- AVDD 12 alternatively may be an appliance or household item, e.g. computerized Internet enabled refrigerator, washer, or dryer.
- the AVDD 12 alternatively may also be a computerized Internet enabled (“smart”) telephone, a tablet computer, a notebook computer, or a wearable computerized device.
- AVDD 12 is configured to undertake present principles (e.g. communicate with other CE devices to undertake present principles, execute the logic described herein, and perform any other functions and/or operations described herein).
- the AVDD 12 can be established by some or all of the components shown in FIG. 1 .
- the AVDD 12 can include one or more displays 14 that may be implemented by a high definition or ultra-high definition “4K” flat screen and that may be touch-enabled for receiving user input signals via touches on the display.
- the AVDD 12 may include one or more speakers 16 for outputting audio in accordance with present principles, and at least one additional input device 18 such as e.g. an audio receiver/microphone for e.g. entering audible commands to the AVDD 12 to control the AVDD 12 .
- the example AVDD 12 may also include one or more network interfaces 20 for communication over at least one network 22 such as the Internet, a WAN, a LAN, etc. under control of one or more processors 24 .
- the interface 20 may be, without limitation, a Wi-Fi transceiver, which is an example of a wireless computer network interface.
- the processor 24 controls the AVDD 12 to undertake present principles, including the other elements of the AVDD 12 described herein such as e.g. controlling the display 14 to present images thereon and receiving input therefrom.
- network interface 20 may be, e.g., a wired or wireless modem or router, or other appropriate interface such as, e.g., a wireless telephony transceiver, or Wi-Fi transceiver as mentioned above, etc.
- the AVDD 12 may also include one or more input ports 26 such as, e.g., a USB port to physically connect (e.g. using a wired connection) to another CE device and/or a headphone port to connect headphones to the AVDD 12 for presentation of audio from the AVDD 12 to a user through the headphones.
- the AVDD 12 may further include one or more tangible computer readable storage medium 28 such as disk-based or solid state storage.
- the AVDD 12 can include a position or location receiver such as but not limited to a cellphone receiver, GPS receiver and/or altimeter 30 that is configured to e.g. receive geographic position information from at least one satellite or cellphone tower and provide the information to the processor 24 and/or determine an altitude at which the AVDD 12 is disposed in conjunction with the processor 24 .
- a suitable position receiver other than a cellphone receiver, GPS receiver and/or altimeter may be used in accordance with present principles to e.g. determine the location of the AVDD 12 in e.g. all three dimensions.
- the AVDD 12 may include one or more cameras 32 that may be, e.g., a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into the AVDD 12 and controllable by the processor 24 to gather pictures/images and/or video in accordance with present principles.
- a Bluetooth transceiver 34 and other Near Field Communication (NFC) element 36 for communication with other devices using Bluetooth and/or NFC technology, respectively.
- NFC element can be a radio frequency identification (RFID) element.
- the AVDD 12 may include one or more auxiliary sensors 37 (e.g., a motion sensor such as an accelerometer, gyroscope, cyclometer, or a magnetic sensor, an infrared (IR) sensor, an optical sensor, a speed and/or cadence sensor, a gesture sensor (e.g. for sensing gesture command), etc.) providing input to the processor 24 .
- the AVDD 12 may include still other sensors such as e.g. one or more climate sensors 38 (e.g. barometers, humidity sensors, wind sensors, light sensors, temperature sensors, etc.) and/or one or more biometric sensors 40 providing input to the processor 24 .
- the AVDD 12 may also include an infrared (IR) transmitter and/or IR receiver and/or IR transceiver 42 such as an IR data association (IRDA) device.
- a battery (not shown) may be provided for powering the AVDD 12 .
- the system 10 may include one or more other CE device types.
- a first CE device 44 may be used to control the display via commands sent through the below-described server while a second CE device 46 may include similar components as the first CE device 44 and hence will not be discussed in detail.
- only two CE devices 44 , 46 are shown, it being understood that fewer or greater devices may be used.
- the first CE device 44 is assumed to be in the same room as the AVDD 12 , bounded by walls illustrated by dashed lines 48 .
- the example non-limiting first CE device 44 may be established by any one of the above-mentioned devices, for example, a portable wireless laptop computer or notebook computer, and accordingly may have one or more of the components described below.
- the second CE device 46 without limitation may be established by a wireless telephone.
- the first CE device 44 may include one or more displays 50 that may be touch-enabled for receiving user input signals via touches on the display.
- the first CE device 44 may include one or more speakers 52 for outputting audio in accordance with present principles, and at least one additional input device 54 such as e.g. an audio receiver/microphone for e.g. entering audible commands to the first CE device 44 to control the device 44 .
- the example first CE device 44 may also include one or more network interfaces 56 for communication over the network 22 under control of one or more CE device processors 58 .
- the interface 56 may be, without limitation, a Wi-Fi transceiver, which is an example of a wireless computer network interface.
- the processor 58 controls the first CE device 44 to undertake present principles, including the other elements of the first CE device 44 described herein such as e.g. controlling the display 50 to present images thereon and receiving input therefrom.
- the network interface 56 may be, e.g., a wired or wireless modem or router, or other appropriate interface such as, e.g., a wireless telephony transceiver, or Wi-Fi transceiver as mentioned above, etc.
- the first CE device 44 may also include one or more input ports 60 such as, e.g., a USB port to physically connect (e.g. using a wired connection) to another CE device and/or a headphone port to connect headphones to the first CE device 44 for presentation of audio from the first CE device 44 to a user through the headphones.
- the first CE device 44 may further include one or more tangible computer readable storage medium 62 such as disk-based or solid state storage.
- the first CE device 44 can include a position or location receiver such as but not limited to a cellphone and/or GPS receiver and/or altimeter 64 that is configured to e.g. receive geographic position information from at least one satellite and/or cell tower, using triangulation, and provide the information to the CE device processor 58 and/or determine an altitude at which the first CE device 44 is disposed in conjunction with the CE device processor 58 .
- another suitable position receiver other than a cellphone and/or GPS receiver and/or altimeter may be used in accordance with present principles to e.g. determine the location of the first CE device 44 in e.g. all three dimensions.
- the first CE device 44 may include one or more cameras 66 that may be, e.g., a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into the first CE device 44 and controllable by the CE device processor 58 to gather pictures/images and/or video in accordance with present principles.
- a Bluetooth transceiver 68 and other Near Field Communication (NFC) element 70 for communication with other devices using Bluetooth and/or NFC technology, respectively.
- NFC element can be a radio frequency identification (RFID) element.
- the first CE device 44 may include one or more auxiliary sensors 72 (e.g., a motion sensor such as an accelerometer, gyroscope, cyclometer, or a magnetic sensor, an infrared (IR) sensor, an optical sensor, a speed and/or cadence sensor, a gesture sensor (e.g. for sensing gesture command), etc.) providing input to the CE device processor 58 .
- the first CE device 44 may include still other sensors such as e.g. one or more climate sensors 74 (e.g. barometers, humidity sensors, wind sensors, light sensors, temperature sensors, etc.) and/or one or more biometric sensors 76 providing input to the CE device processor 58 .
- the first CE device 44 may also include an infrared (IR) transmitter and/or IR receiver and/or IR transceiver 78 such as an IR data association (IRDA) device.
- a battery (not shown) may be provided for powering the first CE device 44 .
- the second CE device 46 may include some or all of the components shown for the CE device 44 .
- At least one server 80 includes at least one server processor 82 , at least one tangible computer readable storage medium 84 such as disk-based or solid state storage, and at least one network interface 86 that, under control of the server processor 82 , allows for communication with the other devices of FIG. 1 over the network 22 , and indeed may facilitate communication between servers and client devices in accordance with present principles.
- the network interface 86 may be, e.g., a wired or wireless modem or router, Wi-Fi transceiver, or other appropriate interface such as, e.g., a wireless telephony transceiver.
- the server 80 may be an Internet server, and may include and perform “cloud” functions such that the devices of the system 10 may access a “cloud” environment via the server 80 in example embodiments.
- the server 80 may be implemented by a game console or other computer in the same room as the other devices shown in FIG. 1 or nearby.
- FIG. 2 shows an example system 100 in which first and second ultra high definition (UHD) displays 102 , 104 are mounted on a wall, e.g., a wall of a home or a luxury stadium box.
- the UHD displays 102 , 104 may be 4K displays.
- One or more control devices control presentation of the displays by sending commands wirelessly and/or over wired paths to one or more controllers.
- a controller 106 controls the displays 102 , 104 , it being understood that a separate controller may be provided for each display.
- content control on the first display 102 is established by a first control device 108 while content control on the second display 104 is established by a second control device 110 , it being understood that a single control device may be used to establish control on both displays.
- the control devices 108 , 110 may be, without limitation, portable computers such as tablet computers or laptop computers (also including notebook computers) or other devices with one or more of the CE device 44 components shown in FIG. 1 .
- the displays 102 , 104 may be monitors only and/or may include one or more of the primary display 14 components shown in FIG. 1 .
- the controller 106 may be a personal computer (PC) or game console or server that contains one or more of the components variously shown in FIG. 1 .
- the control devices 108 , 110 communicate directly with the controller 106 using, e.g., WiFi or Bluetooth; the control devices 108 , 110 do not communicate directly with the displays 102 , 104 .
- the controller 106 communicates with the displays 102 , 104 to establish presentation thereon in accordance with commands received from the control devices. It is to be understood that while the controller 106 is shown physically separate from the displays in FIG. 2 , it may be incorporated within the chassis of a display. As also shown, the displays may present plural contents in respective content windows 112 .
- the controller 106 may receive video from plural video cameras 114 .
- a first camera 114 may image a first half of a sports field, racetrack, or other action venue whereas a second camera 114 may image the other half of the action venue, with the feeds from the two cameras being combined before being sent to the controller 106 or combined by the controller 106 and “stitched” to present a single video view of both halves of the action venue on one or both of the displays 102 , 104 . That is, the combined feed from both cameras may be presented on a single display in an 8K mode, or the combined feed may be spread across the juxtaposed displays such that one display shows one half of the action venue and the other display shows the other half.
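The stitched-feed arrangement above can be sketched as a small layout helper: given two side-by-side UHD camera feeds combined into one frame, it returns the source region each display should render, either the whole combined frame on one display (the "8K mode") or one half per display. The frame dimensions, mode names, and function name are assumptions for illustration, not from the patent.

```javascript
// Sketch (illustrative only): crop regions for a stitched two-camera feed.
const UHD_W = 3840, UHD_H = 2160;

function stitchedLayout(mode) {
  const combined = { w: UHD_W * 2, h: UHD_H }; // two 4K feeds side by side
  if (mode === "single") {
    // Whole stitched frame on one display ("8K mode").
    return [{ x: 0, y: 0, w: combined.w, h: combined.h }];
  }
  if (mode === "spread") {
    // Left half on one display, right half on the juxtaposed display.
    return [
      { x: 0, y: 0, w: UHD_W, h: UHD_H },
      { x: UHD_W, y: 0, w: UHD_W, h: UHD_H },
    ];
  }
  throw new Error("unknown mode: " + mode);
}
```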
- the feeds sent to the controller preferably are HD or more preferably UHD.
- the cameras 114 can present the same video feeds albeit at a lower resolution to the control devices 108 , 110 .
- the UHD feeds may be sent to the controller 106 over a network from a network address while the lower resolution feeds of the same content may be simultaneously sent to the control devices 108 , 110 over the network from the same or a different network address, such that the video content on the control devices is the same as the video content presentable on the displays, albeit typically of a lower resolution.
- a dedicated local server or PS4 may not be required in some embodiments to manage the 4K and thumbnail feeds as well as analyze the commands coming from the tablet. Instead, this can happen in the cloud with the 4K TV and tablet having their own MAC address and the cloud server acting as though it were local to permit control of 4K monitors in remote locations as well.
- a location sensing system such as any of those described above may be used to determine where the control device is relative to multiple 4K display locations to allow the user to roam and have the 4K content follow him. This provides for multiple 4K clusters in a stadium suite, each showing the same or different content. In this case what is showing on a particular 4K TV cluster can drive the UI on the tablet, or the other way around.
- a control device such as the control device 108 shown in FIG. 2 presents on its display a user interface (UI) 116 presenting a video image of content and a border 118 superimposed on a portion 120 of the video image which is smaller than the video image as shown.
- in FIG. 3 , assume the video image is of a football game, with “Os” representing offensive players and “Xs” representing defensive players.
- when a user has instantiated the border 118 (by, e.g., selecting a “pan and zoom” selector 122 ), the control device in response sends a command to the controller to cause a large display such as the display 102 in FIG. 2 to present on the display 102 only the (higher definition) portion of the content enclosed in the border 118 on the control device 108 .
- the user has positioned the border 118 on the control device 108 over two offensive players and two defensive players with subscripts “1” to distinguish them from the other player symbols in the figure.
- the control device has commanded the controller to present on the display device 102 only the content enclosed by the border on the control device, in the example shown, only the two offensive players and two defensive players with subscripts “1”. It will readily be appreciated that the controller further has zoomed the video presentation of the demanded portion to substantially fill the entire screen of the display device 102 .
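The border-to-display mapping implied above can be sketched as a coordinate scaling: the border drawn over the low-resolution video on the control device is scaled up to the corresponding crop of the higher-definition source, which the controller then zooms to fill the display. The function name and the pixel dimensions used below are assumptions for illustration.

```javascript
// Sketch: scale a border rectangle from control-device video coordinates
// to the equivalent crop of the high-resolution source video.
function borderToSourceCrop(border, controlVideo, sourceVideo) {
  const sx = sourceVideo.w / controlVideo.w;
  const sy = sourceVideo.h / controlVideo.h;
  return {
    x: Math.round(border.x * sx),
    y: Math.round(border.y * sy),
    w: Math.round(border.w * sx),
    h: Math.round(border.h * sy),
  };
}
```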
- the screen of the control device 108 is a touch screen display, and a user may touch the border 118 and/or portion enclosed thereby and drag (as indicated by the arrow 126 ) the border to a new portion of the video as indicated by the dotted line box 128 , releasing the user touch once the border has been dragged to the desired part of the video shown on the control device.
- in the new portion 128 , two defensive players “X” are shown, denoted by subscripts “2” to distinguish them.
- this drag and drop causes the controller to pan the zoomed video from the first portion to the second portion in the direction of the drag until the second portion of the (higher definition) video substantially fills the screen of the display device 102 as shown at 132 in the figure.
- the content related to the video image on the display device 102 is entirely established, in temporal sequence, by a zoomed presentation of the first portion, then a moving pan across at least part of the video image on the display device in concert with the user input to move the border to the second portion of the video image on the control device, to end at a zoomed presentation on the display device of the second portion.
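The temporal sequence above (zoom on the first portion, a moving pan in concert with the drag, ending zoomed on the second portion) can be sketched as a linear interpolation of the crop rectangle, which the controller would evaluate per frame as t advances from 0 to 1. This is an illustrative sketch, not the patent's implementation.

```javascript
// Sketch: interpolate the zoomed crop rectangle from the first portion
// (t = 0) to the second portion (t = 1) to produce the pan effect.
function panCrop(from, to, t) {
  const lerp = (a, b) => a + (b - a) * t;
  return {
    x: lerp(from.x, to.x),
    y: lerp(from.y, to.y),
    w: lerp(from.w, to.w),
    h: lerp(from.h, to.h),
  };
}
```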
- the control device presents both the entire video image of the content and the border 118 superimposed on the portion 120 of the video image as the user input causes the border to move across the video image of the content, whereas the display device 102 is caused to present only content from the video image corresponding to the content 120 within the border 118 on the control device.
- HTML5 may be used along with JavaScript (including some JavaScript libraries), and CSS in one implementation.
- Video files may be stored locally on the control device and played in the browser using the video tag of HTML5.
- Live streaming files from a local streaming server, streaming files from internet and live tuner signal can also be used as the source.
- when a user drags and drops a tile, based on the id of the tile, the path of the video in the quad portion (on which the tile is dropped) of the display presenting video in four quadrants is changed to the correct video, and the new video is played.
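The drag-and-drop behavior above can be sketched as a pure state update: the tile id selects a video path, which replaces the source of the quadrant the tile was dropped on. The id-to-path map and all names below are assumptions for illustration.

```javascript
// Sketch: swap a quadrant's video path when a tile is dropped on it.
function dropTileOnQuad(quadSources, quadIndex, tileId, pathsById) {
  const path = pathsById[tileId];
  if (path === undefined) throw new Error("unknown tile: " + tileId);
  const next = quadSources.slice(); // do not mutate the caller's state
  next[quadIndex] = path;
  return next; // caller sets the quadrant <video>.src to next[quadIndex]
}
```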
- a full screen API may not be used since it requires user interaction to allow full screen on the control device.
- all videos can be paused, then the video selected can be scaled by the browser to 4K resolution. If a 4K file is present, the 4K file is used and no browser scaling is needed.
- Websocket may be used to communicate through IP from the control device to the controller to control the display device. Messages may be broadcast to all the display devices, then each display device browser can use the message it needs. Drag and drop can be done using the jQuery UI library, and scrolling can be done using CSS position updating.
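The broadcast-and-filter pattern just described can be sketched as a pure routing step run in each display device's browser; the actual transport would be the WebSocket connection, and the message shape with target and action fields is an assumption for illustration.

```javascript
// Sketch: each display keeps only the broadcast messages it needs,
// i.e. those addressed to it or to all displays.
function messagesFor(displayId, broadcast) {
  return broadcast.filter(
    (msg) => msg.target === displayId || msg.target === "all"
  );
}
```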
- the stitch image zoom effect can be done by drawing video on the HTML5 canvas, sending coordinates from control device to the controller so the controller knows which portion of the video to zoom on in the display device.
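The canvas-based zoom above amounts to computing the source and destination rectangles for the nine-argument form of drawImage. Sending normalized (0..1) coordinates from the control device, as sketched below, keeps the controller resolution-agnostic; the normalization scheme is an assumption, since the patent only says coordinates are sent.

```javascript
// Sketch: build drawImage() arguments that zoom a normalized source
// region of the video onto the full canvas.
function zoomDrawArgs(norm, video, canvas) {
  return [
    norm.x * video.w, // sx: source x in video pixels
    norm.y * video.h, // sy
    norm.w * video.w, // sWidth
    norm.h * video.h, // sHeight
    0, 0,             // dx, dy: fill the canvas from its origin
    canvas.w, canvas.h, // dWidth, dHeight
  ];
}
// Usage in the display browser (assumed):
//   ctx.drawImage(videoEl, ...zoomDrawArgs(norm, videoSize, canvasSize));
```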
- a phone application may also be implemented in HTML5, allowing audio files from the server to be played on a speaker, e.g., of the display device or other device, through IP.
- the phone application audio matches the audio for the four videos played in the quad view, and each audio file can be selected for playback.
- to present video from an external device connected to a different HDMI input of the display device, such as a video disk player, a satellite feed, etc., the control device may send IP commands to the display device (via the controller) to change input. If a tile corresponding to a video is dragged and dropped, another IP command can be sent to the display device (via the controller) to change input back to the PC and/or controller, and the video file selected is played from the PC and/or controller.
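The input-switching flow above can be sketched as a small command builder; the command vocabulary and the "pc" input name below are assumptions for illustration, since the patent specifies only that IP commands are sent via the controller.

```javascript
// Sketch: translate UI events into the IP commands sent to the display.
function inputCommands(event) {
  if (event.kind === "select-external") {
    // Switch the display to an external HDMI source.
    return [{ cmd: "set-input", input: event.hdmiPort }];
  }
  if (event.kind === "drop-video-tile") {
    // Switch back to the PC/controller input, then play the chosen file.
    return [
      { cmd: "set-input", input: "pc" },
      { cmd: "play", path: event.path },
    ];
  }
  return [];
}
```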
- FIGS. 4-7 illustrate additional principles that may be used according to present principles.
- one of the display devices 102 from FIG. 2 and one of the control devices 108 are shown schematically, and on the right side of FIG. 4 the same devices are shown schematically after an operation.
- a multi-view channel has been launched on the display device 102 , resulting in four video windows 400 (“quad view”) being presented on the display device 102 .
- Each window 400 presents a video stream from a respective audio video (AV) program P 1 -P 4 from an Internet or broadcast source (cable, satellite, etc.)
- the audio from only one of the AV programs is played by the display device.
- the windows 400 may be substantially identical in size if desired, as shown.
- a UI 402 is presented on the display of the control device 108 .
- the UI 402 includes plural main selectors 404 arranged in a layout, preferably the same layout as the windows 400 on the display device 102 as shown.
- Each main selector 404 is established by a respective video feed, in the example shown, the same content albeit perhaps in lower resolution as the four videos in the quad view of the display device 102 , as duly indicated by use of the same video program designators P 1 -P 4 .
- the UI 402 may further include a row 406 of additional content selectors 408 apart from the programs P 1 -P 4 shown in the main selectors 404 , although in the embodiment shown, for ease of disclosure, the same four programs P 1 -P 4 establish the first four content selectors 408 in the row 406 , while the last two content selectors indicate they may be selected to present content from two additional programs P 5 and P 6 .
- the content selectors 408 in the row 406 may be established by still image thumbnails.
- a column 410 of audio selectors 412 may be presented on the UI 402 .
- Each audio selector 412 in the column 410 may correspond to a respective content in the content selectors 408 in the row 406 .
- Each audio selector 412 may include a respective audio on/off symbol 414 , with all of the symbols 414 except one having a line through them indicating that the audio represented by those selectors is not being played on the display device 102 .
- the symbol 414 of the top audio selector 412 does not have a line through it, indicating that the audio from the program associated with the top selector, in the example shown, program P 1 , is being played on the display device 102 .
- Touching an audio selector 412 on the control device 108 causes the control device to command the controller to switch audio play on the display device 102 to the audio represented by the touched audio selector on the control device 108 . This also causes the line through the respective symbol 414 of the touched selector to be removed, and a line placed onto the symbol 414 of the selector 412 representing the replaced audio.
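The exclusive audio behavior just described, where the line through the touched symbol is removed and a line is placed on the previously active selector, amounts to a one-hot mute state. Representing the "line through" state as a boolean muted array is an assumption for illustration.

```javascript
// Sketch: touching audio selector touchedIndex unmutes it and mutes all
// others, so exactly one program's audio plays on the display device.
function touchAudioSelector(muted, touchedIndex) {
  return muted.map((_, i) => i !== touchedIndex);
}
```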
- FIG. 4 illustrates that dragging and dropping one of the content selectors 408 onto a main selector 404 changes the video in that main selector 404 to the video represented by the dragged and dropped content selector 408 .
- not only does the UI 402 on the control device 108 thus change, but also, as indicated in the top right portion of FIG. 4 , the video presented in the window 400 (in this case, the top left window) of the display device 102 is caused to change to the video represented by the dragged and dropped content selector 408 .
- this may be accomplished by the control device 108 , responsive to the drag and drop, obtaining the network address or channel number of the video represented by the dragged and dropped content selector 408 and commanding the controller to present video from that network address in the window 400 corresponding to the main selector 404 onto which the content selector 408 was dragged and dropped.
- the main selectors 404 on the control device 108 mirror the windows 400 on the display device 102 .
- a user may move his hand left or right on the row 406 of content selectors to cause selectors for additional content to scroll onto the display of the control device.
- the new content typically includes additional program channels or Internet content related to the theme of the programming presented on the display device 102 , including, for example, sports statistics related to a sporting event in one of the windows 400 .
- FIG. 5 illustrates that “throwing” one of the main selectors 404 on the control device 108 “to” the display device 102 causes the display device to switch to a full screen presentation 500 (shown on the right of FIG. 5 ) of the content represented by the “thrown” main selector 404 .
- the processor of the control device 108 may infer that a main selector 404 has been “thrown” by a user dragging the main selector upwards toward the top of the control device, responsive to which the control device 108 sends a command to the controller to present the associated content full screen on the display device 102 .
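The "throw" inference described above can be sketched as a simple gesture test on the drag's start and end points; the 150-pixel threshold is an assumption for illustration.

```javascript
// Sketch: infer a "throw" from a drag that travels far enough toward the
// top of the control device screen.
function isThrow(start, end, threshold = 150) {
  // In screen coordinates y decreases toward the top, so an upward drag
  // means end.y < start.y.
  return start.y - end.y >= threshold;
}
```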
- commanding the display device 102 into the full screen mode as described above may result in the main selectors 404 on the control device 108 merging into a single large selector with the same content shown full screen on the display device.
- the row 406 of content selectors 408 and column 410 of audio selectors 412 can remain unchanged.
- This single large main selector in the area formerly occupied by the four main selectors may be touched to cause the display device (pursuant to a command from the control device) to revert to the quad view shown on the left side of FIG. 5 and to also cause the main selectors 404 on the control device 108 to mirror the display device views, in this case, to resume the four main selector quad view shown on the bottom left of FIG. 4 .
- throwing a main selector 404 to cause the display device to enter full screen mode as described may not alter the appearance of the main selectors 404 , which can remain in the quad view shown on the control device. Subsequently touching any one of the main selectors 404 on the control device may result in the control device commanding the controller to resume the quad view presentation on the display device. Or, if desired, as shown on the bottom right of FIG. 5 , when the control device has been configured to command the display device to enter full screen mode, with the four main selectors 404 remaining on the control device UI, touching one of the main selectors may cause the corresponding content to be presented full screen on the control device.
- the content presented full screen on the display device may be the same or different than the content presented full screen on the control device, depending on what main selector 404 was thrown to the display device and what main selector 404 subsequently was touched by a user.
- a subsequent touch anywhere on the control device screen may cause the control device and display device to resume the layouts shown on the left side of FIG. 5 .
- to obtain more detail about a particular content, a user may simply touch the content selector 408 . This may cause a detail screen 600 to appear on the control device 108 (but not on the display device 102 ), so that a person controlling presentation on the display device by means of the control device 108 can observe the detailed information about content prior to presenting the content (or information about the content) on the display device 102 .
- FIG. 7 is a series of screen shots on the control device 108 to illustrate further optional details.
- a remote control selector 702 may be presented which if touched as indicated at 704 can cause a remote-control like presentation 706 to be presented on the control device 108 .
- the presentation 706 emulates a standard remote control, with alpha-numeric touch-enabled input selectors, channel and volume up/down selectors, and cursor arrow selectors as shown. Touching the selector 702 again causes the initial presentation (upper right panel in FIG. 7 ) to resume on the control device.
- main selectors 710 corresponding to the quad view of a first one of the display devices 102 may be presented and used to control that display device according to principles above. Also, a smaller quad view of alternate main selectors 712 may be presented, representing content being presented on the second display device 102 .
- a user need only touch the smaller quad view of alternate main selectors 712 as shown at 714 , causing the smaller quad view of alternate main selectors 712 to animate to an enlarged configuration 716 and the quad view of main selectors corresponding to the first one of the display devices 102 to animate to become smaller in size as shown at 718 .
- the enlarged configuration of main selectors on the control device 108 appertaining to the second display device 102 may then be used to control the presentation on the second display device according to principles above.
- the bottom two screen shots in FIG. 7 show that tapping 720 on a content selector can cause a detail presentation 722 to be shown on the control device, showing the details of the content represented by the tapped content selector. Also, as shown at 724 , instead of a column of audio selectors as described previously, respective audio on/off symbols analogous to the symbols 414 in FIG. 4 may be presented on each main selector 726 and, if touched, cause the control device to command the controller to play the audio associated with the touched symbol in place of the audio currently being played on the display device being controlled.
Abstract
A multi-window user interface (UI) on a control device such as a tablet computer communicates commands to a display controller, which may be implemented by a game console. The controller controls presentation on a large screen display according to the commands. The control device may always present a multi-window UI which the control device can command to have mimicked on the display, with the UI also being configured to cause a single video to be presented whole screen on the display while the multi-window UI remains presented on the control device. The video feeds that populate the multi-window UI are of the same content but of lower resolution than those which are presented on the display. A movable window on the control device can be dragged and dropped to cause a corresponding magnifying focus on the display over the corresponding portion of video in focus on the control device.
Description
- The application relates generally to controlling a large screen display using a wireless portable computer such as a tablet or laptop computer interfacing with a display controller such as a game console.
- A computer ecosystem, or digital ecosystem, is an adaptive and distributed socio-technical system that is characterized by its sustainability, self-organization, and scalability. Inspired by environmental ecosystems, which consist of biotic and abiotic components that interact through nutrient cycles and energy flows, complete computer ecosystems consist of hardware, software, and services that in some cases may be provided by one company, such as Sony. The goal of each computer ecosystem is to provide consumers with everything that may be desired, at least in part services and/or software that may be exchanged via the Internet. Moreover, interconnectedness and sharing among elements of an ecosystem, such as applications within a computing cloud, provides consumers with increased capability to organize and access data and presents itself as the future characteristic of efficient integrative ecosystems.
- Two general types of computer ecosystems exist: vertical and horizontal computer ecosystems. In the vertical approach, virtually all aspects of the ecosystem are owned and controlled by one company, and are specifically designed to seamlessly interact with one another. Horizontal ecosystems, on the other hand, integrate aspects such as hardware and software that are created by other entities into one unified ecosystem. The horizontal approach allows for a greater variety of input from consumers and manufacturers, increasing the capacity for novel innovations and adaptations to changing demands.
- An example ecosystem that is pertinent here is an entertainment ecosystem in the home or in a luxury suite at a stadium that includes a large screen high definition display controlled by a controller such as a personal computer (PC) or game console which receives commands from a portable control device such as a tablet computer.
- Accordingly, a control device includes at least one computer readable storage medium bearing instructions executable by a processor, and at least one processor configured for accessing the computer readable storage medium to execute the instructions to configure the processor for presenting on a display of the control device a user interface (UI) having plural selectors arranged in a layout. Each selector is established by a respective video feed. The instructions when executed by the processor configure the processor for receiving a first user input corresponding to a first command, and wirelessly transmitting the first command to a controller of the display device to command the controller to present plural video feeds on the display device. The video feeds on the control device are of the same content as respective video feeds on the display device. The instructions when executed by the processor configure the processor for receiving a second user input corresponding to a second command, and for wirelessly transmitting the second command to the controller to command the controller to present a single video feed full screen on the display device with the layout of the selectors on the control device remaining unchanged.
- In some embodiments, the first user input is established by a user touch on any portion of a display of the control device, and the second user input can be established only by a user touch on a selector corresponding to the single video feed.
- If desired, the instructions when executed by the processor may configure the processor for transmitting the first command to the controller of the display device to command the controller to present the plural video feeds on the display device in a same layout as the video feeds establishing the selectors are arranged on the display of the control device. In some examples, the instructions when executed by the processor configure the processor for receiving a third user input corresponding to a third command. The third user input can be a drag of an image associated with a content selector and a drop of the image onto a portion of the control device for commanding the controller to cause the display device to present a video corresponding to the content selector in a portion of the display device that corresponds to the portion of the control device onto which the image was dropped.
- The control device can be established by a portable computer.
- The instructions when executed by the processor may if desired configure the processor for providing a visible indicator on the control device that the display device has been commanded to present the single video feed full screen on the display device.
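The two commands described above lend themselves to a small message protocol between the control device and the controller. The following is a purely illustrative sketch; the message names, JSON transport, and helper functions are assumptions, not taken from the disclosure:

```javascript
// Hypothetical command messages a control device might send to the
// display controller. Field and type names are illustrative only.
function makeShowAllCommand() {
  // First command: mirror the multi-window layout on the display device.
  return { type: "SHOW_ALL_FEEDS" };
}

function makeFullScreenCommand(feedId) {
  // Second command: present one feed full screen on the display device;
  // the control device's own multi-window layout stays unchanged.
  return { type: "FULL_SCREEN", feedId: feedId };
}

function serializeCommand(cmd) {
  // Commands travel to the controller as JSON over the wireless link.
  return JSON.stringify(cmd);
}
```

In this sketch the control device never addresses the display directly; it only serializes a command and hands it to the controller, matching the indirection the disclosure describes.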
- In another aspect, a system includes at least one display device configured for presenting plural video contents in respective windows of the display device, at least one controller configured for controlling the display device, and at least one control device configured for communicating commands to the controller to control presentation on the display device. The control device is configured to present a user interface (UI) having plural selectors arranged in a layout on a screen of the control device. Each selector is established by a respective video feed having a first resolution. The control device is configured for causing the display device to present plural video feeds on the display device, with the video feeds on the control device being of the same content as respective video feeds on the display device. The respective video feeds on the display device have a second resolution higher than the first resolution. Furthermore, the control device is configured for causing a single video feed to be presented full screen on the display device with the layout of the selectors on the control device remaining unchanged.
- In another aspect, a method includes sending plural video feeds having a first resolution to a control device over a computer network. The method also includes sending plural video feeds having a second resolution to a display device over the computer network, with the second resolution being higher than the first. The video feeds of the first resolution are of the same content as the video feeds of the second resolution. Video presentation on the display device is controlled at least in part based on user touches of the video feeds presented on the control device.
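The dual-resolution delivery in this method can be sketched as simple per-client routing. The tier dimensions and URL scheme below are assumptions for illustration; the method only requires that the display-device feeds have the higher resolution:

```javascript
// Sketch of serving the same content at two resolutions. The tier
// names and sizes are assumed, not specified by the disclosure.
const FEED_TIERS = {
  display: { width: 3840, height: 2160 }, // UHD feed for the display device
  control: { width: 640, height: 360 },   // low-resolution feed for the control device
};

function feedUrlFor(clientKind, feedId) {
  // Same content identifier, different resolution per client kind,
  // so the control device's thumbnails track the display's feeds.
  const tier = FEED_TIERS[clientKind];
  return `/feeds/${feedId}/${tier.width}x${tier.height}.m3u8`;
}
```

Because both URLs share the `feedId`, a touch on a low-resolution selector can be translated directly into a command referencing the corresponding high-resolution feed.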
- The details of the present invention, both as to its structure and operation, can be best understood in reference to the accompanying drawings, in which like reference numerals refer to like parts, and in which:
-
FIG. 1 is a block diagram of an example system including an example in accordance with present principles; -
FIG. 2 is a partially schematic view of a specific example system with two UHD displays mounted on a wall side by side; -
FIG. 3 is a schematic diagram illustrating control of video presentation on a display using a movable window on a control device; and -
FIGS. 4-7 are each a series of screen shots illustrating various aspects of example embodiments. - This disclosure relates generally to computer ecosystems including aspects of consumer electronics (CE) device based user information in computer ecosystems. A system herein may include server and client components, connected over a network such that data may be exchanged between the client and server components. The client components may include one or more computing devices including portable televisions (e.g. smart TVs, Internet-enabled TVs), portable computers such as laptops and tablet computers, and other mobile devices including smart phones and additional examples discussed below. These client devices may operate with a variety of operating environments. For example, some of the client computers may employ, as examples, operating systems from Microsoft, or a Unix operating system, or operating systems produced by Apple Computer or Google. These operating environments may be used to execute one or more browsing programs, such as a browser made by Microsoft or Google or Mozilla or other browser program that can access web applications hosted by the Internet servers discussed below.
- Servers may include one or more processors executing instructions that configure the servers to receive and transmit data over a network such as the Internet. Or, a client and server can be connected over a local intranet or a virtual private network. A server or controller may be instantiated by a game console such as a Sony Playstation (trademarked), a personal computer, etc.
- Information may be exchanged over a network between the clients and servers. To this end and for security, servers and/or clients can include firewalls, load balancers, temporary storages, and proxies, and other network infrastructure for reliability and security. One or more servers may form an apparatus that implement methods of providing a secure community such as an online social website to network members.
- As used herein, instructions refer to computer-implemented steps for processing information in the system. Instructions can be implemented in software, firmware or hardware and include any type of programmed step undertaken by components of the system.
- A processor may be any conventional general purpose single- or multi-chip processor that can execute logic by means of various lines such as address lines, data lines, and control lines and registers and shift registers.
- Software modules described by way of the flow charts and user interfaces herein can include various sub-routines, procedures, etc. Without limiting the disclosure, logic stated to be executed by a particular module can be redistributed to other software modules and/or combined together in a single module and/or made available in a shareable library.
- Present principles described herein can be implemented as hardware, software, firmware, or combinations thereof; hence, illustrative components, blocks, modules, circuits, and steps are set forth in terms of their functionality.
- Further to what has been alluded to above, logical blocks, modules, and circuits described below can be implemented or performed with a general purpose processor, a digital signal processor (DSP), a field programmable gate array (FPGA) or other programmable logic device such as an application specific integrated circuit (ASIC), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor can be implemented by a controller or state machine or a combination of computing devices.
- The functions and methods described below, when implemented in software, can be written in an appropriate language such as but not limited to C# or C++, and can be stored on or transmitted through a computer-readable storage medium such as a random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), compact disk read-only memory (CD-ROM) or other optical disk storage such as digital versatile disc (DVD), magnetic disk storage or other magnetic storage devices including removable thumb drives, etc. A connection may establish a computer-readable medium. Such connections can include, as examples, hard-wired cables including fiber optics and coaxial wires and digital subscriber line (DSL) and twisted pair wires. Such connections may include wireless communication connections including infrared and radio.
- Components included in one embodiment can be used in other embodiments in any appropriate combination. For example, any of the various components described herein and/or depicted in the Figures may be combined, interchanged or excluded from other embodiments.
- “A system having at least one of A, B, and C” (likewise “a system having at least one of A, B, or C” and “a system having at least one of A, B, C”) includes systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.
- Now specifically referring to
FIG. 1, an example ecosystem 10 is shown, which may include one or more of the example devices mentioned above and described further below in accordance with present principles. The first of the example devices included in the system 10 is an example primary display device, and in the embodiment shown is an audio video display device (AVDD) 12 such as but not limited to an Internet-enabled TV. Thus, the AVDD 12 alternatively may be an appliance or household item, e.g. a computerized Internet-enabled refrigerator, washer, or dryer. The AVDD 12 alternatively may also be a computerized Internet-enabled ("smart") telephone, a tablet computer, a notebook computer, a wearable computerized device such as e.g. a computerized Internet-enabled watch, a computerized Internet-enabled bracelet, other computerized Internet-enabled devices, a computerized Internet-enabled music player, computerized Internet-enabled head phones, a computerized Internet-enabled implantable device such as an implantable skin device, etc. Regardless, it is to be understood that the AVDD 12 is configured to undertake present principles (e.g. communicate with other CE devices to undertake present principles, execute the logic described herein, and perform any other functions and/or operations described herein). - Accordingly, to undertake such principles the
AVDD 12 can be established by some or all of the components shown in FIG. 1. For example, the AVDD 12 can include one or more displays 14 that may be implemented by a high definition or ultra-high definition "4K" flat screen and that may be touch-enabled for receiving user input signals via touches on the display. The AVDD 12 may include one or more speakers 16 for outputting audio in accordance with present principles, and at least one additional input device 18 such as e.g. an audio receiver/microphone for e.g. entering audible commands to the AVDD 12 to control the AVDD 12. The example AVDD 12 may also include one or more network interfaces 20 for communication over at least one network 22 such as the Internet, a WAN, a LAN, etc. under control of one or more processors 24. Thus, the interface 20 may be, without limitation, a Wi-Fi transceiver, which is an example of a wireless computer network interface. It is to be understood that the processor 24 controls the AVDD 12 to undertake present principles, including the other elements of the AVDD 12 described herein such as e.g. controlling the display 14 to present images thereon and receiving input therefrom. Furthermore, note the network interface 20 may be, e.g., a wired or wireless modem or router, or other appropriate interface such as, e.g., a wireless telephony transceiver, or Wi-Fi transceiver as mentioned above, etc. - In addition to the foregoing, the
AVDD 12 may also include one or more input ports 26 such as, e.g., a USB port to physically connect (e.g. using a wired connection) to another CE device and/or a headphone port to connect headphones to the AVDD 12 for presentation of audio from the AVDD 12 to a user through the headphones. The AVDD 12 may further include one or more tangible computer readable storage media 28 such as disk-based or solid state storage. Also in some embodiments, the AVDD 12 can include a position or location receiver such as but not limited to a cellphone receiver, GPS receiver and/or altimeter 30 that is configured to e.g. receive geographic position information from at least one satellite or cellphone tower and provide the information to the processor 24 and/or determine an altitude at which the AVDD 12 is disposed in conjunction with the processor 24. However, it is to be understood that another suitable position receiver other than a cellphone receiver, GPS receiver and/or altimeter may be used in accordance with present principles to e.g. determine the location of the AVDD 12 in e.g. all three dimensions. - Continuing the description of the
AVDD 12, in some embodiments the AVDD 12 may include one or more cameras 32 that may be, e.g., a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into the AVDD 12 and controllable by the processor 24 to gather pictures/images and/or video in accordance with present principles. Also included on the AVDD 12 may be a Bluetooth transceiver 34 and other Near Field Communication (NFC) element 36 for communication with other devices using Bluetooth and/or NFC technology, respectively. An example NFC element can be a radio frequency identification (RFID) element. - Further still, the
AVDD 12 may include one or more auxiliary sensors 37 (e.g., a motion sensor such as an accelerometer, gyroscope, cyclometer, or a magnetic sensor, an infrared (IR) sensor, an optical sensor, a speed and/or cadence sensor, a gesture sensor (e.g. for sensing gesture commands), etc.) providing input to the processor 24. The AVDD 12 may include still other sensors such as e.g. one or more climate sensors 38 (e.g. barometers, humidity sensors, wind sensors, light sensors, temperature sensors, etc.) and/or one or more biometric sensors 40 providing input to the processor 24. In addition to the foregoing, it is noted that the AVDD 12 may also include an infrared (IR) transmitter and/or IR receiver and/or IR transceiver 42 such as an IR data association (IRDA) device. A battery (not shown) may be provided for powering the AVDD 12. - Still referring to
FIG. 1, in addition to the AVDD 12, the system 10 may include one or more other CE device types. In one example, a first CE device 44 may be used to control the display via commands sent through the below-described server while a second CE device 46 may include similar components as the first CE device 44 and hence will not be discussed in detail. In the example shown, only two CE devices 44, 46 are shown, it being understood that fewer or greater devices may be used. - In the example shown, to illustrate present principles all three
devices 12, 44, 46 are assumed to be members of an entertainment network in, e.g., a luxury suite of the stadium, or in a home, or at least to be present in proximity to each other in a location such as a house. However, for illustrating present principles the first CE device 44 is assumed to be in the same room as the AVDD 12, bounded by walls illustrated by dashed lines 48. - The example non-limiting
first CE device 44 may be established by any one of the above-mentioned devices, for example, a portable wireless laptop computer or notebook computer, and accordingly may have one or more of the components described below. The second CE device 46 without limitation may be established by a wireless telephone. - The
first CE device 44 may include one or more displays 50 that may be touch-enabled for receiving user input signals via touches on the display. The first CE device 44 may include one or more speakers 52 for outputting audio in accordance with present principles, and at least one additional input device 54 such as e.g. an audio receiver/microphone for e.g. entering audible commands to the first CE device 44 to control the device 44. The example first CE device 44 may also include one or more network interfaces 56 for communication over the network 22 under control of one or more CE device processors 58. Thus, the interface 56 may be, without limitation, a Wi-Fi transceiver, which is an example of a wireless computer network interface. It is to be understood that the processor 58 controls the first CE device 44 to undertake present principles, including the other elements of the first CE device 44 described herein such as e.g. controlling the display 50 to present images thereon and receiving input therefrom. Furthermore, note the network interface 56 may be, e.g., a wired or wireless modem or router, or other appropriate interface such as, e.g., a wireless telephony transceiver, or Wi-Fi transceiver as mentioned above, etc. - In addition to the foregoing, the
first CE device 44 may also include one or more input ports 60 such as, e.g., a USB port to physically connect (e.g. using a wired connection) to another CE device and/or a headphone port to connect headphones to the first CE device 44 for presentation of audio from the first CE device 44 to a user through the headphones. The first CE device 44 may further include one or more tangible computer readable storage media 62 such as disk-based or solid state storage. Also in some embodiments, the first CE device 44 can include a position or location receiver such as but not limited to a cellphone and/or GPS receiver and/or altimeter 64 that is configured to e.g. receive geographic position information from at least one satellite and/or cell tower, using triangulation, and provide the information to the CE device processor 58 and/or determine an altitude at which the first CE device 44 is disposed in conjunction with the CE device processor 58. However, it is to be understood that another suitable position receiver other than a cellphone and/or GPS receiver and/or altimeter may be used in accordance with present principles to e.g. determine the location of the first CE device 44 in e.g. all three dimensions. - Continuing the description of the
first CE device 44, in some embodiments the first CE device 44 may include one or more cameras 66 that may be, e.g., a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into the first CE device 44 and controllable by the CE device processor 58 to gather pictures/images and/or video in accordance with present principles. Also included on the first CE device 44 may be a Bluetooth transceiver 68 and other Near Field Communication (NFC) element 70 for communication with other devices using Bluetooth and/or NFC technology, respectively. An example NFC element can be a radio frequency identification (RFID) element. - Further still, the
first CE device 44 may include one or more auxiliary sensors 72 (e.g., a motion sensor such as an accelerometer, gyroscope, cyclometer, or a magnetic sensor, an infrared (IR) sensor, an optical sensor, a speed and/or cadence sensor, a gesture sensor (e.g. for sensing gesture commands), etc.) providing input to the CE device processor 58. The first CE device 44 may include still other sensors such as e.g. one or more climate sensors 74 (e.g. barometers, humidity sensors, wind sensors, light sensors, temperature sensors, etc.) and/or one or more biometric sensors 76 providing input to the CE device processor 58. In addition to the foregoing, it is noted that in some embodiments the first CE device 44 may also include an infrared (IR) transmitter and/or IR receiver and/or IR transceiver 78 such as an IR data association (IRDA) device. A battery (not shown) may be provided for powering the first CE device 44. - The
second CE device 46 may include some or all of the components shown for the CE device 44. - Now in reference to the afore-mentioned at least one
server 80, it includes at least one server processor 82, at least one tangible computer readable storage medium 84 such as disk-based or solid state storage, and at least one network interface 86 that, under control of the server processor 82, allows for communication with the other devices of FIG. 1 over the network 22, and indeed may facilitate communication between servers and client devices in accordance with present principles. Note that the network interface 86 may be, e.g., a wired or wireless modem or router, Wi-Fi transceiver, or other appropriate interface such as, e.g., a wireless telephony transceiver. - Accordingly, in some embodiments the
server 80 may be an Internet server, and may include and perform "cloud" functions such that the devices of the system 10 may access a "cloud" environment via the server 80 in example embodiments. Or, the server 80 may be implemented by a game console or other computer in the same room as the other devices shown in FIG. 1 or nearby. -
FIG. 2 shows an example system 100 in which first and second ultra high definition (UHD) displays 102, 104 are mounted on a wall, e.g., a wall of a home or a luxury stadium box. The UHD displays 102, 104 may be 4K displays. One or more control devices control presentation of the displays by sending commands wirelessly and/or over wired paths to one or more controllers. In the non-limiting example shown, a controller 106 controls the displays 102, 104, it being understood that a separate controller may be provided for each display. In the non-limiting example shown, content control on the first display 102 is established by a first control device 108 while content control on the second display 104 is established by a second control device 110, it being understood that a single control device may be used to establish control on both displays. - The
control devices 108, 110 may be, without limitation, portable computers such as tablet computers or laptop computers (also including notebook computers) or other devices with one or more of the CE device 44 components shown in FIG. 1. The displays 102, 104 may be monitors only and/or may include one or more of the primary display 14 components shown in FIG. 1. The controller 106 may be a personal computer (PC) or game console or server that contains one or more of the components variously shown in FIG. 1. In the non-limiting example shown, the control devices 108, 110 communicate directly with the controller 106 using, e.g., WiFi or Bluetooth; the control devices 108, 110 do not communicate directly with the displays 102, 104. Instead, the controller 106 communicates with the displays 102, 104 to establish presentation thereon in accordance with commands received from the control devices. It is to be understood that while the controller 106 is shown physically separate from the displays in FIG. 2, it may be incorporated within the chassis of a display. As also shown, the displays may present plural contents in respective content windows 112. - The
controller 106 may receive video from plural video cameras 114. In the stadium context, a first camera 114 may image a first half of a sports field, racetrack, or other action venue whereas a second camera 114 may image the other half of the action venue, with the feeds from the two cameras being combined before being sent to the controller 106 or combined by the controller 106 and "stitched" to present a single video view of both halves of the action venue on one or both of the displays 102, 104. That is, the combined feed from both cameras may be presented on a single display in an 8K mode, or the combined feed may be spread across the juxtaposed displays such that one display shows one half of the action venue and the other display shows the other half. It will be appreciated that the feeds sent to the controller preferably are HD or more preferably UHD.
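The side-by-side stitching described above reduces to a simple layout computation: placing each camera's frame in one half of a combined canvas. A minimal sketch, where the frame sizes (two UHD halves yielding a 7680-pixel-wide combined frame) are assumptions for illustration:

```javascript
// Sketch of stitching two side-by-side camera feeds into one frame.
// Each camera is assumed to deliver a halfWidth x halfHeight image;
// camera 1 covers the left half of the venue, camera 2 the right.
function stitchLayout(halfWidth, halfHeight) {
  return [
    { camera: 1, x: 0,         y: 0, width: halfWidth, height: halfHeight },
    { camera: 2, x: halfWidth, y: 0, width: halfWidth, height: halfHeight },
  ];
}
```

The same layout also describes the spread-across-two-displays mode: each rectangle is then routed to its own display instead of being composited into one frame.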
108, 110. The UHD feeds may be sent to thecontrol devices controller 106 over a network from a network address while the lower resolution feeds of the same content may be simultaneously sent to the 108, 110 over the network from the same or a different network address, such that the video content on the control devices is the same as the video content presentable on the displays, albeit typically of a lower resolution.control devices - Note that a dedicated local server or PS4 may not be required in some embodiments to manage the 4K and thumbnail feeds as well as analyze the commands coming from the tablet. Instead, this can happen in the cloud with the 4K TV and tablet having their own MAC address and the cloud server acting as though it were local to permit control of 4K monitors in remote locations as well.
- A location sensing system such as any of those described above may be used to determine where the control device is relative multiple 4K display locations to allow the user to roam and have the 4K content follow him. This provides for multiple 4K clusters in a stadium suite, each showing the same or different content. In this case what is showing on a particular 4K TV cluster can drive the UI on the tablet, or the other way around.
- In
FIG. 3 , a control device such as thecontrol device 108 shown inFIG. 2 presents on its display a user interface (UI) 116 of a user interface (UI) presenting a video image of content and aborder 118 superimposed on aportion 120 of the video image which is smaller than the video image as shown. For illustration, assume the video image inFIG. 3 is of a football game, with “Os” representing offensive players and “Xs” representing defensive players. - When a user has instantiated the border 118 (by, e.g., selecting a “pan and zoom” selector 122), the control device in response sends a command to the controller to cause a large display such the
display 102 in FIG. 2 to present on the display 102 only the (higher definition) portion of the content enclosed in the border 118 on the control device 108. In the illustration shown, the user has positioned the border 118 on the control device 108 over two offensive players and two defensive players with subscripts "1" to distinguish them from the other player symbols in the figure. In response, the control device, as indicated by the arrow 124, has commanded the controller to present on the display device 102 only the content enclosed by the border on the control device, in the example shown, to present only the two offensive players and two defensive players with subscripts "1". It will readily be appreciated that the controller further has zoomed the video presentation of the demanded portion to substantially fill the entire screen of the display device 102. - In one example, the screen of the
control device 108 is a touch screen display, and a user may touch the border 118 and/or the portion enclosed thereby and drag (as indicated by the arrow 126) the border to a new portion of the video as indicated by the dotted line box 128, releasing the user touch once the border has been dragged to the desired part of the video shown on the control device. In the new portion 128, two defensive players "X" are shown, denoted by subscripts "2" to distinguish them. As indicated by the arrow 130, this drag and drop causes the controller to pan the zoomed video from the first portion to the second portion in the direction of the drag until the second portion of the (higher definition) video substantially fills the screen of the display device 102 as shown at 132 in the figure. - Thus, responsive to the drag and drop of the
border 118 on the control device 108, the content related to the video image on the display device 102 is entirely established, in temporal sequence, by a zoomed presentation of the first portion, then a moving pan across at least part of the video image on the display device in concert with the user input to move the border to the second portion of the video image on the control device, to end at a zoomed presentation on the display device of the second portion. During the drag and drop process, the control device presents both the entire video image of the content and the border 118 superimposed on the portion 120 of the video image as the user input causes the border to move across the video image of the content, whereas the display device 102 is caused to present only content from the video image corresponding to content 120 within the border 118 on the control device. - HTML5 may be used along with JavaScript (including some JavaScript libraries) and CSS in one implementation. Video files may be stored locally on the control device and played in the browser using the video tag of HTML5. Live streaming files from a local streaming server, streaming files from the internet, and a live tuner signal can also be used as the source. To select a different file, a user drags and drops a tile; based on the id of the tile, the path of the video in the quad portion (onto which the tile is dropped) of the display presenting video in four quadrants is changed to the correct video and this new video is played. A full screen API may not be used since it requires user interaction to allow full screen on the control device. Accordingly, as a workaround for full screen, all videos can be paused, then the video selected can be scaled by the browser to 4K resolution. If a 4K file is present, the 4K file is used, and no browser scaling is needed. Websocket may be used to communicate through IP from the control device to the controller to control the display device.
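The pan-and-zoom behavior described above amounts to mapping the border's rectangle on the control device into a crop rectangle in the higher-resolution source video. A minimal sketch, in which the function name and the pixel-coordinate convention are assumptions for illustration:

```javascript
// Sketch of mapping the movable border on the control device to a crop
// region in the high-resolution video shown on the display device.
// The border is first normalized (0..1) against the control device's
// screen size, so the mapping is resolution-independent.
function borderToCrop(border, controlSize, sourceSize) {
  const nx = border.x / controlSize.width;
  const ny = border.y / controlSize.height;
  const nw = border.width / controlSize.width;
  const nh = border.height / controlSize.height;
  return {
    x: Math.round(nx * sourceSize.width),
    y: Math.round(ny * sourceSize.height),
    width: Math.round(nw * sourceSize.width),
    height: Math.round(nh * sourceSize.height),
  };
}
```

During a drag, the control device can re-send this crop rectangle as the border moves, producing the continuous pan on the display described above.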
Messages may be broadcast to all the display devices, and each display device browser can use the messages it needs. Drag and drop can be done using the jQuery UI library, and scrolling can be done using CSS position updating. The stitch image zoom effect can be done by drawing video on the HTML5 canvas, sending coordinates from the control device to the controller so the controller knows which portion of the video to zoom on in the display device.
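The broadcast-then-filter pattern just described might look like the following on each display device's browser. The message shape and the `target` field are assumptions; the disclosure only states that every message reaches every display and each browser keeps what it needs:

```javascript
// Sketch of per-display filtering of broadcast messages. Each display
// keeps messages addressed to it, plus any addressed to "all".
function messagesFor(displayId, messages) {
  return messages.filter(
    (m) => m.target === displayId || m.target === "all"
  );
}
```

In a live setup this filter would run inside the WebSocket `onmessage` handler, so a display simply ignores traffic meant for its neighbor.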
- A phone application may also be implemented in HTML5, allowing audio files from the server to be played on a speaker, e.g., of the display device or other device, through IP. The phone application audio matches the audio for the four videos played in the quad view, and each audio file can be selected for playback. To select an external device connected to a different HDMI input of the display device (such as a video disk player, a satellite feed, etc.), a user drags the appropriate tile for the external device, and the control device may send IP commands to the display device (via the controller) to change input. If a tile corresponding to a video is dragged and dropped, another IP command can be sent to the display device (via the controller) to change input back to the PC and/or controller and the video file selected is played from the PC and/or controller.
-
FIGS. 4-7 illustrate additional principles that may be used according to present principles. On the left side of FIG. 4, one of the display devices 102 from FIG. 2 and one of the control devices 108 are shown schematically, and on the right side of FIG. 4 the same devices are shown schematically after an operation. As shown on the left side of FIG. 4, a multi-view channel has been launched on the display device 102, resulting in four video windows 400 (“quad view”) being presented on the display device 102. Each window 400 presents a video stream from a respective audio video (AV) program P1-P4 from an Internet or broadcast source (cable, satellite, etc.). The audio from only one of the AV programs is played by the display device. The windows 400 may be substantially identical in size if desired, as shown. - A
UI 402 is presented on the display of the control device 108. As shown, the UI 402 includes plural main selectors 404 arranged in a layout, preferably the same layout as the windows 400 on the display device 102 as shown. Each main selector 404 is established by a respective video feed, in the example shown, the same content, albeit perhaps in lower resolution, as the four videos in the quad view of the display device 102, as duly indicated by use of the same video program designators P1-P4. - The
UI 402 may further include a row 406 of additional content selectors 408 apart from the programs P1-P4 shown in the main selectors 404, although in the embodiment shown, for ease of disclosure, the same four programs P1-P4 establish the first four content selectors 408 in the row 406, while the last two content selectors indicate they may be selected to present content from two additional programs P5 and P6. In some embodiments, unlike the main selectors 404, which recall are established by moving video, the content selectors 408 in the row 406 may be established by still image thumbnails. - Furthermore, a
column 410 of audio selectors 412 may be presented on the UI 402. Each audio selector 412 in the column 410 may correspond to a respective content in the content selectors 408 in the row 406. Each audio selector 412 may include a respective audio on/off symbol 414, with all of the symbols 414 except one having a line through them, indicating that the audio represented by those selectors is not being played on the display device 102. In contrast, in the example shown the symbol 414 of the top audio selector 412 does not have a line through it, indicating that the audio from the program associated with the top selector, in the example shown program P1, is being played on the display device 102. Touching an audio selector 412 on the control device 108 causes the control device to command the controller to switch audio play on the display device 102 to the audio represented by the touched audio selector on the control device 108. This also causes the line through the respective symbol 414 of the touched selector to be removed, and a line to be placed onto the symbol 414 of the selector 412 representing the replaced audio. - The right side of
FIG. 4 illustrates that dragging and dropping one of the content selectors 408 onto a main selector 404 changes the video in that main selector 404 to the video represented by the dragged and dropped content selector 408. Not only does the UI 402 on the control device 108 thus change, but also, as indicated in the top right portion of FIG. 4, the video presented in the window 400 (in this case, the top left window) of the display device 102 is also caused to change to the video represented by the dragged and dropped content selector 408. This may be done by the control device 108, responsive to the drag and drop, obtaining the network address or channel number of the video represented by the dragged and dropped content selector 408 and commanding the controller to present video from that network address in the window 400 corresponding to the main selector 404 onto which the content selector 408 was dragged and dropped. In any case, it will readily be appreciated that the main selectors 404 on the control device 108 mirror the windows 400 on the display device 102. A user may move his hand left or right on the row 406 of content selectors to cause selectors for additional content to scroll onto the display of the control device. The new content typically includes additional program channels or Internet content related to the theme of the programming presented on the display device 102, including, for example, sports statistics related to a sporting event in one of the windows 400. - The left side of
FIG. 5 illustrates that “throwing” one of the main selectors 404 on the control device 108 “to” the display device 102 causes the display device to switch to a full screen presentation 500 (shown on the right of FIG. 5) of the content represented by the “thrown” main selector 404. The processor of the control device 108 may infer that a main selector 404 has been “thrown” by a user dragging the main selector upwards toward the top of the control device, responsive to which the control device 108 sends a command to the controller to present the associated content full screen on the display device 102. - Although not shown in
FIG. 5, commanding the display device 102 into the full screen mode as described above may result in the main selectors 404 on the control device 108 merging into a single large selector with the same content shown full screen on the display device. However, the row 406 of content selectors 408 and the column 410 of audio selectors 412 can remain unchanged. This single large main selector in the area formerly occupied by the four main selectors may be touched to cause the display device (pursuant to a command from the control device) to revert to the quad view shown on the left side of FIG. 5 and to also cause the main selectors 404 on the control device 108 to mirror the display device views, in this case, to resume the four main selector quad view shown on the bottom left of FIG. 4. - Alternatively, throwing a
main selector 404 to cause the display device to enter full screen mode as described may not alter the appearance of the main selectors 404, which can remain in the quad view shown on the control device. Subsequently touching any one of the main selectors 404 on the control device may result in the control device commanding the controller to resume the quad view presentation on the display device. Or, if desired, as shown on the bottom right of FIG. 5, when the control device has been configured to command the display device to enter full screen mode, with the four main selectors 404 remaining on the control device UI, touching one of the main selectors may cause the corresponding content to be presented full screen on the control device. The content presented full screen on the display device may be the same as or different from the content presented full screen on the control device, depending on what main selector 404 was thrown to the display device and what main selector 404 subsequently was touched by a user. A subsequent touch anywhere on the control device screen may cause the control device and display device to resume the layouts shown on the left side of FIG. 5. - To view the details of any content represented by a
content selector 408, as shown in the left side of FIG. 6, a user may simply touch the content selector 408. This may cause a detail screen 600 to appear on the control device 108 (but not on the display device 102), so that a person controlling presentation on the display device by means of the control device 108 can observe the detailed information about content prior to presenting the content (or information about the content) on the display device 102. -
FIG. 7 is a series of screen shots on the control device 108 to illustrate further optional details. A remote control selector 702 may be presented which, if touched as indicated at 704, can cause a remote-control-like presentation 706 to be presented on the control device 108. The presentation 706 emulates a standard remote control, with alpha-numeric touch-enabled input selectors, channel and volume up/down selectors, and cursor arrow selectors as shown. Touching the selector 702 again causes the initial presentation (upper right panel in FIG. 7) to resume on the control device. - When a
single control device 108 is used to control both display devices 102 shown in FIG. 2, as shown at 708 in FIG. 7, main selectors 710 corresponding to the quad view of a first one of the display devices 102 may be presented and used to control that display device according to principles above. Also, a smaller quad view of alternate main selectors 712 may be presented, representing content being presented on the second display device 102. To enable control of the second display device 102 using the control device 108, a user need only touch the smaller quad view of alternate main selectors 712 as shown at 714, causing the smaller quad view of alternate main selectors 712 to animate to an enlarged configuration 716 and the quad view of main selectors corresponding to the first one of the display devices 102 to animate to become smaller in size as shown at 718. The enlarged configuration of main selectors on the control device 108 appertaining to the second display device 102 may then be used to control the presentation on the second display device according to principles above. - The bottom two screen shots in
FIG. 7 show that tapping 720 on a content selector can cause a detail presentation 722 to be shown on the control device, showing the details of the content represented by the tapped content selector. Also, as shown at 724, instead of a column of audio selectors as described previously, respective audio on/off symbols analogous to the symbols 414 in FIG. 4 may be presented on each main selector 726 and, if touched, cause the control device to command the controller to replace the audio currently being played on the display device being controlled with the audio associated with the touched symbol. - While a four screen quad view is discussed and shown, any number of windows in a multi-window arrangement may be used.
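As a sketch of how the quad view generalizes to an arbitrary window count, the window rectangles can be computed as a near-square grid. This helper is illustrative only and not part of the disclosure; the quad view is simply the n = 4 case (a 2x2 grid).

```javascript
// Compute window rectangles for n videos arranged in a near-square grid on a
// display of the given pixel size. Rectangles are filled row by row.
function multiViewLayout(n, displayWidth, displayHeight) {
  const cols = Math.ceil(Math.sqrt(n));
  const rows = Math.ceil(n / cols);
  const w = displayWidth / cols;
  const h = displayHeight / rows;
  const rects = [];
  for (let i = 0; i < n; i++) {
    rects.push({ x: (i % cols) * w, y: Math.floor(i / cols) * h, w, h });
  }
  return rects;
}
```

For a 4K (3840x2160) display, `multiViewLayout(4, 3840, 2160)` yields the four 1920x1080 quadrants of the quad view.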
- While the particular CONTROL OF LARGE SCREEN DISPLAY USING WIRELESS PORTABLE COMPUTER INTERFACING WITH DISPLAY CONTROLLER is herein shown and described in detail, it is to be understood that the subject matter which is encompassed by the present invention is limited only by the claims.
Claims (20)
1. A control device comprising:
at least one computer readable storage medium bearing instructions executable by a processor;
at least one processor configured for accessing the computer readable storage medium to execute the instructions to configure the processor for:
presenting on a display of the control device a user interface (UI) having plural selectors arranged in a layout, each selector being established by a respective video feed;
receiving a first user input corresponding to a first command;
wirelessly transmitting the first command to a controller of the display device to command the controller to present plural video feeds on the display device, the video feeds on the control device being of the same content as respective video feeds on the display device;
receiving a second user input corresponding to a second command; and
wirelessly transmitting the second command to the controller to command the controller to present a single video feed full screen on the display device with the layout of the selectors on the control device remaining unchanged.
2. The control device of claim 1 , wherein the first user input is established by a user touch on any portion of a display of the control device.
3. The control device of claim 1 , wherein the second user input is established only by a user touch on a selector corresponding to the single video feed.
4. The control device of claim 1 , wherein the processor when executing the instructions is further configured for:
transmitting the first command to the controller of the display device to command the controller to present the plural video feeds on the display device in a same layout as the video feeds establishing the selectors are arranged on the display of the monitoring device.
5. The control device of claim 1 , wherein the processor when executing the instructions is further configured for:
receiving a third user input corresponding to a third command, the third user input being a drag of an image associated with a content selector and a drop of the image onto a portion of the control device, the third command commanding the controller to cause the display device to present a video corresponding to the content selector in a portion of the display device that corresponds to the portion of the control device onto which the image was dropped.
6. The control device of claim 1 , wherein the control device is established by a portable computer.
7. The control device of claim 1 , wherein the processor when executing the instructions is further configured for:
providing a visible indicator on the control device that the display device has been commanded to present the single video feed full screen on the display device.
8. System comprising:
display device configured for presenting plural video contents in respective windows of the display device;
controller configured for controlling the display device; and
control device configured for communicating commands to the controller to control presentation on the display device, the control device being configured to present a user interface (UI) having plural selectors arranged in a layout on a screen of the control device, each selector being established by a respective video feed having a first resolution, the control device being configured for causing the display device to present plural video feeds on the display device, the video feeds on the control device being of the same content as respective video feeds on the display device, the respective video feeds on the display device having second resolution higher than the first resolution, the control device being configured for causing a single video feed full screen to be presented on the display device with the layout of the selectors on the control device remaining unchanged.
9. The system of claim 8 , wherein the video feeds on the display device are received from a source and the video feeds on the control device are also received from the source.
10. The system of claim 8 , wherein the control device is configured for causing the single video feed full screen to be presented on the display device responsive to a user touch on any portion of a display of the control device.
11. The system of claim 8 , wherein the control device is configured for causing the single video feed full screen to be presented on the display device responsive to only a user touch on a selector corresponding to the single video feed.
12. The system of claim 8 , wherein the control device is further configured for:
transmitting a command to the controller of the display device to command the controller to present the plural video feeds on the display device in a same layout as the video feeds establishing the selectors are arranged on the display of the monitoring device.
13. The system of claim 8 , wherein the control device is further configured for:
receiving a third user input corresponding to a third command, the third user input being a drag of an image associated with a content selector and a drop of the image onto a portion of the control device, the third command commanding the controller to cause the display device to present a video corresponding to the content selector in a portion of the display device that corresponds to the portion of the control device onto which the image was dropped.
14. The system of claim 8 , wherein the control device is established by a portable computer.
15. The system of claim 8 , wherein the control device is further configured for:
providing a visible indicator on the control device that the display device has been commanded to present the single video feed full screen on the display device.
16. The system of claim 8 , wherein the display device is established by an ultra high definition (UHD) display.
17. The system of claim 8 , wherein the controller is established by a game console.
18. The system of claim 8 , wherein the controller is established by a personal computer (PC).
19. The system of claim 8 , wherein the control device is established by a laptop computer or tablet computer.
20. Method comprising:
sending plural video feeds having a first resolution to a control device over a computer network;
sending plural video feeds having a second resolution to a display device over the computer network, the second resolution being higher than the first, the video feeds of the first resolution being the same content as the video feeds of the second resolution, wherein video presentation on the display device is controlled at least in part based on user touches of the video feeds presented on the control device.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/271,156 US20150253974A1 (en) | 2014-03-07 | 2014-05-06 | Control of large screen display using wireless portable computer interfacing with display controller |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201461949545P | 2014-03-07 | 2014-03-07 | |
| US14/271,156 US20150253974A1 (en) | 2014-03-07 | 2014-05-06 | Control of large screen display using wireless portable computer interfacing with display controller |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20150253974A1 true US20150253974A1 (en) | 2015-09-10 |
Family
ID=54017392
Family Applications (4)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/271,156 Abandoned US20150253974A1 (en) | 2014-03-07 | 2014-05-06 | Control of large screen display using wireless portable computer interfacing with display controller |
| US14/271,282 Active 2034-07-04 US9348495B2 (en) | 2014-03-07 | 2014-05-06 | Control of large screen display using wireless portable computer and facilitating selection of audio on a headphone |
| US14/271,685 Active 2037-07-19 US11102543B2 (en) | 2014-03-07 | 2014-05-07 | Control of large screen display using wireless portable computer to pan and zoom on large screen display |
| US15/139,642 Abandoned US20160241902A1 (en) | 2014-03-07 | 2016-04-27 | Control of large screen display using wireless portable computer and facilitating selection of audio on a headphone |
Family Applications After (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/271,282 Active 2034-07-04 US9348495B2 (en) | 2014-03-07 | 2014-05-06 | Control of large screen display using wireless portable computer and facilitating selection of audio on a headphone |
| US14/271,685 Active 2037-07-19 US11102543B2 (en) | 2014-03-07 | 2014-05-07 | Control of large screen display using wireless portable computer to pan and zoom on large screen display |
| US15/139,642 Abandoned US20160241902A1 (en) | 2014-03-07 | 2016-04-27 | Control of large screen display using wireless portable computer and facilitating selection of audio on a headphone |
Country Status (1)
| Country | Link |
|---|---|
| US (4) | US20150253974A1 (en) |
Cited By (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160050449A1 (en) * | 2014-08-12 | 2016-02-18 | Samsung Electronics Co., Ltd. | User terminal apparatus, display apparatus, system and control method thereof |
| US20160232314A1 (en) * | 2015-02-09 | 2016-08-11 | Samsung Electronics Co., Ltd. | Mobile terminal and method of controlling medical apparatus by using the mobile terminal |
| DE102016202694A1 (en) * | 2016-02-22 | 2017-08-24 | Siemens Aktiengesellschaft | Multi-ad user interface and method for positioning content across multiple ads |
| CN107329758A (en) * | 2017-06-30 | 2017-11-07 | 武汉斗鱼网络科技有限公司 | The full frame method to set up of the page, device and user terminal |
| WO2017217924A1 (en) | 2016-06-14 | 2017-12-21 | Razer (Asia-Pacific) Pte. Ltd. | Image processing devices, methods for controlling an image processing device, and computer-readable media |
| CN108566480A (en) * | 2018-01-02 | 2018-09-21 | 京东方科技集团股份有限公司 | The control method of wearable device, apparatus and system |
| CN108881957A (en) * | 2017-11-02 | 2018-11-23 | 北京视联动力国际信息技术有限公司 | A kind of mixed method and device of multimedia file |
| CN108881927A (en) * | 2017-11-30 | 2018-11-23 | 北京视联动力国际信息技术有限公司 | A kind of video data synthetic method and device |
| CN109656654A (en) * | 2018-11-30 | 2019-04-19 | 厦门亿力吉奥信息科技有限公司 | The edit methods and computer readable storage medium of large-size screen monitors scene |
| WO2019095979A1 (en) * | 2017-11-14 | 2019-05-23 | 腾讯科技(深圳)有限公司 | Video image processing method and apparatus, and terminal |
| US20190212901A1 (en) * | 2018-01-08 | 2019-07-11 | Cisco Technology, Inc. | Manipulation of content on display surfaces via augmented reality |
| CN111316224A (en) * | 2018-03-19 | 2020-06-19 | 广州视源电子科技股份有限公司 | Data transmission device and data transmission method |
| US11069142B2 (en) * | 2019-05-31 | 2021-07-20 | Wormhole Labs, Inc. | Using live feeds to produce 1st person, spatial, and real life views |
| US11941317B2 (en) | 2021-07-26 | 2024-03-26 | Faurecia Clarion Electronics Co., Ltd. | Display controlling method |
| EP4287013A4 (en) * | 2021-01-29 | 2024-07-24 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | SCREEN PROJECTION DISPLAY METHOD AND APPARATUS, MOBILE TERMINAL, STORAGE MEDIUM AND PROGRAM PRODUCT |
Families Citing this family (30)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| USD751088S1 (en) * | 2013-12-30 | 2016-03-08 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
| USD750643S1 (en) * | 2013-12-30 | 2016-03-01 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
| USD751578S1 (en) * | 2013-12-30 | 2016-03-15 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
| USD786910S1 (en) * | 2015-05-25 | 2017-05-16 | Mitsubishi Electric Corporation | Display screen with graphical user interface |
| US9875012B2 (en) * | 2015-08-05 | 2018-01-23 | Sony Corporation | Media sharing between devices using drag and drop gesture |
| USD783027S1 (en) * | 2015-09-08 | 2017-04-04 | Citibank, N.A. | Display screen or portion thereof with a transitional graphical user interface of a trade viewer application |
| US10768803B2 (en) * | 2015-09-21 | 2020-09-08 | Motorola Solutions, Inc. | User interface system with active and passive display spaces |
| CN105791769B (en) * | 2016-03-11 | 2019-05-03 | 广东威创视讯科技股份有限公司 | Ultra-high-definition video display method and system for video wall |
| US20170371528A1 (en) * | 2016-06-23 | 2017-12-28 | Honeywell International Inc. | Apparatus and method for managing navigation on industrial operator console using touchscreen |
| KR20180005377A (en) * | 2016-07-06 | 2018-01-16 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same, display device and method for controlling the same |
| US10798044B1 (en) | 2016-09-01 | 2020-10-06 | Nufbee Llc | Method for enhancing text messages with pre-recorded audio clips |
| US20180184152A1 (en) * | 2016-12-23 | 2018-06-28 | Vitaly M. Kirkpatrick | Distributed wireless audio and/or video transmission |
| CN106919707B (en) * | 2017-03-10 | 2020-04-07 | 腾讯科技(深圳)有限公司 | Page display method and terminal based on H5 |
| KR102291021B1 (en) | 2017-03-17 | 2021-08-18 | 삼성전자주식회사 | Electronic device for controlling audio output and operating mehtod thereof |
| JP6926558B2 (en) * | 2017-03-17 | 2021-08-25 | ヤマハ株式会社 | Data processing unit and information processing equipment |
| USD912151S1 (en) * | 2017-05-23 | 2021-03-02 | The Address, Inc. | For sale sign |
| CN107404531A (en) * | 2017-07-27 | 2017-11-28 | 国云科技股份有限公司 | A HTML5-based cloud security control system and its implementation method |
| US11301124B2 (en) * | 2017-08-18 | 2022-04-12 | Microsoft Technology Licensing, Llc | User interface modification using preview panel |
| US11237699B2 (en) | 2017-08-18 | 2022-02-01 | Microsoft Technology Licensing, Llc | Proximal menu generation |
| CN111542805B (en) * | 2017-11-10 | 2024-05-10 | 赛斯-伊玛格标签有限责任公司 | Electronic shelf label system with zone control of display content |
| US11406892B2 (en) * | 2018-12-04 | 2022-08-09 | Sony Interactive Entertainment Inc. | Information processing apparatus |
| CN112019908A (en) * | 2019-05-31 | 2020-12-01 | 阿里巴巴集团控股有限公司 | Video playing method, device and equipment |
| CN110955477B (en) * | 2019-10-12 | 2023-04-11 | 中国平安财产保险股份有限公司 | Image self-defining method, device, equipment and storage medium |
| CN111343495A (en) * | 2020-03-12 | 2020-06-26 | 海信视像科技股份有限公司 | Display device and method for playing music in terminal |
| US11601691B2 (en) | 2020-05-04 | 2023-03-07 | Kilburn Live, Llc | Method and apparatus for providing audio and video within an acceptable delay tolerance |
| US11157160B1 (en) * | 2020-11-09 | 2021-10-26 | Dell Products, L.P. | Graphical user interface (GUI) for controlling virtual workspaces produced across information handling systems (IHSs) |
| CN113099182B (en) * | 2021-04-08 | 2022-11-22 | 西安应用光学研究所 | Multi-window real-time scaling method based on airborne parallel processing architecture |
| FR3122508A1 (en) * | 2021-04-29 | 2022-11-04 | Orange | Characterization of a user by associating a sound with an interactive element |
| EP4108197A1 (en) | 2021-06-24 | 2022-12-28 | Gradient Denervation Technologies | Systems for treating tissue |
| CN113794928B (en) * | 2021-09-14 | 2023-07-25 | Vidaa(荷兰)国际控股有限公司 | Audio playing method and display device |
Citations (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20060050090A1 (en) * | 2000-03-16 | 2006-03-09 | Kamran Ahmed | User selectable hardware zoom in a video display system |
| US20080092172A1 (en) * | 2006-09-29 | 2008-04-17 | Guo Katherine H | Method and apparatus for a zooming feature for mobile video service |
| US20100302130A1 (en) * | 2009-05-29 | 2010-12-02 | Seiko Epson Corporation | Image display system, image display device, and image display method |
| US20110231791A1 (en) * | 2010-03-19 | 2011-09-22 | Seiko Epson Corporation | Image display system, graphical user interface, and image display method |
| US20120140117A1 (en) * | 2010-10-26 | 2012-06-07 | Bby Solutions, Inc. | Two-Sided Remote Control |
| US20120182203A1 (en) * | 2009-09-28 | 2012-07-19 | Kyocera Corporation | Display system and control method |
| US20120319927A1 (en) * | 2008-12-02 | 2012-12-20 | Nvidia Corporation | Remote management of a simultaneous display of multimedia content in display devices |
| US20130147832A1 (en) * | 2011-12-07 | 2013-06-13 | Ati Technologies Ulc | Method and apparatus for remote extension display |
| US20130290847A1 (en) * | 2012-04-30 | 2013-10-31 | Paul Hooven | System and method for processing viewer interaction with video through direct database look-up |
| US20130305138A1 (en) * | 2012-05-14 | 2013-11-14 | Pacsthology Ltd. | Systems and methods for acquiring and transmitting high-resolution pathology images |
| US20140253802A1 (en) * | 2013-03-11 | 2014-09-11 | Graham Clift | Electronic displays having paired canvases |
| US20140267026A1 (en) * | 2013-03-15 | 2014-09-18 | Brigham Young University | Handheld document reading device with auxiliary display |
| US20140344736A1 (en) * | 2013-05-20 | 2014-11-20 | Citrix Systems, Inc. | Bound Based Contextual Zoom |
| US20150208103A1 (en) * | 2012-08-08 | 2015-07-23 | National University Of Singapore | System and Method for Enabling User Control of Live Video Stream(s) |
| US20150213776A1 (en) * | 2014-01-24 | 2015-07-30 | Nvidia Corporation | Computing system and method for automatically making a display configuration persistent |
Family Cites Families (174)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4090504A (en) | 1976-06-21 | 1978-05-23 | Yehuda Nathan | Portable temperature and pulse monitor |
| US4546349A (en) | 1981-09-29 | 1985-10-08 | Sperry Corporation | Local zoom for raster scan displays |
| JP2609744B2 (en) | 1989-07-14 | 1997-05-14 | 株式会社日立製作所 | Image display method and image display device |
| JP3686432B2 (en) | 1993-08-13 | 2005-08-24 | 富士通株式会社 | Edit screen display controller |
| US5901178A (en) | 1996-02-26 | 1999-05-04 | Solana Technology Development Corporation | Post-compression hidden data transport for video |
| US6400378B1 (en) | 1997-09-26 | 2002-06-04 | Sony Corporation | Home movie maker |
| JP3554172B2 (en) | 1998-01-09 | 2004-08-18 | キヤノン株式会社 | Radiography equipment |
| US20150297949A1 (en) | 2007-06-12 | 2015-10-22 | Intheplay, Inc. | Automatic sports broadcasting system |
| US6400852B1 (en) | 1998-12-23 | 2002-06-04 | Luxsonor Semiconductors, Inc. | Arbitrary zoom “on -the -fly” |
| AU2059801A (en) | 1999-12-03 | 2001-06-12 | Ourworld Live, Inc. | Consumer access systems and methods for providing same |
| IL135150A0 (en) | 2000-03-17 | 2001-05-20 | Avner Geller | A method and a system for secured identification of user's identity |
| KR200227883Y1 (en) | 2000-10-05 | 2001-06-15 | 김종해 | The locking & unlocking apparatus for security file cabinet door. |
| US7542895B2 (en) | 2000-10-06 | 2009-06-02 | International Business Machines Corporation | Front of screen, user interface, and national language support by downloading bitmaps from a PC to a companion device |
| US6931656B1 (en) | 2000-10-11 | 2005-08-16 | Koninklijke Philips Electronics N.V. | Virtual creature displayed on a television |
| US6717752B2 (en) | 2000-11-14 | 2004-04-06 | Pentax Corporation | Image search device |
| US6930705B2 (en) | 2000-11-14 | 2005-08-16 | Pentax Corporation | Image search device |
| US6630963B1 (en) | 2001-01-23 | 2003-10-07 | Digeo, Inc. | Synchronizing a video program from a television broadcast with a secondary audio program |
| US7219309B2 (en) | 2001-05-02 | 2007-05-15 | Bitstream Inc. | Innovations for the display of web pages |
| JP4639010B2 (en) | 2001-08-23 | 2011-02-23 | Tool株式会社 | Large-scale graphic data high-speed drawing method and apparatus |
| JP2003153739A (en) | 2001-09-05 | 2003-05-27 | Fuji Photo Film Co Ltd | Makeup mirror device, and makeup method |
| US7631277B1 (en) | 2001-12-14 | 2009-12-08 | Apple Inc. | System and method for integrating media objects |
| SE0200953D0 (en) | 2002-03-27 | 2002-03-27 | Ericsson Telefon Ab L M | A method and apparatus for exchanging data in a mobile network |
| US7987491B2 (en) | 2002-05-10 | 2011-07-26 | Richard Reisman | Method and apparatus for browsing using alternative linkbases |
| US20150135206A1 (en) | 2002-05-10 | 2015-05-14 | Convergent Media Solutions Llc | Method and apparatus for browsing using alternative linkbases |
| US8549574B2 (en) | 2002-12-10 | 2013-10-01 | Ol2, Inc. | Method of combining linear content and interactive content compressed together as streaming interactive video |
| JP4516957B2 (en) | 2003-01-25 | 2010-08-04 | パーデュー リサーチ ファンデーション | Method, system and data structure for searching for 3D objects |
| JP4296032B2 (en) | 2003-05-13 | 2009-07-15 | 富士フイルム株式会社 | Image processing device |
| AU2004246683B2 (en) | 2003-06-02 | 2008-12-18 | Disney Enterprises, Inc. | System and method of programmatic window control for consumer video players |
| JP3944122B2 (en) | 2003-06-05 | 2007-07-11 | 株式会社東芝 | Information recording medium, information recording method, information recording apparatus, information reproducing method, and information reproducing apparatus |
| US7077271B1 (en) | 2003-06-27 | 2006-07-18 | Van Acosta Hamilton | Secondary screen for concrete pump |
| JP2005070898A (en) | 2003-08-20 | 2005-03-17 | Toshiba Corp | Information processing apparatus and display control method |
| US7312764B2 (en) | 2003-09-26 | 2007-12-25 | The General Electric Company | Methods and apparatus for displaying images on mixed monitor displays |
| JP2005244931A (en) | 2004-01-26 | 2005-09-08 | Seiko Epson Corp | Multi-screen video playback system |
| JP4211672B2 (en) | 2004-04-28 | 2009-01-21 | Yamaha Corporation | Performance data creation device and program |
| KR101138093B1 (en) | 2004-08-31 | 2012-04-24 | Panasonic Corporation | Moving image encoding method and apparatus |
| US8667017B1 (en) | 2005-01-21 | 2014-03-04 | Invensys Systems, Inc. | Method for portal-based collaborative process management and information access |
| JP4780980B2 (en) | 2005-03-11 | 2011-09-28 | Fujifilm Corporation | Endoscope device |
| JP4081772B2 (en) | 2005-08-25 | 2008-04-30 | Sony Corporation | Reproduction device, reproduction method, program, and program storage medium |
| US7627890B2 (en) | 2006-02-21 | 2009-12-01 | At&T Intellectual Property, I,L.P. | Methods, systems, and computer program products for providing content synchronization or control among one or more devices |
| US20070209009A1 (en) | 2006-03-02 | 2007-09-06 | Inventec Corporation | Display with settings changed according to user's ID and method thereof |
| WO2007103871A2 (en) | 2006-03-06 | 2007-09-13 | George Geeyaw She | System and method for establishing and maintaining synchronization of isochronous audio and video information streams in wireless multimedia applications |
| EP1879382B1 (en) | 2006-07-10 | 2017-09-06 | Samsung Electronics Co., Ltd. | Multi-screen display apparatus and method for digital broadcast receiver |
| US9554061B1 (en) | 2006-12-15 | 2017-01-24 | Proctor Consulting LLP | Smart hub |
| US8073211B2 (en) | 2007-02-23 | 2011-12-06 | General Electric Company | Method and apparatus for generating variable resolution medical images |
| JP5078417B2 (en) | 2007-04-17 | 2012-11-21 | Canon Inc. | Signal processing apparatus and signal processing method |
| US8054382B2 (en) | 2007-05-21 | 2011-11-08 | International Business Machines Corporation | Apparatus, method and system for synchronizing a common broadcast signal among multiple television units |
| KR100837345B1 (en) | 2007-06-25 | 2008-06-12 | M&Soft Co., Ltd. | Method of displaying the magnification of an intersection in a vehicle terminal |
| US9965067B2 (en) * | 2007-09-19 | 2018-05-08 | T1V, Inc. | Multimedia, multiuser system and associated methods |
| CN101430620A (en) | 2007-11-06 | 2009-05-13 | Inventec Appliances Corp. | Notebook computer with multi-point touch screen |
| JP5188148B2 (en) | 2007-11-09 | 2013-04-24 | Canon Inc. | Display device, method and program |
| US8223173B2 (en) * | 2008-04-09 | 2012-07-17 | Hewlett-Packard Development Company, L.P. | Electronic device having improved user interface |
| US8654988B2 (en) | 2008-05-05 | 2014-02-18 | Qualcomm Incorporated | Synchronization of signals for multiple data sinks |
| USD595288S1 (en) | 2008-06-20 | 2009-06-30 | Celio Technology Corporation | Cellular telephone companion with display and keyboard |
| EP2293583A1 (en) | 2008-06-24 | 2011-03-09 | Panasonic Corporation | Recording medium, reproducing device, integrated circuit, reproducing method, and program |
| BRPI0917864A2 (en) | 2008-08-15 | 2015-11-24 | Univ Brown | apparatus and method for estimating body shape |
| SG172182A1 (en) | 2008-09-30 | 2011-07-28 | Panasonic Corp | Reproduction device, recording medium, and integrated circuit |
| US8493282B2 (en) | 2008-11-17 | 2013-07-23 | Google Inc. | Handheld device with secondary screen for soft key descriptors |
| WO2010077564A1 (en) | 2008-12-08 | 2010-07-08 | Analog Devices Inc. | Multimedia switching over wired or wireless connections in a distributed environment |
| WO2010071283A1 (en) | 2008-12-18 | 2010-06-24 | LG Electronics Inc. | Digital broadcasting reception method capable of displaying stereoscopic image, and digital broadcasting reception apparatus using same |
| JP5491414B2 (en) | 2008-12-26 | 2014-05-14 | Panasonic Corporation | Recording medium, reproducing apparatus, and integrated circuit |
| WO2010076846A1 (en) | 2008-12-29 | 2010-07-08 | Panasonic Corporation | Recording medium, reproduction device, and integrated circuit |
| JP4569935B2 (en) | 2009-02-04 | 2010-10-27 | Panasonic Corporation | Recording medium, reproducing apparatus, and integrated circuit |
| WO2010089995A1 (en) | 2009-02-04 | 2010-08-12 | Panasonic Corporation | Recording medium, reproduction device, and integrated circuit |
| JP5438333B2 (en) * | 2009-02-05 | 2014-03-12 | Canon Inc. | Display control apparatus and display control method |
| CN102124748A (en) | 2009-02-27 | 2011-07-13 | Panasonic Corporation | Recording medium, playback device and integrated circuit |
| WO2010100875A1 (en) | 2009-03-02 | 2010-09-10 | Panasonic Corporation | Recording medium, reproduction device, and integrated circuit |
| CA2714859C (en) | 2009-03-30 | 2016-10-11 | Panasonic Corporation | Recording medium, playback device, and integrated circuit |
| WO2010113454A1 (en) | 2009-03-31 | 2010-10-07 | Panasonic Corporation | Recording medium, reproducing device, and integrated circuit |
| RU2534936C2 (en) | 2009-04-09 | 2014-12-10 | Telefonaktiebolaget LM Ericsson (publ) | Multimedia container file management |
| US8319801B2 (en) | 2009-05-08 | 2012-11-27 | International Business Machines Corporation | Magnifying content on a graphical display |
| EP2434769B1 (en) | 2009-05-19 | 2016-08-10 | Panasonic Intellectual Property Management Co., Ltd. | Recording method and playback method |
| US8290338B2 (en) | 2009-05-27 | 2012-10-16 | Panasonic Corporation | Recording medium, playback device, encoding device, integrated circuit, and playback output device |
| RU2541128C2 (en) | 2009-07-10 | 2015-02-10 | Panasonic Corporation | Recording medium, playback device and integrated circuit |
| US8270807B2 (en) | 2009-07-13 | 2012-09-18 | Panasonic Corporation | Recording medium, playback device, and integrated circuit |
| US8610926B2 (en) | 2009-07-24 | 2013-12-17 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method and program for determining suitability of printing content data displayed on a display apparatus |
| US20110052144A1 (en) | 2009-09-01 | 2011-03-03 | 2Cimple, Inc. | System and Method for Integrating Interactive Call-To-Action, Contextual Applications with Videos |
| US10587833B2 (en) | 2009-09-16 | 2020-03-10 | Disney Enterprises, Inc. | System and method for automated network search and companion display of result relating to audio-video metadata |
| US8164619B2 (en) | 2009-09-25 | 2012-04-24 | Panasonic Corporation | Recording medium, playback device, and integrated circuit |
| KR101714781B1 (en) | 2009-11-17 | 2017-03-22 | LG Electronics Inc. | Method for playing contents |
| CN102334338B (en) | 2009-12-28 | 2015-04-22 | Panasonic Corporation | Display device and method, transmission device and method, and reception device and method |
| WO2011080907A1 (en) | 2009-12-28 | 2011-07-07 | Panasonic Corporation | Display apparatus and method, recording medium, transmission apparatus and method, and playback apparatus and method |
| US8964013B2 (en) | 2009-12-31 | 2015-02-24 | Broadcom Corporation | Display with elastic light manipulator |
| US20110183654A1 (en) | 2010-01-25 | 2011-07-28 | Brian Lanier | Concurrent Use of Multiple User Interface Devices |
| JP6013920B2 (en) | 2010-02-15 | 2016-10-25 | Thomson Licensing | Apparatus and method for processing video content |
| KR20110096494A (en) | 2010-02-22 | 2011-08-30 | LG Electronics Inc. | Electronic device and stereoscopic image playback method |
| US9804727B2 (en) | 2010-03-09 | 2017-10-31 | Freedom Scientific, Inc. | Flexible display of visual content on multiple display devices |
| US20130002821A1 (en) | 2010-03-24 | 2013-01-03 | Panasonic Corporation | Video processing device |
| US9213480B2 (en) * | 2010-04-08 | 2015-12-15 | Nokia Technologies Oy | Method, apparatus and computer program product for joining the displays of multiple devices |
| EP2579138A4 (en) | 2010-05-28 | 2014-08-06 | LG Electronics Inc. | Content control method and content reader using the same |
| US8331760B2 (en) | 2010-06-02 | 2012-12-11 | Microsoft Corporation | Adaptive video zoom |
| JP2011259332A (en) * | 2010-06-11 | 2011-12-22 | Sony Corp | Image processing device and method |
| US8763060B2 (en) | 2010-07-11 | 2014-06-24 | Apple Inc. | System and method for delivering companion content |
| JPWO2012017643A1 (en) | 2010-08-06 | 2013-10-03 | Panasonic Corporation | Encoding method, display device, and decoding method |
| EP2605514B1 (en) | 2010-08-09 | 2017-07-12 | Panasonic Corporation | Image encoding method, image decoding method, image encoding device, and image decoding device |
| US20120044324A1 (en) | 2010-08-23 | 2012-02-23 | Lg Electronics Inc. | Method for providing 3d video in a 3dtv |
| US20120075436A1 (en) | 2010-09-24 | 2012-03-29 | Qualcomm Incorporated | Coding stereo video data |
| US9078082B2 (en) | 2010-09-24 | 2015-07-07 | Amazon Technologies, Inc. | Interacting with cloud-based applications using unrelated devices |
| JP5500649B2 (en) | 2010-09-29 | 2014-05-21 | KDDI Corporation | Video distribution server |
| US9131256B2 (en) | 2010-09-30 | 2015-09-08 | Verizon Patent And Licensing Inc. | Method and apparatus for synchronizing content playback |
| MX2013004068A (en) | 2010-10-25 | 2013-05-22 | Panasonic Corp | Encoding method, display device, and decoding method |
| DE112011103642T5 (en) | 2010-11-02 | 2013-09-19 | Lg Electronics Inc. | Method for transmitting / receiving media content and device for transmitting / receiving using this |
| US10303357B2 (en) * | 2010-11-19 | 2019-05-28 | TiVo Solutions Inc. | Flick to send or display content |
| EP2521377A1 (en) * | 2011-05-06 | 2012-11-07 | Jacoti BVBA | Personal communication device with hearing support and method for providing the same |
| US9137539B2 (en) | 2010-12-22 | 2015-09-15 | Panasonic Corporation | Image coding apparatus, image decoding apparatus, image coding method, and image decoding method |
| JP6004271B2 (en) | 2011-01-12 | 2016-10-05 | Sun Patent Trust | Image encoding method, image decoding method, image encoding device, and image decoding device |
| US20120198386A1 (en) * | 2011-01-31 | 2012-08-02 | Nokia Corporation | Causing display of thumbnail images |
| JP5906462B2 (en) | 2011-02-16 | 2016-04-20 | Panasonic Intellectual Property Management Co., Ltd. | Video encoding apparatus, video encoding method, video encoding program, video playback apparatus, video playback method, and video playback program |
| MX2013009234A (en) | 2011-02-18 | 2013-08-29 | Sony Corp | Image processing device and image processing method. |
| JP2012186781A (en) | 2011-02-18 | 2012-09-27 | Sony Corp | Image processing device and image processing method |
| US9661301B2 (en) | 2011-02-18 | 2017-05-23 | Sony Corporation | Image processing device and image processing method |
| US20120227098A1 (en) | 2011-03-03 | 2012-09-06 | Microsoft Corporation | Sharing user id between operating system and application |
| WO2012121709A1 (en) | 2011-03-08 | 2012-09-13 | Empire Technology Development Llc | Output of video content |
| CN103430556A (en) | 2011-03-18 | 2013-12-04 | Panasonic Corporation | Display device, 3D glasses and 3D video audio-visual system |
| KR101770206B1 (en) * | 2011-04-06 | 2017-08-22 | LG Electronics Inc. | Mobile terminal and user interface providing method using the same |
| US9398322B2 (en) | 2011-04-27 | 2016-07-19 | Time Warner Cable Enterprises Llc | Multi-lingual audio streaming |
| JP5908894B2 (en) | 2011-04-28 | 2016-04-26 | Panasonic Corporation | Recording medium, reproducing apparatus, and recording apparatus |
| US20140036033A1 (en) | 2011-04-28 | 2014-02-06 | Sony Corporation | Image processing device and image processing method |
| US9392246B2 (en) | 2011-04-28 | 2016-07-12 | Panasonic Intellectual Property Management Co., Ltd. | Recording medium, playback device, recording device, encoding method, and decoding method related to higher image quality |
| EP2695049A1 (en) | 2011-05-10 | 2014-02-12 | NDS Limited | Adaptive presentation of content |
| US8948567B2 (en) | 2011-06-20 | 2015-02-03 | Microsoft Technology Licensing, Llc | Companion timeline with timeline events |
| WO2013001748A1 (en) | 2011-06-29 | 2013-01-03 | Panasonic Corporation | Image encoding method, image decoding method, image encoding device, image decoding device, and image encoding/decoding device |
| WO2013001749A1 (en) | 2011-06-29 | 2013-01-03 | Panasonic Corporation | Image encoding method, image decoding method, image encoding device, image decoding device, and image encoding/decoding device |
| US20130009997A1 (en) | 2011-07-05 | 2013-01-10 | Research In Motion Limited | Pinch-to-zoom video apparatus and associated method |
| EP2744196A1 (en) | 2011-08-11 | 2014-06-18 | Panasonic Corporation | Hybrid broadcast and communication system, data generation device, and receiver |
| CN103125123B (en) | 2011-08-11 | 2017-04-26 | Panasonic Intellectual Property Management Co., Ltd. | Playback device, playback method, integrated circuit, broadcasting system, and broadcasting method |
| US8989556B2 (en) | 2011-08-24 | 2015-03-24 | Panasonic Intellectual Property Management Co., Ltd. | Recording medium, playback device, recording device, and recording method |
| US10165267B2 (en) | 2011-08-30 | 2018-12-25 | Intel Corporation | Multiview video coding schemes |
| JP6088127B2 (en) * | 2011-10-13 | 2017-03-01 | Seiko Epson Corporation | Display device, display device control method, and program |
| CA2855154A1 (en) | 2011-11-09 | 2013-05-16 | Blackberry Limited | Touch-sensitive display method and apparatus |
| EP2645724A4 (en) | 2011-11-11 | 2014-08-06 | Sony Corp | Transmission apparatus, transmission method, receiving apparatus, and receiving method |
| US9270718B2 (en) | 2011-11-25 | 2016-02-23 | Harry E Emerson, III | Internet streaming and the presentation of dynamic content |
| JP2013152693A (en) * | 2011-12-27 | 2013-08-08 | Nintendo Co Ltd | Information processing program, information processing device, image display method, and image display system |
| EP2611152A3 (en) | 2011-12-28 | 2014-10-15 | Samsung Electronics Co., Ltd. | Display apparatus, image processing system, display method and imaging processing thereof |
| US8978075B1 (en) | 2012-01-18 | 2015-03-10 | Coincident.Tv, Inc. | Associating media using metadata and controlling multiple-device synchronization and rendering |
| US9204099B2 (en) | 2012-02-01 | 2015-12-01 | Magor Communications Corporation | Videoconferencing system providing virtual physical context |
| WO2013122385A1 (en) | 2012-02-15 | 2013-08-22 | Samsung Electronics Co., Ltd. | Data transmitting apparatus, data receiving apparatus, data transreceiving system, data transmitting method, data receiving method and data transreceiving method |
| KR101917174B1 (en) | 2012-02-24 | 2018-11-09 | Samsung Electronics Co., Ltd. | Method for transmitting stream between electronic devices and electronic device for the method thereof |
| JP5832339B2 (en) | 2012-03-04 | 2015-12-16 | Alpine Electronics, Inc. | Scale display method and apparatus for scaling operation |
| US9716856B2 (en) | 2012-03-07 | 2017-07-25 | Echostar Technologies L.L.C. | Adaptive bit rate transcode and caching for off air television programming delivery |
| US20130262997A1 (en) | 2012-03-27 | 2013-10-03 | Roku, Inc. | Method and Apparatus for Displaying Information on a Secondary Screen |
| US20130290848A1 (en) | 2012-04-27 | 2013-10-31 | Mobitv, Inc. | Connected multi-screen video |
| JP6361847B2 (en) | 2012-05-02 | 2018-07-25 | Sun Patent Trust | Image encoding method, image decoding method, image encoding device, and image decoding device |
| EP3796651A1 (en) | 2012-05-09 | 2021-03-24 | Sun Patent Trust | Method of performing motion vector prediction, encoding and decoding methods, and apparatuses thereof |
| US8708224B2 (en) * | 2012-05-25 | 2014-04-29 | Wesley John Boudville | Mobile device audio from an external video display using a barcode |
| JP6082190B2 (en) | 2012-05-31 | 2017-02-15 | Nintendo Co., Ltd. | Program, information processing apparatus, image display method, and display system |
| KR102073601B1 (en) * | 2012-07-25 | 2020-02-06 | Samsung Electronics Co., Ltd. | User terminal apparatus and control method thereof |
| JP6103857B2 (en) | 2012-08-28 | 2017-03-29 | Canon Inc. | Subject information acquisition apparatus, display method, and program |
| JP6025456B2 (en) | 2012-08-28 | 2016-11-16 | Canon Inc. | Subject information acquisition apparatus, display method, and program |
| US20150089372A1 (en) | 2012-09-18 | 2015-03-26 | General Instrument Corporation | Method of user interaction for showing and interacting with friend status on timeline |
| KR101936075B1 (en) * | 2012-09-21 | 2019-01-08 | Samsung Electronics Co., Ltd. | Method for displaying data of a display apparatus using a mobile communication terminal and the apparatuses |
| US20140096167A1 (en) | 2012-09-28 | 2014-04-03 | Vringo Labs, Inc. | Video reaction group messaging with group viewing |
| US20140098715A1 (en) | 2012-10-09 | 2014-04-10 | Tv Ears, Inc. | System for streaming audio to a mobile device using voice over internet protocol |
| US20140104137A1 (en) * | 2012-10-16 | 2014-04-17 | Google Inc. | Systems and methods for indirectly associating logical and physical display content |
| US9183558B2 (en) | 2012-11-05 | 2015-11-10 | Disney Enterprises, Inc. | Audio/video companion screen system and method |
| US9513795B2 (en) * | 2012-11-29 | 2016-12-06 | Blackberry Limited | System and method for graphic object management in a large-display area computing device |
| WO2014100374A2 (en) | 2012-12-19 | 2014-06-26 | Rabbit, Inc. | Method and system for content sharing and discovery |
| US20140282069A1 (en) | 2013-03-14 | 2014-09-18 | Maz Digital Inc. | System and Method of Storing, Editing and Sharing Selected Regions of Digital Content |
| WO2014175919A1 (en) | 2013-04-26 | 2014-10-30 | Intel IP Corporation | Shared spectrum reassignment in a spectrum sharing context |
| US9712266B2 (en) | 2013-05-21 | 2017-07-18 | Apple Inc. | Synchronization of multi-channel audio communicated over bluetooth low energy |
| US9100687B2 (en) | 2013-05-31 | 2015-08-04 | Sonic Ip, Inc. | Playback synchronization across playback devices |
| WO2014204227A1 (en) * | 2013-06-19 | 2014-12-24 | LG Electronics Inc. | Signal transmitting and receiving device and method of controlling said device |
| US10721530B2 (en) | 2013-07-29 | 2020-07-21 | Koninklijke Kpn N.V. | Providing tile video streams to a client |
| US9347789B2 (en) | 2013-09-18 | 2016-05-24 | Raymond Halsey Briant | Application and device to memorialize and share events geographically |
| US9644983B2 (en) | 2013-10-15 | 2017-05-09 | Apple Inc. | Simplified audio navigation instructions |
| US9210204B2 (en) | 2013-10-31 | 2015-12-08 | At&T Intellectual Property I, Lp | Synchronizing media presentation at multiple devices |
| US20150124171A1 (en) * | 2013-11-05 | 2015-05-07 | LiveStage°, Inc. | Multiple vantage point viewing platform and user interface |
| US9271048B2 (en) | 2013-12-13 | 2016-02-23 | The Directv Group, Inc. | Systems and methods for immersive viewing experience |
| US9386275B2 (en) | 2014-01-06 | 2016-07-05 | Intel IP Corporation | Interactive video conferencing |
| US20150242179A1 (en) * | 2014-02-21 | 2015-08-27 | Smart Technologies Ulc | Augmented peripheral content using mobile device |
| EP3162074A1 (en) | 2014-06-27 | 2017-05-03 | Koninklijke KPN N.V. | Determining a region of interest on the basis of a hevc-tiled video stream |
| US9384745B2 (en) | 2014-08-12 | 2016-07-05 | Nxp B.V. | Article of manufacture, system and computer-readable storage medium for processing audio signals |
| US10021346B2 (en) | 2014-12-05 | 2018-07-10 | Intel IP Corporation | Interactive video conferencing |
| US9743368B2 (en) | 2015-04-10 | 2017-08-22 | Time Warner Cable Enterprises Llc | Methods and apparatus for synchronized viewing experience across multiple devices |
| WO2018049321A1 (en) | 2016-09-12 | 2018-03-15 | Vid Scale, Inc. | Method and systems for displaying a portion of a video stream with partial zoom ratios |
- 2014
- 2014-05-06 US US14/271,156 patent/US20150253974A1/en not_active Abandoned
- 2014-05-06 US US14/271,282 patent/US9348495B2/en active Active
- 2014-05-07 US US14/271,685 patent/US11102543B2/en active Active
- 2016
- 2016-04-27 US US15/139,642 patent/US20160241902A1/en not_active Abandoned
Patent Citations (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20060050090A1 (en) * | 2000-03-16 | 2006-03-09 | Kamran Ahmed | User selectable hardware zoom in a video display system |
| US20080092172A1 (en) * | 2006-09-29 | 2008-04-17 | Guo Katherine H | Method and apparatus for a zooming feature for mobile video service |
| US20120319927A1 (en) * | 2008-12-02 | 2012-12-20 | Nvidia Corporation | Remote management of a simultaneous display of multimedia content in display devices |
| US20100302130A1 (en) * | 2009-05-29 | 2010-12-02 | Seiko Epson Corporation | Image display system, image display device, and image display method |
| US20120182203A1 (en) * | 2009-09-28 | 2012-07-19 | Kyocera Corporation | Display system and control method |
| US20110231791A1 (en) * | 2010-03-19 | 2011-09-22 | Seiko Epson Corporation | Image display system, graphical user interface, and image display method |
| US20120140117A1 (en) * | 2010-10-26 | 2012-06-07 | Bby Solutions, Inc. | Two-Sided Remote Control |
| US20130147832A1 (en) * | 2011-12-07 | 2013-06-13 | Ati Technologies Ulc | Method and apparatus for remote extension display |
| US20130290847A1 (en) * | 2012-04-30 | 2013-10-31 | Paul Hooven | System and method for processing viewer interaction with video through direct database look-up |
| US20130305138A1 (en) * | 2012-05-14 | 2013-11-14 | Pacsthology Ltd. | Systems and methods for acquiring and transmitting high-resolution pathology images |
| US20150208103A1 (en) * | 2012-08-08 | 2015-07-23 | National University Of Singapore | System and Method for Enabling User Control of Live Video Stream(s) |
| US20140253802A1 (en) * | 2013-03-11 | 2014-09-11 | Graham Clift | Electronic displays having paired canvases |
| US20140267026A1 (en) * | 2013-03-15 | 2014-09-18 | Brigham Young University | Handheld document reading device with auxiliary display |
| US20140344736A1 (en) * | 2013-05-20 | 2014-11-20 | Citrix Systems, Inc. | Bound Based Contextual Zoom |
| US20150213776A1 (en) * | 2014-01-24 | 2015-07-30 | Nvidia Corporation | Computing system and method for automatically making a display configuration persistent |
Non-Patent Citations (3)
| Title |
|---|
| Affinity Labs of Texas, LLC v. Amazon.com, CAFC Appeal No. 2015-2080 * |
| Guo US 20080092172 A1, hereinafter * |
| Lynch US 20130236158 A1, hereinafter * |
Cited By (23)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160050449A1 (en) * | 2014-08-12 | 2016-02-18 | Samsung Electronics Co., Ltd. | User terminal apparatus, display apparatus, system and control method thereof |
| US10928977B2 (en) * | 2015-02-09 | 2021-02-23 | Samsung Electronics Co., Ltd. | Mobile terminal and method of controlling medical apparatus by using the mobile terminal |
| US20160232314A1 (en) * | 2015-02-09 | 2016-08-11 | Samsung Electronics Co., Ltd. | Mobile terminal and method of controlling medical apparatus by using the mobile terminal |
| DE102016202694A1 (en) * | 2016-02-22 | 2017-08-24 | Siemens Aktiengesellschaft | Multi-display user interface and method for positioning content across multiple displays |
| WO2017217924A1 (en) | 2016-06-14 | 2017-12-21 | Razer (Asia-Pacific) Pte. Ltd. | Image processing devices, methods for controlling an image processing device, and computer-readable media |
| AU2016412141B2 (en) * | 2016-06-14 | 2022-03-03 | Razer (Asia-Pacific) Pte. Ltd. | Image processing devices, methods for controlling an image processing device, and computer-readable media |
| US11222611B2 (en) | 2016-06-14 | 2022-01-11 | Razer (Asia-Pacific) Pte. Ltd. | Image processing devices, methods for controlling an image processing device, and computer-readable media |
| EP3469547A4 (en) | 2016-06-14 | 2019-05-15 | Razer (Asia-Pacific) Pte Ltd. | Image processing devices, method for controlling an image processing device, and computer-readable medium |
| CN107329758A (en) * | 2017-06-30 | 2017-11-07 | Wuhan Douyu Network Technology Co., Ltd. | Page full-screen setting method, device and user terminal |
| CN108881957A (en) * | 2017-11-02 | 2018-11-23 | Beijing VisionVera International Information Technology Co., Ltd. | Method and device for mixing multimedia files |
| US11140339B2 (en) | 2017-11-14 | 2021-10-05 | Tencent Technology (Shenzhen) Company Limited | Video image processing method, apparatus and terminal |
| WO2019095979A1 (en) * | 2017-11-14 | 2019-05-23 | 腾讯科技(深圳)有限公司 | Video image processing method and apparatus, and terminal |
| CN108881927A (en) * | 2017-11-30 | 2018-11-23 | Beijing VisionVera International Information Technology Co., Ltd. | Video data synthesis method and device |
| CN108566480A (en) * | 2018-01-02 | 2018-09-21 | BOE Technology Group Co., Ltd. | Control method, apparatus and system for wearable device |
| US20190212901A1 (en) * | 2018-01-08 | 2019-07-11 | Cisco Technology, Inc. | Manipulation of content on display surfaces via augmented reality |
| CN111316224A (en) * | 2018-03-19 | 2020-06-19 | Guangzhou Shiyuan Electronic Technology Co., Ltd. | Data transmission device and data transmission method |
| CN109656654A (en) * | 2018-11-30 | 2019-04-19 | Xiamen Yilijiao Information Technology Co., Ltd. | Large-screen scene editing method and computer-readable storage medium |
| US11069142B2 (en) * | 2019-05-31 | 2021-07-20 | Wormhole Labs, Inc. | Using live feeds to produce 1st person, spatial, and real life views |
| US20210327152A1 (en) * | 2019-05-31 | 2021-10-21 | Wormhole Labs, Inc. | Multi-feed context enhanced imaging |
| US11663788B2 (en) * | 2019-05-31 | 2023-05-30 | Wormhole Labs, Inc. | Multi-feed context enhanced imaging |
| EP4287013A4 (en) | 2021-01-29 | 2024-07-24 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Screen projection display method and apparatus, mobile terminal, storage medium and program product |
| US12474886B2 (en) | 2021-01-29 | 2025-11-18 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Screen-projection displaying method, apparatus, mobile terminal, and program product |
| US11941317B2 (en) | 2021-07-26 | 2024-03-26 | Faurecia Clarion Electronics Co., Ltd. | Display controlling method |
Also Published As
| Publication number | Publication date |
|---|---|
| US20150256895A1 (en) | 2015-09-10 |
| US9348495B2 (en) | 2016-05-24 |
| US20160241902A1 (en) | 2016-08-18 |
| US20150256592A1 (en) | 2015-09-10 |
| US11102543B2 (en) | 2021-08-24 |
Similar Documents
| Publication | Title |
|---|---|
| US11102543B2 (en) | Control of large screen display using wireless portable computer to pan and zoom on large screen display |
| US10009658B2 (en) | Multiview TV template creation and display layout modification |
| JP6792214B2 (en) | Live interactive event display based on notification profile for display devices | |
| US20160227280A1 (en) | Content that reacts to viewers | |
| US9344766B2 (en) | User assigned channel numbering for content from multiple input source types | |
| US9478246B2 (en) | Providing audio video content during playback pause | |
| US20170311040A1 (en) | Social network screening for time-shifted content viewing | |
| CA2957181C (en) | Scene-by-scene plot context for cognitively impaired | |
| US9875694B2 (en) | Smoothing brightness transition during channel change | |
| US9875012B2 (en) | Media sharing between devices using drag and drop gesture | |
| US10915945B2 (en) | Method and apparatuses for intelligent TV startup based on consumer behavior and real time content availability | |
| US20160098180A1 (en) | Presentation of enlarged content on companion display device | |
| US10845954B2 (en) | Presenting audio video display options as list or matrix | |
| JP7462069B2 (en) | User selection of virtual camera positions for generating video using composite input from multiple cameras | |
| US20160216863A1 (en) | Corkscrew user interface linking content and curators | |
| US10373358B2 (en) | Edge user interface for augmenting camera viewfinder with information | |
| US20150332327A1 (en) | Retail store audio video feature demonstration system | |
| US10582264B2 (en) | Display expansion from featured applications section of android TV or other mosaic tiled menu | |
| US20210129033A1 (en) | Spectator feedback to game play |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: SONY CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YOUNG, DAVID ANDREW;LE, LOUIS;RICHMAN, STEVEN MARTIN;SIGNING DATES FROM 20140415 TO 20140505;REEL/FRAME:032833/0809 |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |