
US20150279037A1 - System and Method of Video Wall Setup and Adjustment Using Automated Image Analysis - Google Patents


Info

Publication number
US20150279037A1
Authority
US
United States
Prior art keywords
displays, images, ones, display, image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/595,203
Inventor
Timothy Griffin
Adam Ryan McDaniel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Userful Corp
Original Assignee
Userful Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Userful Corp filed Critical Userful Corp
Priority to US14/595,203
Publication of US20150279037A1
Status: Abandoned

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/3147Multi-projection systems
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/60Memory management
    • G06T7/0024
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/13336Combining plural substrates to produce large-area displays, e.g. tiled displays
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1438Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using more than one graphics controller
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1446Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/003Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • G09G5/006Details of the interface to the display terminal
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/12Synchronisation between the display unit and other units, e.g. other display units, video-disc players
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20021Dividing image into blocks, subimages or windows
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0693Calibration of display systems
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/14Solving problems related to the presentation of information to be displayed
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/02Networking aspects
    • G09G2370/022Centralised management of display operation, e.g. in a server instead of locally
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • G09G2370/042Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller for monitor identification
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/06Consumer Electronics Control, i.e. control of another device by a display or vice versa
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/16Use of wireless transmission of display information

Definitions

  • A video wall might be composed of a 3-by-3 array of nine monitors, each monitor simultaneously displaying a segment of a single image, thereby creating the appearance of a single large display made up of rectangular portions.
  • the present invention relates generally to improving the setup and operation of large, network-addressable video-wall displays.
  • a video-wall display system overcomes the cost of manufacturing and installing very large displays by assembling a large display from multiple smaller displays arranged to work together. By dividing a single image into several sub-images and displaying the sub-images on an appropriately arranged array of display devices, a larger display with higher resolution can be created.
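The division described above can be sketched as a simple grid-split computation. A minimal illustration (the function name and tile arithmetic are ours, not the patent's; bezel compensation is ignored here):

```python
def split_canvas(canvas_w, canvas_h, cols, rows):
    """Divide a video-wall canvas into per-display crop rectangles.

    Returns a list of (x, y, width, height) tuples in row-major order,
    one per display in a cols-by-rows grid.
    """
    tile_w, tile_h = canvas_w // cols, canvas_h // rows
    return [(c * tile_w, r * tile_h, tile_w, tile_h)
            for r in range(rows) for c in range(cols)]
```

For a 3840 × 2160 canvas split across a 3 × 3 wall, each display would receive a 1280 × 720 sub-image.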
  • the set-up of the output displays is critical and their fine tuning can be laborious. Informing the server of the initial positioning of each display (so that the image segments are sent to the appropriate displays); precisely cropping each of the sub-images (to allow the eye to interpret continuity of the total image across the bezels of the displays, where no image can appear); and adjusting the color of the sub-segments of the image to provide equal luminosity, color, and intensity/brightness ranges across the whole array of displays are all essential to providing the optimal viewing experience. This invention offers methods of automating the setup process to improve the ease and speed of video-wall setup.
  • a video wall server splits source-video into sub-images and distributes these sub-images to multiple listening display devices.
  • Built-in algorithms optimize, parse and scale the individual video-wall segments.
  • To accomplish this splitting efficiently, it is beneficial to create a configuration file, stored in a computer-readable medium, using information on the position, configuration, and settings of each individual physical display and how they relate to the video-wall canvas.
  • Using such a configuration file allows the video wall server to efficiently create a seamless canvas across the display units.
  • This invention deals with methods of supplying the information for the creation of such files by means of feedback based on test canvases, sequentially changing the configuration file and redeploying a test canvas to further improve the overall viewer image.
  • This invention provides methods of equipping the server with such a configuration file.
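One plausible shape for such a configuration file is a JSON document mapping each display to its region of the canvas. The field names below (canvas, displays, rotation, bezel_px) are illustrative assumptions; the patent does not specify a format:

```python
import json

# Hypothetical configuration for two displays of a wall; the field
# names are illustrative only, not taken from the patent.
config = {
    "canvas": {"width": 3840, "height": 2160},
    "displays": [
        {"id": "d1", "x": 0,    "y": 0, "w": 1920, "h": 1080,
         "rotation": 0, "bezel_px": 12},
        {"id": "d2", "x": 1920, "y": 0, "w": 1920, "h": 1080,
         "rotation": 0, "bezel_px": 12},
    ],
}

# The server would persist this to a computer-readable medium
# and reload it when composing the canvas.
text = json.dumps(config, indent=2)
restored = json.loads(text)
```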
  • the automated methods presented to achieve the five types of adjustments outlined above typically involve a user interacting with the server via a GUI containing instructions, together with a camera in communication with the server.
  • the user would have a smart-phone, tablet, laptop or similar device, interacting with the server via the web.
  • the user gives permission to the server to use that camera to obtain digital images of the canvas as displayed across the video wall, and the server gives instructions to the user about positioning the camera and, where required, supplying an eye-based evaluation of the correctness of any changes made to the displays.
  • the server knows (via DPMS and EDID) certain details about each display (aspect ratio, number of pixels, etc.). Using these in conjunction with the image captured from the camera gives a unique ability to identify the exact positioning of the display.
  • The ordering and overall shape: once the display units have been mounted to form the wall and connected to the server, the server will know the number of display units involved and will analyze for shape. This can be accomplished by sending each display a unique computer-recognizable image. This could, for example, be a specialized “bar code” designed for image recognition software (similar to QR codes). The image should have special symbols used to identify the exact spatial location of the corner pixels of each display. Next, a message would be sent requesting the user to point the camera at the displays in the wall.
  • Digital analysis of the image, in comparison to the information as displayed, allows the server to determine which displays are in the wall (some may be displaying in a different room), to identify the geometric placement of the displays (rectangular, linear, or “artistic”, meaning an informal non-geometric setup), and to determine the position in which each signal sent appears in the display (which Ethernet or other connection leads to each display position). In addition, it determines the rotation (whether the images need to be rotated through 90 or 180 degrees, and what rotations are needed for non-standard setups).
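The geometric-placement step can be illustrated by sorting detected marker centroids into grid order. A sketch under the assumption that image analysis has already yielded one (display_id, cx, cy) centroid per recognized marker (the function and grouping heuristic are ours, not the patent's):

```python
def order_into_grid(detections, rows):
    """Sort detected displays into row-major grid order.

    detections: list of (display_id, cx, cy) marker centroids in
    camera-image coordinates. Displays are grouped into `rows`
    horizontal bands by vertical position, then sorted left-to-right
    within each band.
    """
    by_y = sorted(detections, key=lambda d: d[2])
    per_row = len(detections) // rows
    grid = []
    for r in range(rows):
        band = by_y[r * per_row:(r + 1) * per_row]
        grid.extend(sorted(band, key=lambda d: d[1]))
    return [d[0] for d in grid]
```

This simple banding only suits roughly rectangular walls; the “artistic” layouts the patent mentions would need the per-corner marker positions instead.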
  • the server would re-adjust the canvas presented across the screens and instruct the user to ready the camera for another image. This correction process would continue until the server's digital analysis was satisfied with the overall alignment; in addition, it might ask for a by-eye evaluation to confirm the result.
  • Interactive fine tuning of placement and rotation: generally, the canvas on the video wall will appear to be interrupted by the bezels making up the edges of each display monitor.
  • the fine tuning is used to minimize the bezel effect by appropriately moving each of the displays a few pixel widths horizontally or vertically. For example, this could be achieved by displaying a test canvas of diagonal lines on the video wall.
  • the digital analysis, being aware of the exact location of these lines in the canvas sent to the displays, can examine the lines in the digital image very precisely for alignment and by calculation measure the number of pixels each display must be moved vertically or horizontally to achieve perfect alignment. Once these corrections have been made and a new canvas displayed, it can be checked digitally and by eye.
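The offset calculation described above can be sketched as comparing sampled points along the expected and observed test lines. A minimal illustration (pairing points by index and averaging is our assumption about how the measurement might be reduced to a single correction):

```python
def required_shift(expected_pts, observed_pts):
    """Pixels a display must be moved vertically so the observed
    diagonal-line samples line up with the expected ones.

    Each argument is a list of (x, y) points; points are paired by
    index (i.e., sampled at the same x positions along the test line).
    """
    diffs = [oy - ey for (_, ey), (_, oy) in zip(expected_pts, observed_pts)]
    mean_offset = sum(diffs) / len(diffs)
    # Move the display opposite to the observed error.
    return -round(mean_offset)
```

The same calculation applied to horizontal offsets (comparing x values at fixed y) would give the horizontal correction.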
  • Adjusting color intensity across the canvas: in a typical embodiment, the next stage would be to check for color.
  • the canvas might be such that each display contains the same pattern of rectangles, each of a different color (perhaps red, blue, and green), displayed at a range of intensities.
  • the analysis covers each color/intensity combination across all of the displays, so that any fine difference in the treatment of a particular combination can be adjusted for.
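One way this per-channel matching could be reduced to adjustments is to scale every display toward the dimmest display's achievable peak, so identical input values produce identical output. A sketch under that assumption (the data shape and strategy are ours, not the patent's):

```python
def match_peak_levels(measured_peaks):
    """Per-display gain factors for one color channel so every
    display's peak matches the dimmest display's achievable peak.

    measured_peaks: {display_id: peak_level} taken from the captured
    photo of a uniform full-intensity test patch on each display.
    """
    floor = min(measured_peaks.values())
    return {d: floor / peak for d, peak in measured_peaks.items()}
```

Running this per channel (red, green, blue) and per intensity step would yield the fine per-combination corrections the text describes.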
  • Other tests of a similar nature can be used for particular differences between displays.
  • moving images output to the video-wall (for example, lines moving across the video-wall canvas) are captured and communicated in real time by the camera, and image analysis software interprets the captured frames to determine positioning.
  • Stage-wise process: the methods outlined above are carried out in stages, and at each stage the configuration file used by the server is updated based on the newly calculated adjustments, so that the end result is a file that can be used to ensure correct display of any video file presented to the server.
  • Color calibration can be achieved in two possible ways.
  • color calibration is done by controlling monitor settings via the centralized server software, which is in communication with the display settings (potentially via an RS232 or other interface), while a uniform image canvas is output to the display.
  • color adjustments are stored in the server software and applied by the server as the image is output to the display itself. In the first of these cases the display settings are permanently stored on the server in a configuration file.
  • the same color is output on each of the displays within the video-wall and after each change in the display, an image is captured for analysis.
  • This image analysis detects relative differences between the displays and adjusts color output characteristics on individual displays within the video-wall, successively adjusting hue, intensity, and brightness of the individual images so that the same high and low values are achievable by each of the individual displays within the video-wall, making the fine adjustments necessary to the color output characteristics and settings of each individual display.
  • the computer-recognizable images output to each of the displays include a right-angled line in various corners of the displays comprising the video-wall, to aid in detecting the exact placement of these corners in relation to other display corners within the video-wall.
  • component displays within the video-wall provide instructions to the user on how to connect their camera to the display (for example, by providing a URL to visit on their network-connected or Internet-connected device).
  • Visual prompting and status indicators to assist during video-wall setup.
  • As displays are linked into a video-wall, it is helpful for the individual setting up the video-wall to receive visual feedback from the displays themselves as screens are added to or removed from the video-wall.
  • visual status indicators show progress as each display's position within the video-wall is successfully identified and the display is “linked into” the video-wall. For example, a line, pattern, color change, picture, or animated effect is used to differentiate monitors which have been added or positioned within the video-wall from those that haven't.
  • a different status indicator, such as an image, icon, or input prompt, could be output to those displays which are receiving output from the video-wall server but are still awaiting placement/assignment/relative-positioning within the video-wall.
  • a status indicator shows that the edges of both displays have been successfully linked.
  • the full video-wall will show a visual success image indicator spanning the full video-wall.
  • the digital camera device, in addition to the image data, also provides meta-data about the image, such as camera orientation, detected ambient light, detected distance from the subject, focal length, shutter speed, flash settings, camera aperture, detected camera rotation angle relative to the horizon, and GPS location. This additional data can be used to increase the accuracy or speed of image analysis or provide additional details about the video wall.
  • a smart phone or other mobile device with an embedded camera device is in communication wirelessly with a video wall control server (which is in turn in communication with the video wall displays).
  • the video wall control server outputs one or more optimized configuration images to the video-wall displays.
  • Application code executed on the mobile device (either by the browser or by a native mobile device application) captures image data from said camera (this could be a still image, a stream of video data, or a sequence of still images) and forwards this image data over a wireless connection to the server.
  • An image analysis module processes the captured image data.
  • the automated image analysis module is able to determine any adjustments required for mappings of various ones of the displays captured in the image and subsequently translate these adjustments into changes to the video wall configuration mapping file(s) or data stores.
  • the updated mapping would then be communicated by the control module to the server; in response to these changes, the server advances to the next test image or sequence, repeating any failed steps as necessary and moving to subsequent configuration tests as successful calibration of each unique configuration image is achieved.
  • the user is visiting a web-page with their mobile device (equipped with a camera), and the server is a web-server. That web-server is also in communication with (able to send controlling signals to) the displays comprising the video wall. The displays are controlled by the web-server to output the configuration images.
  • the embodiments might employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. Further, the manipulations performed are often referred to in terms, such as producing, identifying, determining, or comparing. Any of the operations described herein that form part of the embodiments are useful machine operations.
  • the embodiments also relate to a device or an apparatus for performing these operations.
  • the apparatus can be specially constructed for the required purpose, or the apparatus can be a general-purpose computer selectively activated or configured by a computer program stored in the computer.
  • various general-purpose machines can be used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
  • the embodiments can also be embodied as computer readable code on a computer readable medium.
  • the computer readable medium is any data storage device that can store data, which can be thereafter read by a computer system. Examples of the computer readable medium include hard drives, solid state drives (SSD), network attached storage (NAS), read-only memory, random-access memory, Optical discs (CD/DVD/Blu-ray/HD-DVD), magnetic tapes, and other optical and non-optical data storage devices.
  • the computer readable medium can also be distributed over a network coupled computer system so that the computer readable code is stored and executed in a distributed fashion.
  • Embodiments described herein may be practiced with various computer system configurations including hand-held devices, tablets, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like.
  • the embodiments can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a wire-based or wireless network.
  • FIG. 1 illustrates the basic problem to be solved.
  • FIG. 2 illustrates the need for precise mapping, placement and bezel correction of displays when creating a video wall
  • FIG. 3 shows a schematic diagram depicting a 2 × 2 video wall
  • FIG. 4 illustrates in detail camera use in combination with uniquely identifying output images.
  • FIG. 5 shows a process flow-chart of automated video-wall calibration.
  • FIG. 6 illustrates the display adjustment process
  • FIG. 7 illustrates a specific embodiment of the whole process
  • FIG. 8 is the flow diagram for an image analysis and detection module.
  • FIG. 1 illustrates the basic problem. It shows a complex video wall layout (multiple displays arranged artistically at multiple angles). It also shows how the image output to each of these displays has been correctly aligned and correctly color calibrated so that the image displays correctly and evenly across all screens regardless of their placement, rotation, spacing, bezel, etc.
  • the 9 displays comprising the video wall are showing a test pattern illustrating the precise placement of the artistic display orientations within a video-wall.
  • Once the physical video-wall displays have been installed, the mapping and placement of these individual displays within the video wall canvas ( 10 ) on the video-wall server can be accomplished by multiple means; however, disclosed herein is an automated method for accomplishing this process, adaptable even to complex non-standard video wall layouts such as this.
  • FIG. 2 illustrates the need for precise mapping, placement and bezel correction of displays when creating a video wall.
  • the figure again shows a 9-screen video wall, this time arranged in a 3 × 3 grid configuration.
  • the output image shown on the two video walls depicted is again an image of diagonal straight lines which have been drawn to span the whole video-wall canvas.
  • the pattern is interrupted by the bezel edges of the nine displays which form pairs of horizontal and vertical interruption bands.
  • FIG. 3 shows a schematic diagram depicting a 2 × 2 video wall comprising four independent displays working together to output a video wall that has been configured to correctly compensate for the display bezels. Also shown is a tablet computing device with a built-in camera ( 31 ) which is being operated by a user ( 32 ) to capture an image of all four of the displays comprising the video wall (along with their output).
  • FIG. 4 illustrates in more detail the method of using a camera in combination with uniquely identifying output images along with automated image analysis to inform the server both of the identity of the individual displays in the video wall as well as the precise relative position and size of the displays in the video-wall.
  • the video-wall server has output a different QR code to each of the nine displays comprising the video wall, as seen in the top half of the figure ( 41 ). Also shown is an enlarged view of one single display with the QR code ( 45 ) and corner markers ( 46 ), which would be a potential cue used as part of the system to help spatially locate edges and corners correctly through the image capture and analysis process.
  • Also shown schematically is a smart-phone or tablet equipped with a built-in camera ( 42 ) being used to capture images of the real-world video-wall output and relay them back to the video-wall web-server, allowing the video-wall server, through automated image recognition and analysis of the captured rendition of the output images, to correctly align and place the individual sub-image segments (and displays) within the video-wall canvas, creating a mapping for use when outputting to the video wall.
  • different images are output at different stages in the automated calibration process. For example, at the next stage the server might output a line pattern for precise calibration and use the image capture of that line pattern across the displays to automatically make the fine adjustments needed to allow for bezel corrections.
  • a color adjustment image (e.g., uniform colors across all displays) might be output across the video-wall, using feedback from image analysis of the camera-captured image to assess differences in color output across the various displays comprising the video wall.
  • a second video wall is depicted in the lower half of FIG. 4 showing a non-rectangular video-wall ( 43 ) this time with a different type of identification and calibration image differentiating individual displays. Again the image of the video-wall is captured by a camera, in this case depicted as a mobile phone or tablet ( 44 ).
  • FIG. 5 shows a process flow-chart of automated video-wall calibration ( 50 ).
  • the video-wall system is configured to output unique images to each of the unassigned displays within the system (some of which may be arranged into a video-wall).
  • the user is then prompted (either on the displays or from within an administrative GUI potentially accessed from a mobile device such as a tablet, smart-phone, or laptop) to take a photo of the video-wall (step 52 ).
  • the administrative web-application may request permission to directly access the mobile device's camera.
  • the user then takes a photo of the video-wall and transfers this to the server for automated analysis (step 53 ).
  • the server determines which display is arranged where within the video-wall, and also determines the exact spacing, rotation, and placement of the displays by recognizing the distinctive identifying images in the photo and by calculating the distances between each adjacent display edge within the photo (step 54 ). This information is then used to create a video-wall configuration file which is stored in a computer readable medium (step 55 ).
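Rotation, in particular, can be inferred from the detected positions of two corner markers of the same display. A small sketch (the function and its two-corner convention are illustrative assumptions, not the patent's method):

```python
import math

def display_rotation_deg(top_left, top_right):
    """Rotation of a display inferred from the camera-image
    coordinates of its detected top-left and top-right corner markers.

    Returns degrees; 0 means the display's top edge is level in the
    captured photo (camera tilt would need separate compensation).
    """
    dx = top_right[0] - top_left[0]
    dy = top_right[1] - top_left[1]
    return math.degrees(math.atan2(dy, dx))
```

Spacing can be derived similarly, from the pixel distance between adjacent detected edges scaled by the known physical display size.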
  • a test pattern is then output to each of the displays comprising the video-wall (the displays that were contained within the photo taken by the administrator in step 52 ) ( 56 ), and the accuracy of the completed step is checked and reported either by an automated process or by a human observer ( 57 ). Subsequently, further adjustments to this pattern are made if it is not satisfactory, and a sequence of new test patterns is analyzed ( 58 ), continuing until the total result is optimal and the process ends ( 59 ).
  • FIG. 6 illustrates the display adjustment process ( 60 ).
  • the server outputs the appropriate (as needed at this stage in the process) unique identification images or color calibration image(s) to all the screens in the array ( 61 )
  • the user captures a live image of the screens (as requested by the server) ( 62 ). These are transferred to the server and analyzed ( 63 ). If both the user and server regard the results as satisfactory, this step in the adjustment process ends ( 66 ); otherwise, the server adjusts the settings and/or relationships between displays in the array based on the results of automated image analysis ( 65 ) and outputs the appropriate images at ( 61 ) once more.
  • FIG. 7 is an illustration of one specific embodiment of the whole video-wall system process of set-up and use with a networked video wall using zero clients.
  • the user using a browser device with a camera authenticates as a user and authorizes the web calibration process ( 70 ).
  • the server first discovers and connects to the zero-client devices over the network, builds a list of displays, and assigns a unique identity to each display ( 71 ). Next it collects the available display resolutions and other available settings from all the connected displays ( 72 ). Then the automated setup process is launched, beginning with a browser-accessible GUI containing instructions for the user, launched on the user's web-browser device with an embedded camera ( 73 ).
  • the web-server requests permission to access a camera (if permission is not granted, the system will fall back to manual calibration methods). Once permission has been granted, the user will be provided with instructions and real-time feedback as needed throughout the whole process to assist them in correctly capturing an image of the video wall (e.g., it may provide screen recognition features to ensure all screens have been captured from a usable angle).
  • the process outputs unique calibration image optimized for automated image recognition to each display (e.g., QR codes with corner markers for initial identification image) ( 74 ). Capture image(s) and send to web-server for automated image analysis adjusting identification, positioning, and calibration settings and as required ( 75 ).
  • the camera is controlled (either by the user or the web-server) to capture one or more images of the video wall of sufficient resolution to perform the required analysis (e.g., via html media capture or similar method).
  • the images are transmitted to the web-server (wirelessly) for analysis by image analysis module (see flow-chart in FIG. 8 ). Further image data from the camera may also sent (such as orientation, ambient light, GPS, etc.).
  • the preliminary steps in the setup of the video wall system are now complete and the system is ready to process and deliver content to the video wall displays. It can now receive content for display via the primary GPU processing application output frame by frame to the frame buffer ( 78 ), and process (e.g., crop/split/rotate/resize/color-convert) based on stored identity, placement, and calibration settings individual sub-image portions ( 79 ) to be encoded and sent to the appropriate devices for output to the appropriate secondary display adapters which in turn outputs the transformed image data to the corresponding displays ( 710 ), together creating displaying the video wall image across all displays. This decoding and displaying process is continued while the video-wall is in use ( 711 ), and ends when terminated ( 712 ).
  • process e.g., crop/split/rotate/resize/color-convert
  • FIG. 8 is the flow diagram for an image analysis and detection module.
  • the process begins at ( 80 ) and starts by receiving image(s) from camera device and the display information appropriate to the displays that formed the canvas for the camera ( 81 ). It analyzes the newly received image for recognized markers and match to existing data ( 82 ). It performs initial checks on the image and provides error messages to the user as required (e.g., does the number of displays in the captured image match the number of detected displays in communication with the web-server? Is the angle, clarity, and resolution of the image sufficient for automated detection routines?) ( 83 ).
  • the process matches the geometrical model of the video wall (either pre-existing model built at an earlier stage in the automated setup process or built by combining the data retrieved from output to the displays with the display positioning and sizing information obtained through image analysis ( 84 ).
  • the process isolates the captured display area of each detected display and analyzes the image data from each display ( 85 ); and utilizing the spatial information from the captured image automatically determines the placement of this display ( 86 ).


Abstract

A system is disclosed for identifying, placing and configuring a physical arrangement of a plurality of displays via image analysis of captured digital camera images depicting unique configuration images output to said displays, to facilitate uniform operation of said plurality of displays as a single display area, for example as a video wall. The system pairs and configures displays depicted in the captured images to individual displays within the physical arrangement through controlling and analyzing the output of said displays captured in said images. A method and computer readable medium are also disclosed that operate in accordance with the system.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims priority from U.S. Provisional Patent Application No. 61/926,295 filed on Jan. 12, 2014, which is hereby incorporated by reference.
  • FIELD OF INVENTION
  • Large electronic displays may be formed from an array of monitors referred to as a “video-wall”. For example, a video-wall might be composed of a 3-by-3 array of nine monitors, each monitor simultaneously displaying a segment of a single image, thereby creating the appearance of a single large display composed of rectangular portions.
  • The present invention relates generally to improving the setup and operation of large displays and particularly to network addressable video-wall displays.
  • BACKGROUND OF THE INVENTION
  • The present invention relates generally to improving the setup and operation of video-wall displays and particularly to network addressable displays.
  • A video-wall display system is a method to overcome the costs of manufacturing and installing very large displays, by assembling a large display using multiple smaller displays arranged and working together. By dividing a single image into several sub-images and displaying the sub-images on an appropriately arranged array of display devices a larger display with higher resolution can be created.
  • Because the plurality of display devices needs to be operated together to display a single image or canvas across a video-wall (rather than a separate independent image for each display), the set-up of the output displays is critical and their fine-tuning can be laborious. Informing the server of the initial positioning of each display (so that the image segments are sent to the appropriate displays); the precise cropping of each of the sub-images (to allow the eye to interpret continuity of the total image across the bezels of the displays, where no image can appear); and the adjustment of the color of the sub-segments of the image to provide equal luminosity, color and intensity/brightness ranges across the whole array of displays within the video-wall are all essential to providing the optimal viewing experience. With conventional approaches to video-wall setup these tasks can be laborious. This invention offers methods of automating the setup process to improve the ease and speed of video-wall setup.
  • DESCRIPTION OF THE INVENTION
  • A video wall server splits source video into sub-images and distributes these sub-images to multiple listening display devices. Built-in algorithms optimize, parse and scale the individual video-wall segments. To accomplish this splitting efficiently it is beneficial to create a configuration file, stored in a computer readable medium, using information on the position, configuration and settings of each individual physical display and how they relate to the video-wall canvas. Using such a configuration file allows the video wall server to efficiently create a seamless canvas across the display units. This invention deals with methods of supplying the information for the creation of such files by means of feedback based on test canvasses, and of sequentially changing the configuration file before redeploying a test canvas to further improve the overall viewer image.
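As a concrete illustration, such a configuration file might be organized along the following lines. The JSON layout and field names here are hypothetical, not drawn from the patent itself:

```python
import json

# Hypothetical configuration schema (field names are illustrative):
# one entry per physical display, giving its placement within the
# overall video-wall canvas.
config = {
    "canvas": {"width": 3840, "height": 2160},
    "displays": [
        {"id": "display-1", "x": 0, "y": 0, "width": 1920,
         "height": 1080, "rotation": 0, "bezel_px": 24},
        {"id": "display-2", "x": 1920, "y": 0, "width": 1920,
         "height": 1080, "rotation": 0, "bezel_px": 24},
    ],
}

# The server would persist this to a computer readable medium...
text = json.dumps(config, indent=2)

# ...and reload it whenever it splits source video into sub-images.
loaded = json.loads(text)
for display in loaded["displays"]:
    print(display["id"], display["x"], display["y"])
```

Each calibration stage would rewrite only the fields it refines (placement first, then bezel offsets, then color settings), leaving the rest of the file intact.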
  • Configuration of Displays: This invention provides methods of equipping the server with a configuration file containing:
      • the overall shape of the video wall;
      • the ordering of the sub-images within the video wall;
      • any further rotation or displacement of displays required to form the appropriate canvas on the video wall;
      • interactive fine-tuning of the positioning and bezel width of the displays to achieve perfect alignment across display monitor bezels;
      • adjustment of the color intensity of displays to achieve uniform color across the video-wall;
        Once this information is established it is stored in the server's configuration files.
  • The methods presented to achieve the five types of adjustments outlined above by automation typically involve a user interacting with the server via a GUI containing instructions, and a camera in communication with the server. In a typical usage the user would have a smart-phone, tablet, laptop or similar device interacting with the server via the web. The user gives the server permission to use that camera to obtain digital images of the canvas as displayed across the video wall, and the server gives the user instructions about positioning the camera and, where required, asks the user to supply eye-based evaluation of the correctness of any changes made to the displays.
  • The server knows (via DPMS and EDID) certain details about each display (aspect ratio, number of pixels, etc.). Using these in conjunction with the image captured from the camera gives a unique ability to identify the exact positioning of each display.
  • The ordering and overall shape. Once the display units have been mounted to form the wall and connected to the server, the server will know the number of display units involved and will analyze for shape. This can be accomplished by sending each display a unique computer-recognizable image. This could, for example, be a specialized “bar code” designed for image recognition software (similar to a QR code). The image should have special symbols used to identify the exact spatial location of the corner pixels of each display. Next a message would be sent requesting the user to point the camera at the displays in the wall. Digital analysis of the image, in comparison to the information as displayed, allows the server to determine which displays are in the wall (some may be displaying in a different room), to identify the geometric placement of the displays (rectangular, linear or “artistic”, meaning an informal non-geometric setup), and the position in which each signal sent appears in the display (which Ethernet or other connection leads to each display position). In addition it determines the rotation (whether the images need to be rotated through 90 or 180 degrees, and what rotations are needed for non-standard setups).
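The ordering step above can be sketched as follows. This assumes an upstream marker detector (not shown) has already mapped each display's unique identification image to a centroid in photo coordinates; the row-clustering tolerance is an illustrative parameter:

```python
# Sketch: infer the grid ordering of displays from marker centroids
# detected in the captured photo. Assumes an upstream detector has
# already paired each unique identification image with its centroid
# (pixel coordinates in the photo); detection itself is not shown.
def infer_grid(centroids, row_tolerance=50):
    """centroids: dict of display_id -> (x, y). Returns rows of ids,
    top-to-bottom, each sorted left-to-right."""
    rows = []
    for disp_id, (x, y) in sorted(centroids.items(), key=lambda kv: kv[1][1]):
        for row in rows:
            # Join an existing row if vertically close to its first member.
            if abs(row[0][1][1] - y) <= row_tolerance:
                row.append((disp_id, (x, y)))
                break
        else:
            rows.append([(disp_id, (x, y))])
    return [[d for d, _ in sorted(row, key=lambda kv: kv[1][0])] for row in rows]

# A 2x2 wall whose cable order does not match its physical order:
detected = {"hdmi-3": (100, 90), "hdmi-1": (600, 110),
            "hdmi-4": (110, 500), "hdmi-2": (610, 490)}
print(infer_grid(detected))  # [['hdmi-3', 'hdmi-1'], ['hdmi-4', 'hdmi-2']]
```

The result tells the server which physical connection feeds which wall position, regardless of the order in which displays were cabled.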
  • Once the digital image analysis has been completed, the server would re-adjust the canvas presented across the screens and instruct the user to ready the camera for another image. This correction process would continue until the server's digital analysis was satisfied with the overall alignment; in addition it might ask for by-eye evaluation to confirm the result.
  • Interactive fine-tuning of placement and rotation. Generally the canvas on the video wall will appear to be interrupted by the bezels making up the edges of each display monitor. The fine-tuning is used to minimize the bezel effect by appropriately moving each of the displays a few pixel widths horizontally or vertically. For example, this could be achieved by displaying a test canvas of diagonal lines on the video wall. The digital analysis, being aware of the exact location of these lines in the canvas sent to the displays, can examine the lines in the digital image very precisely for alignment and by calculation measure the number of pixels each display must be moved vertically or horizontally to achieve perfect alignment. Once these corrections have been made and a new canvas displayed, it can be checked digitally and by eye.
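The shift calculation can be sketched in a simplified one-dimensional form (the function name and the simple bezel geometry are illustrative assumptions, not the patent's exact method):

```python
# Sketch of the bezel fine-tuning calculation (simplified 1-D case).
# A diagonal test line with known slope crosses two vertically adjacent
# displays; comparing where the line leaves the upper display with
# where it re-enters the lower one yields the pixel shift needed.
def required_shift(x_exit, x_reenter, slope, bezel_px):
    """x_exit: line x-position at the bottom edge of the upper display,
    x_reenter: observed line x-position at the top edge of the lower display,
    slope: horizontal pixels advanced per vertical pixel,
    bezel_px: vertical gap (in canvas pixels) covered by the two bezels."""
    # For visual continuity the line must advance across the hidden
    # bezel region as if it had been drawn there.
    x_expected = x_exit + slope * bezel_px
    return round(x_expected - x_reenter)  # shift for the lower display

# Line advances 1 px right per px down; 30 px of bezel between rows:
print(required_shift(x_exit=500, x_reenter=510, slope=1.0, bezel_px=30))  # 20
```

A zero result means the lower display's sub-image is already correctly offset for the bezel; a non-zero result is folded back into the configuration file before the next test canvas is shown.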
  • Adjusting color intensity across the canvas. In a typical embodiment the next stage would be to check color. The canvas might be such that each display contains the same pattern of rectangles, each of a different color (perhaps red, blue and green), displayed at a range of intensities. The analysis is then of each color intensity across all of the displays, so that any fine distinction in the treatment of a particular color/intensity combination can be adjusted for. Other tests of a similar nature can be used for particular differences between displays.
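One way to turn such per-display measurements into corrections, as a rough sketch (the multiplicative gain model and the measured values are illustrative assumptions):

```python
# Sketch: equalize one color channel across displays, given the mean
# patch brightness measured for each display from the captured photo.
def channel_gains(measured, target=None):
    """measured: dict display_id -> mean measured intensity (0-255).
    Returns multiplicative gains that bring every display down to the
    dimmest display's level (the only level all can reach by scaling)."""
    if target is None:
        target = min(measured.values())
    return {d: target / v for d, v in measured.items()}

# Mean red-patch intensities measured from one captured image:
measured_red = {"display-1": 200.0, "display-2": 180.0, "display-3": 190.0}
gains = channel_gains(measured_red)
print({d: round(g, 3) for d, g in gains.items()})
```

Repeating the measurement for each channel and intensity step yields the per-display, per-channel corrections described above; the capture/adjust cycle then repeats until the differences fall below a chosen threshold.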
  • In an alternative and potentially complementary method of calibration, a moving image output to the video-wall (for example, horizontal and vertical lines moving across the video-wall canvas) is captured and communicated in real time by the camera, and image analysis software interprets the captured frames to determine positioning.
  • The stage-wise process. The methods outlined above are carried out in stages, and at each stage the configuration file being used by the server is updated based on the newly calculated adjustments, so that the end result is a file that can be used to produce optimal display of any video file presented to the server. Color calibration can be achieved in two possible ways.
  • In one embodiment of the invention color calibration is done by controlling monitor settings, the centralized server software being in communication with the display settings (potentially via an RS232 or other interface), and a uniform image canvas is output to the display. In an alternative embodiment color adjustments are stored in the server software and are applied by the server to the image as it is output to the display itself. In the first of these cases the display settings are permanently stored on the server in a configuration file.
  • In one realization the same color is output on each of the displays within the video-wall, and after each change in the display an image is captured for analysis. This image analysis detects relative differences between displays and adjusts the color output characteristics of individual displays within the video-wall, successively adjusting hue, intensity and brightness of the individual images so that the same high and low values are achievable by each of the individual displays within the video-wall, making the fine adjustments necessary to the color output characteristics and settings of each individual display.
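The "same high and low values achievable by each display" idea can be sketched as follows: find the brightness range every display can reproduce, then derive a per-display linear remap onto that common range (the measured black/white levels and the linear model are illustrative assumptions):

```python
# Sketch of matching achievable high/low levels across displays.
def common_range(levels):
    """levels: dict display_id -> (measured_black, measured_white).
    The common range runs from the brightest black to the dimmest white,
    since those are the extremes every display can actually reach."""
    low = max(black for black, _ in levels.values())
    high = min(white for _, white in levels.values())
    return low, high

def remap(levels):
    """Per-display (scale, offset) mapping each display's native range
    onto the common range shared by all displays."""
    low, high = common_range(levels)
    adjustments = {}
    for disp_id, (black, white) in levels.items():
        scale = (high - low) / (white - black)
        offset = low - black * scale
        adjustments[disp_id] = (scale, offset)
    return adjustments

# Measured black/white levels for two displays (illustrative):
levels = {"display-1": (5.0, 245.0), "display-2": (10.0, 250.0)}
print(common_range(levels))  # (10.0, 245.0)
```

After the remap, both displays render the test canvas's darkest and brightest patches at the same measured levels, which is what the successive capture-and-adjust cycle converges toward.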
  • In one embodiment of the invention the computer-recognizable images output to each of the displays include a right-angled line in various corners of the displays comprising the video-wall, to aid in detecting the exact placement of these corners in relation to other display corners within the video-wall.
  • In another embodiment of the invention, component displays within the video-wall provide instructions to the user on how to connect their camera to the display (for example by providing a URL to visit on their network-connected or Internet-connected device).
  • Visual prompting and status indicators to assist during video-wall setup. As displays are linked into a video-wall it is helpful to the individual setting up the video-wall to receive visual feedback from the displays themselves as screens are added to or removed from the video-wall. In one embodiment of the invention, visual status indicators show progress as each display's position within the video-wall is successfully identified and the display is “linked into” the video-wall. For example, a line, pattern, color change, picture, or animated effect is used to differentiate monitors which have been added or positioned within the video-wall from those that haven't. A different status indicator such as an image, icon, or input prompt could be output to those displays which are being output to by the video-wall server but are still awaiting placement/assignment/relative positioning within the video-wall. In one embodiment, once an adjacency relationship is established between edges of displays within the video-wall, a status indicator shows that the edges of both displays have been successfully linked. In one embodiment, once the full video-wall has been set up, a visual success indicator spanning the full video-wall is shown.
  • In one embodiment of the invention, in addition to the image data, the digital camera device also provides meta-data about the image: camera orientation, detected ambient light, detected distance from the subject, focal length, shutter speed, flash settings, camera aperture, detected camera rotation angle relative to the horizon, and GPS location. This additional data can be used to increase the accuracy or speed of image analysis or provide additional details about the video wall.
  • In one embodiment of the invention, a smart phone or other mobile device with an embedded camera device is in communication wirelessly with a video wall control server (which is in turn in communication with the video wall displays). The video wall control server outputs one or more optimized configuration images to the video-wall displays. Application code executed on the mobile device (either by the browser or by a native mobile device application) captures image data from said camera (this could be a still image, a stream of video data, or a sequence of still images) and forwards this image data over a wireless connection to the server.
  • An image analysis module (which could be executed on the server or on the mobile device, or parts of the analysis could be performed by each) processes the captured image data, determining the identity of each display and placing it within the captured image, then subsequently assessing differences in display placement, rotation, color, brightness, contrast, and other attributes of the various displays present within the captured image data. Via these comparisons the automated image analysis module is able to determine any adjustments required to the mappings of various ones of the displays captured in the image and subsequently translate these adjustments into changes to the video wall configuration mapping file(s) or data stores. The updated mapping would then be communicated by the control module to the server; in response to these changes the server updates the test images or advances to the next test image, repeating any failed steps as necessary and moving to subsequent configuration tests as successful calibration of each unique configuration image is achieved.
  • In one embodiment the user is visiting a web-page with their mobile device (equipped with a camera), and the server is a web-server. That web-server is also in communication with (able to send controlling signals to) the displays comprising the video wall, and the displays are controlled by the web-server to output the configuration images.
  • With the above embodiments in mind, it should be understood that the embodiments might employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. Further, the manipulations performed are often referred to in terms, such as producing, identifying, determining, or comparing. Any of the operations described herein that form part of the embodiments are useful machine operations. The embodiments also relate to a device or an apparatus for performing these operations. The apparatus can be specially constructed for the required purpose, or the apparatus can be a general-purpose computer selectively activated or configured by a computer program stored in the computer. In particular, various general-purpose machines can be used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
  • The embodiments can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include hard drives, solid state drives (SSD), network attached storage (NAS), read-only memory, random-access memory, optical discs (CD/DVD/Blu-ray/HD-DVD), magnetic tapes, and other optical and non-optical data storage devices. The computer readable medium can also be distributed over a network-coupled computer system so that the computer readable code is stored and executed in a distributed fashion. Embodiments described herein may be practiced with various computer system configurations including hand-held devices, tablets, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like. The embodiments can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a wire-based or wireless network.
  • Although the method operations were described in a specific order, it should be understood that other operations may be performed in between described operations, described operations may be adjusted so that they occur at slightly different times or the described operations may be distributed in a system which allows the occurrence of the processing operations at various intervals associated with the processing.
  • While the system and method has been described in conjunction with several specific embodiments, it is evident to those skilled in the art that many further alternatives, modifications and variations will be apparent in light of the foregoing description. Thus, the embodiments described herein are intended to embrace all such alternatives, modifications, applications and variations as may fall within the spirit and scope of the appended claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments will now be described more fully with reference to the accompanying drawings in which:
  • FIG. 1 illustrates the basic problem to be solved.
  • FIG. 2 illustrates the need for precise mapping, placement and bezel correction of displays when creating a video wall.
  • FIG. 3 shows a schematic diagram depicting a 2×2 video wall.
  • FIG. 4 illustrates in detail camera use in combination with uniquely identifying output images.
  • FIG. 5 shows a process flow-chart of automated video-wall calibration.
  • FIG. 6 illustrates the display adjustment process.
  • FIG. 7 illustrates a specific embodiment of the whole process.
  • FIG. 8 is the flow diagram for an image analysis and detection module.
  • DETAILED DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates the basic problem. It shows a complex video wall layout (multiple displays arranged artistically at multiple angles). It also shows how the image output to each of these displays has been correctly aligned and correctly color calibrated so that the image displays correctly and evenly across all screens regardless of their placement, rotation, spacing, bezel, etc. The 9 displays comprising the video wall are showing a test pattern illustrating the precise placement of the artistic display orientations within a video-wall. Once the physical video-wall displays have been installed, the mapping and placement of these individual displays within the video wall canvas (10) on the video-wall server can be accomplished by multiple means; disclosed herein, however, is an automated method for accomplishing this process that is adaptable even to complex non-standard video wall layouts such as this.
  • FIG. 2 illustrates the need for precise mapping, placement and bezel correction of displays when creating a video wall. The figure again shows a 9-screen video wall, this time arranged in a 3×3 grid configuration. The output image shown on the two video walls depicted is again an image of diagonal straight lines which have been drawn to span the whole video-wall canvas. In both versions the pattern is interrupted by the bezel edges of the nine displays, which form pairs of horizontal and vertical interruption bands. However, in the upper illustration of FIG. 2 (where the display placement within the video wall canvas has not been corrected/adjusted to account for bezel interruptions) you can see that the lines on the video-wall canvas do not align correctly and hence do not appear as portions of a single straight line. This is emphasized by the overlaid dotted straight diagonal lines (21) and (22). Notice that the lines on the non-aligned (non-bezel-corrected) canvas do not exactly follow these ruled lines. The bottom half of the figure illustrates the view after effective bezel adjustments have been performed: the diagonals on the canvas line up accurately, and the fine lines added at (23) and (24) confirm this. In practice such lines cannot be generated by the video wall server, and evaluation of alignment must be observed by a viewer either by eye or by automated image capture (a method for which is disclosed herein).
  • FIG. 3 shows a schematic diagram depicting a 2×2 video wall comprising four independent displays working together to output a video wall that has been configured to correctly compensate for the display bezel. Also shown is a tablet computing device with a built in camera (31) which is being operated by a user (32) to capture an image of all four of the displays comprising the video wall (along with their output).
  • FIG. 4 illustrates in more detail the method of using a camera in combination with uniquely identifying output images, along with automated image analysis, to inform the server both of the identity of the individual displays in the video wall and of the precise relative position and size of the displays in the video-wall. In this example, the video-wall server has output a different QR code to each of the nine displays comprising the video wall, as seen in the top half of the figure (41). It also shows a blown-up, enlarged view of one single display with the QR code (45) with corner markers (46), which would be a potential cue used as part of the system to help spatially locate edges and corners correctly through the image capture and analysis process. Also depicted schematically is a smart-phone or tablet equipped with a built-in camera (42) being used to capture images of the real-world video-wall output and relay them back to the video-wall web-server, allowing the video-wall server, through automated image recognition and analysis of the captured rendition of the output images, to correctly align and place the individual sub-image segments (and displays) within the video-wall canvas to create a mapping for use when outputting to the video wall. In one embodiment different images are output at different stages in the automated calibration process. For example, at the next stage the server might output a line pattern for precise calibration and use the image capture of the line pattern across the displays to automatically make the fine adjustments needed to allow for bezel corrections. Finally, a color adjustment image (e.g., uniform colors across all displays) might be output across the video-wall, using feedback from image analysis of the camera-captured image to assess differences in color output across the various displays comprising the video wall. A second video wall is depicted in the lower half of FIG. 4, showing a non-rectangular video-wall (43), this time with a different type of identification and calibration image differentiating individual displays. Again the image of the video-wall is captured by a camera, in this case depicted as a mobile phone or tablet (44).
  • FIG. 5 shows a process flow-chart of automated video-wall calibration (50). Here (in step 51) the video-wall is configured to output unique images to each of the unassigned displays within the system (some of which may be arranged into a video-wall). The user is then prompted (either on the displays or from within an administrative GUI, potentially accessed from a mobile device such as a tablet, smart-phone, or laptop) to take a photo of the video-wall (step 52). In the case that the user is using a mobile device, the administrative web-application may request permission to directly access the mobile device's camera. The user then takes a photo of the video-wall and transfers this to the server for automated analysis (step 53). The server then (using automated image analysis of the specially output images) determines which display is arranged where within the video-wall, and also determines the exact spacing, rotation, and placement of the displays by recognizing the distinctive identifying images in the photo and by calculating the distances between each adjacent display edge within the photo (step 54). This information is then used to create a video-wall configuration file which is stored in a computer readable medium (step 55). A test pattern is then output to each of the displays comprising the video-wall (the displays that were contained within the photo taken by the administrator in step 52) (56), and it is checked for the accuracy of the completed step and reported either by an automated process or by a human observer (57). Subsequently further adjustments to this pattern are made if it is not satisfactory, and a sequence of new test patterns analyzed (58), continuing until the total result is optimal and the process ends (59).
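The spacing/rotation/placement determination in step 54 can be sketched as follows, assuming the analysis has already located the four corner markers of one display in the photo (the TL/TR/BR/BL corner ordering is an assumed convention of the identification image, and perspective distortion is ignored for simplicity):

```python
import math

# Sketch: recover a display's center, apparent size, and rotation from
# the four corner-marker positions detected in the captured photo.
def placement(corners):
    """corners: [(x, y), ...] in TL, TR, BR, BL order (assumed)."""
    (tlx, tly), (trx, try_), (brx, bry), (blx, bly) = corners
    cx = (tlx + trx + brx + blx) / 4
    cy = (tly + try_ + bry + bly) / 4
    width = math.hypot(trx - tlx, try_ - tly)   # length of the top edge
    height = math.hypot(blx - tlx, bly - tly)   # length of the left edge
    # Rotation of the top edge relative to the photo's horizontal axis.
    angle = math.degrees(math.atan2(try_ - tly, trx - tlx))
    return (cx, cy), (width, height), angle

# An upright 400x300 display whose top-left corner sits at (100, 50):
corners = [(100, 50), (500, 50), (500, 350), (100, 350)]
print(placement(corners))  # ((300.0, 200.0), (400.0, 300.0), 0.0)
```

Doing this for every detected display, then comparing adjacent edges, gives the spacing and relative placement values written into the configuration file in step 55.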
  • FIG. 6 illustrates the display adjustment process (60). The server outputs the appropriate (as needed at this stage in the process) unique identification images or color calibration image(s) to all the screens in the array (61). The user captures a live image of the screens (as requested by the server) (62). These are transferred to the server and analyzed (63). If both the user and server regard the results as satisfactory, this step in the adjustment process ends (66); otherwise the server adjusts the settings and/or relationships between displays in the array based on the results of automated image analysis (65) and outputs the appropriate images at (61) once more.
  • FIG. 7 is an illustration of one specific embodiment of the whole video-wall system process of set-up and use with a networked video wall using zero clients. The user, using a browser device with a camera, authenticates as a user and authorizes the web calibration process (70). The server first discovers and connects to the zero-client devices over the network, builds a list of displays and assigns a unique identity to each display (71). Next it collects the available display resolutions and other available settings from all the connected displays (72). Then the automated setup process is launched, beginning with a browser-accessible GUI containing instructions for the user being launched on the user's web-browser device with an embedded camera (73). Initially the web-server requests permission to access a camera (if permission is not granted the system will fall back to manual calibration methods). Once permission has been granted, the user will be provided with instructions and real-time feedback as needed throughout the whole process to assist them to correctly capture an image of the video wall (e.g., it may provide screen recognition features to ensure all screens have been captured from a usable angle). The process outputs a unique calibration image optimized for automated image recognition to each display (e.g., QR codes with corner markers for the initial identification image) (74). Image(s) are captured and sent to the web-server for automated image analysis, adjusting identification, positioning, and calibration settings as required (75). The camera is controlled (either by the user or the web-server) to capture one or more images of the video wall of sufficient resolution to perform the required analysis (e.g., via HTML media capture or a similar method). The images are transmitted to the web-server (wirelessly) for analysis by the image analysis module (see flow-chart in FIG. 8). Further image data from the camera may also be sent (such as orientation, ambient light, GPS, etc.). The settings file representing the positions and order of the display units is updated as communicated by the automated analysis; as needed throughout the whole calibration process this file is updated, and both the user instructions and individual displays are updated as changes occur to this file. The results of the calibration are checked (710) and evaluated as to whether the current step is satisfactory in all respects (if not, calibration continues at (74)). If it is satisfactory, the process proceeds to the next set of calibration images, for example line calibration images to fine-tune display placement, or color calibration. Once no further calibration steps remain, the canvas size and position of displays within the canvas are calculated and all sub-image mapping info is written to the settings file.
  • The preliminary steps in the setup of the video wall system are now complete and the system is ready to process and deliver content to the video wall displays. It can now receive content for display via the primary GPU, processing application output frame by frame to the frame buffer (78), and process (e.g., crop/split/rotate/resize/color-convert) individual sub-image portions based on stored identity, placement, and calibration settings (79), to be encoded and sent to the appropriate devices for output to the appropriate secondary display adapters, which in turn output the transformed image data to the corresponding displays (710), together displaying the video wall image across all displays. This decoding and displaying process continues while the video wall is in use (711), and ends when terminated (712).
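The crop/split step (79) amounts to computing, for each display, the rectangle of the source frame it should show. A minimal sketch, assuming mapping settings store each display's position and size in canvas coordinates (the names and data shapes are illustrative):

```python
# Illustrative sub-image computation for step (79): given stored placement
# settings (x, y, width, height within the canvas), return the crop
# rectangle of the source frame destined for each display, clamped to
# the canvas bounds.

def sub_image_rects(canvas_w, canvas_h, placements):
    """placements: {display_id: (x, y, w, h)} in canvas coordinates.
    Returns {display_id: (x0, y0, x1, y1)} crop rectangles."""
    rects = {}
    for did, (x, y, w, h) in placements.items():
        x0, y0 = max(0, x), max(0, y)
        x1, y1 = min(canvas_w, x + w), min(canvas_h, y + h)
        rects[did] = (x0, y0, x1, y1)
    return rects
```

For a 2x1 wall of 1080p panels the canvas would be 3840x1080, with the right display cropped from x = 1920 onward.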
  • FIG. 8 is the flow diagram for an image analysis and detection module. The process begins at (80) by receiving image(s) from the camera device along with the display information appropriate to the displays that formed the canvas for the camera (81). It analyzes the newly received image for recognized markers and matches them to existing data (82). It performs initial checks on the image and provides error messages to the user as required (e.g., does the number of displays in the captured image match the number of detected displays in communication with the web-server? Are the angle, clarity, and resolution of the image sufficient for automated detection routines?) (83). Next it matches the geometrical model of the video wall (either a pre-existing model built at an earlier stage in the automated setup process, or one built by combining the data output to the displays with the display positioning and sizing information obtained through image analysis) (84). Next the process isolates the captured display area of each detected display and analyzes the image data from each display (85); utilizing the spatial information from the captured image, it automatically determines the placement of each display (86). By comparing the captured display area of each of the plurality of displays for perceived differences in the unique configuration images (87), it determines the adjustments to the mapping settings as required for the identity, position, size and output characteristics of the plurality of displays visible within the captured image (88). It returns the adjusted mapping settings for the plurality of displays based on the required adjustments determined in the previous step (89). This ends the current step in image analysis (810) and allows a new and better canvas to be displayed using the modifications to the updated configuration file.
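The placement-determination steps (85)-(86) can be sketched as sorting detected display regions into grid positions by their centroids in the captured photo. The data shapes here (decoded identity plus centroid per detection, and a known column count) are assumptions for illustration only; real detections would come from marker/QR decoding.

```python
# Minimal sketch of steps (85)-(86): infer each display's (row, col)
# position in the video-wall grid from the centroid of its detected
# region in the captured image.

def place_displays(detections, cols):
    """detections: list of (display_id, (cx, cy)) from image analysis.
    cols: number of columns in the wall.
    Returns {display_id: (row, col)}."""
    # order top-to-bottom, then left-to-right
    ordered = sorted(detections, key=lambda d: (d[1][1], d[1][0]))
    placement = {}
    for r in range(0, len(ordered), cols):
        # re-sort each row strictly left-to-right to tolerate small y-jitter
        row = sorted(ordered[r:r + cols], key=lambda d: d[1][0])
        for c, (did, _) in enumerate(row):
            placement[did] = (r // cols, c)
    return placement
```

This slicing approach assumes the camera image is roughly fronto-parallel so rows separate cleanly in y; a skewed capture would first need the planar correction mentioned in claim 14.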

Claims (20)

What is claimed is:
1. A video wall system comprising: a plurality of displays; a control module in communication with each of said plurality of displays, configured to receive display information from, and provide output commands to, individual ones of the plurality of displays;
unique configuration images, designed to be interpretable via computerized image analysis of their captured output, to provide information on ones of identity, edges, corners, color characteristics, settings, size and placement of individual displays, said unique configuration images being output to individual ones of the plurality of displays, in response to commands from the control module;
a digital camera device in communication with the control module, being configured to capture and send for analysis digital camera images depicting ones of the plurality of displays including the unique configuration images output thereupon at the time of capture;
an automated image analysis module, in communication with the control module, for receiving and analyzing said digital camera images, said analyzing comprising:
isolating image data from the unique configuration images output thereupon;
pairing ones of the depicted displays in digital camera images to corresponding ones of the plurality of displays;
deriving individual display mapping data relative to ones of identity, position, placement, rotation, settings and color for ones of the displays within the video wall;
said mapping data being stored in computer readable memory and applied to facilitate uniformity of output of said plurality of displays.
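One possible in-memory shape for the per-display mapping data recited in claim 1 (identity, position, rotation, settings and color), with a helper applying the stored color correction. All field names and the gain-based correction are illustrative assumptions; the claim does not prescribe a representation.

```python
# Hypothetical per-display mapping record and color-uniformity helper.

from dataclasses import dataclass

@dataclass
class DisplayMapping:
    display_id: str
    x: int = 0                 # sub-image position within the canvas
    y: int = 0
    width: int = 1920
    height: int = 1080
    rotation: int = 0          # degrees; multiples of 90 for typical walls
    color_gain: tuple = (1.0, 1.0, 1.0)  # per-channel correction factors

def apply_color_gain(pixel, mapping):
    """Apply the stored per-display color correction to one RGB pixel."""
    return tuple(min(255, round(p * g)) for p, g in zip(pixel, mapping.color_gain))
```

Storing such records in a settings file is one way to realize the claim's "mapping data being stored in computer readable memory and applied to facilitate uniformity of output".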
2. The system of claim 1, further comprising a Graphical User Interface (GUI) module comprising a web-page, with which a user interacts, being rendered by a web-browser running on a web-browsing device comprising a digital camera, the web-browser, in communication with the control module, being configured to request the user to grant camera access and to capture digital images of the plurality of displays.
3. The system of claim 2, further comprising the automated method being used in conjunction with a GUI controlled by a user, certain ones of the setup and configuration information required being provided by the user, others being provided via the automated image analysis module.
4. The system of claim 3, further comprising the GUI being configured to display a graphical representation of the mapping comprising a plurality of blocks, each block representing, and corresponding to, one of the devices comprising the video wall, the user being able to manipulate elements of the display to further adjust the mapping data.
5. The system of claim 2, wherein the digital camera device is embedded within a smart-phone device and the GUI is provided by a native smart-phone application in communication with the control module over a wireless network connection.
6. The system of claim 2, wherein the user is interacting with the GUI via ones of:
a web browser;
a laptop;
a smartphone;
a tablet;
a personal computer;
a mobile device;
a touch-screen;
a mouse;
a keyboard;
an input device;
voice commands;
gesture input;
touch input.
7. The system of claim 1, wherein the control module further comprises a web-server running on an embedded PC housed within at least one of the plurality of displays.
8. The system of claim 1, further comprising the plurality of displays being updated to output, using the mapping data, at least one image spanning the plurality of displays.
9. The system of claim 1, being further configured to perform the outputting (of the unique configuration images), capturing (via a digital camera device), and analyzing (to derive mapping data) multiple times in sequence, each time utilizing the updated mapping data and each time further facilitating uniformity of output to the plurality of displays, the output of subsequent unique configuration images being controlled by the system.
10. The system of claim 1, wherein the updating of the mapping data based on digital image analysis performed by the automated image analysis module includes ones of:
adjusting the aspect ratio or size of the video-wall canvas to match one or more of the bounding edges of the total display canvas captured by the camera;
spatially positioning (shifting and rotating) of ones of the displays based on detected markers;
adjusting the relative size of each display based on detected locations of display corner markers;
modifying the positioning and scaling of the images in response to detected physical display sizes;
increasing or decreasing the relative brightness settings for image data sent to individual ones of the displays;
increasing or decreasing various color settings for image data sent to individual ones of the displays;
increasing or decreasing various color settings in communication with the display itself via a communications protocol;
detecting the size of the bezel for ones of the displays.
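The first adjustment in the list above, fitting the canvas aspect ratio to the bounding box of the captured wall, can be sketched in a few lines. The convention of holding the canvas width fixed is an assumption made for illustration:

```python
# Sketch of the canvas aspect-ratio adjustment of claim 10: resize the
# video-wall canvas so its aspect ratio matches the bounding box of the
# displays detected in the captured photo, keeping the width fixed.

def fit_canvas_to_bounds(canvas_w, canvas_h, bound_w, bound_h):
    """bound_w, bound_h: bounding box of the wall in the captured image.
    Returns the (width, height) of the adjusted canvas."""
    target_ratio = bound_w / bound_h
    return canvas_w, round(canvas_w / target_ratio)
```

For example, a 1920x1200 canvas fitted to a captured bounding box with a 16:9 ratio becomes 1920x1080.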
11. The system of claim 1, wherein the sending for analysis of the captured images comprises wireless transmission of image data from the digital camera device over a wireless communication network.
12. The system of claim 1, further comprising the digital camera device supplying additional meta-data about the captured image comprising ones of: camera orientation, detected ambient light, detected distance from the subject, focal length, shutter speed, flash settings, camera aperture, detected camera rotation angle relative to the horizon, and GPS location; these additional data being used to increase the accuracy or speed of image analysis or provide additional details about the video wall.
13. The system of claim 1, wherein visual elements, being specific unique identification symbols, are used in the configuration images to facilitate assessing ones of the identity, relative position, rotation and color of the displays, these visual elements being ones of:
embedded QR codes;
specific corner markers to facilitate spatial location of corners of the display;
specific edge markers;
linear patterns across the canvas as a way of assessing continuity across bezel edges between different displays;
individual pixels illuminated at the edge of each display to ensure they are visible within the canvas, providing an edge-check method;
specific color(s) as a means of assessing color uniformity between multiple ones of the displays;
a QR code embedded within the image indicating display identity;
lines proximal to display edges indicating display edges;
markers proximal to display corners indicating display corners;
solid blocks of color depicting color characteristics;
settings;
corner and edge markers depicting relative display size;
a sequence of lines spanning the multiple displays within the video wall canvas facilitating precise positioning of displays;
a uniform color across all displays.
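A toy generator for the kind of configuration image this claim describes: a per-display pattern combining corner markers (for spatial location) with a binary identity code. A real implementation would emit QR codes; this ASCII-grid stand-in, with invented layout conventions, only illustrates how identity and corner cues can coexist in one image.

```python
# Illustrative configuration-image generator: 'C' marks the four corner
# markers, and a 4-bit binary identity code is written along the middle
# row. display_id is assumed to be a small integer.

def config_pattern(display_id, size=8):
    """Return a size x size grid of single-character cells."""
    grid = [["." for _ in range(size)] for _ in range(size)]
    for r, c in [(0, 0), (0, size - 1), (size - 1, 0), (size - 1, size - 1)]:
        grid[r][c] = "C"                  # corner markers for spatial location
    bits = format(display_id, "04b")      # 4-bit display identity
    for i, b in enumerate(bits):
        grid[size // 2][2 + i] = b        # identity code in the middle row
    return grid
```

The analysis side would locate the four "C" cells to establish the display's extent, then read the center row to recover its identity.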
14. The system of claim 1, wherein the image analysis software applies a planar spatial correction based on the position and angle of the camera.
15. The system of claim 1, wherein the display information received from the display via the control module includes display sizing and resolution information, the automated image analysis module using this sizing and resolution information to assist in pairing ones of the depicted displays.
16. The system of claim 1, wherein several images are used in rotation to precisely determine alignment, the images comprising:
at least one identification image to determine the identity of each display;
at least one corner-coordinates image to determine the spacing, rotation and placement of displays;
at least one color calibration image to match and calibrate color amongst multiple displays.
17. The system of claim 1, further comprising error checks being performed on captured image data either prior to or after sending for analysis, wherein checks are performed on the captured image, feedback is provided to the camera operator, and error messages are generated for output to the user, said detected error conditions comprising ones of:
the detected number of displays in the captured image not matching the detected number of displays in communication with the server;
the incidence-angle of the captured image deviating too far from the recommended 90-degree angle;
the clarity, contrast, and resolution of the captured image being sub-optimal for automated detection routines;
the captured image being taken too far from or too close to the video wall;
light or flash reflections being too strong for image detection.
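The checks listed in claim 17 map naturally onto a validation routine that returns user-facing error messages. The thresholds and parameter names below are invented for illustration; the claim names the conditions but not specific values.

```python
# Hypothetical pre-analysis capture validation, per claim 17.

def check_capture(detected_displays, expected_displays,
                  incidence_deg, resolution,
                  min_res=(640, 480), max_angle_dev=30):
    """Return a list of human-readable error messages
    (empty if the capture passes all checks)."""
    errors = []
    if detected_displays != expected_displays:
        errors.append("display count mismatch: saw %d, expected %d"
                      % (detected_displays, expected_displays))
    if abs(90 - incidence_deg) > max_angle_dev:
        errors.append("camera angle deviates too far from 90 degrees")
    if resolution[0] < min_res[0] or resolution[1] < min_res[1]:
        errors.append("image resolution too low for automated detection")
    return errors
```

An empty list lets analysis proceed; otherwise the messages would be surfaced through the GUI so the operator can retake the photo.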
18. A computer implemented method of adjusting, within a video-wall canvas, ones of identity, placement, color characteristics and configuration of individual ones of a plurality of displays by a control module in communication with each of said plurality of displays, the control module also being in communication with an image analysis module, the image analysis module also being in communication with a digital camera device, in order to facilitate the operation of said plurality of displays as a video-wall, the method comprising:
detecting the plurality of displays;
retrieving information from said displays;
generating unique configuration images, the configuration images having been designed to communicate, via computerized image analysis, ones of the corresponding display's identity, edges and corners, placement within the canvas and color calibration;
creating a test canvas based on said configuration images for outputting said unique configuration images to individual ones of the plurality of displays;
outputting said test canvas to the displays;
capturing, via the digital camera device, digital images of the plurality of displays including the unique configuration images output thereupon;
retrieving by the image analysis module over a network said digital images for analysis;
analyzing, by the image analysis module, the received digital images;
pairing ones of the depicted displays in digital camera images to corresponding ones of the plurality of displays;
deriving individual display mapping data relative to ones of identity, position, placement, rotation, settings and color for ones of the displays within the video wall;
adjusting in response to the analyzing said identification, placement, and configuration for individual ones of the plurality of displays;
storing said settings in computer readable memory;
applying the updated settings to facilitate uniformity of output through an updated canvas.
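The method steps of claim 18 can be sketched as a single pipeline in which every helper is a stand-in injected by the caller. The interfaces below are assumptions for illustration; the claim does not prescribe them.

```python
# Illustrative orchestration of the claim-18 method steps.

def setup_video_wall(detect, retrieve, generate, output, capture,
                     analyze, store):
    displays = detect()                        # detecting the plurality of displays
    info = {d: retrieve(d) for d in displays}  # retrieving display information
    canvas = generate(displays, info)          # test canvas of unique config images
    output(canvas)                             # outputting the canvas to the displays
    photo = capture()                          # capturing via the digital camera
    mapping = analyze(photo, canvas)           # pairing + deriving mapping data
    store(mapping)                             # storing settings for later application
    return mapping
```

Each injected callable corresponds to one recited step, so the pipeline can be exercised end-to-end with trivial fakes before wiring in real display discovery and image analysis.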
19. The method of claim 18, further comprising a Graphical User Interface (GUI) comprising a web-page, with which a user interacts, being rendered by a web-browser running on a web-browsing device comprising a digital camera, the web-browser, in communication with the control module, being configured to request the user to grant camera access and to capture digital images of the plurality of displays.
20. A computer-readable medium storing one or more computer readable instructions configured to cause one or more processors to:
display, via a control module in communication with each of a plurality of
displays, unique configuration images, said images designed to be interpretable via computerized image analysis of their captured output, to provide information on ones of identity, edges, corners, color characteristics, settings, size and placement of individual displays in the form of a test canvas;
receive, via the control module, display information from individual ones of the plurality of displays;
receive, via the control module, images from a digital camera device configured to capture, and send for analysis, digital images depicting ones of the plurality of displays including the unique configuration images output thereupon at the time of capture;
deliver, via the control module, both said digital images and said test canvas as displayed to an automated image analysis module for analysis of the digital camera images;
analyze the images in the automated image analysis module, said analyzing comprising:
isolating image data from the unique configuration images output thereupon;
pairing ones of the depicted displays in digital camera images to corresponding ones of the plurality of displays;
deriving individual display mapping data relative to ones of identity, position, placement, rotation, settings and color for ones of the displays within the physical arrangement;
retrieve, via the control module, the individual display mapping data; and
write, via the control module, a configuration file to permanent storage.
US14/595,203 2014-01-11 2015-01-12 System and Method of Video Wall Setup and Adjustment Using Automated Image Analysis Abandoned US20150279037A1 (en)


Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201461926295P 2014-01-11 2014-01-11
US14/595,203 US20150279037A1 (en) 2014-01-11 2015-01-12 System and Method of Video Wall Setup and Adjustment Using Automated Image Analysis

Publications (1)

Publication Number Publication Date
US20150279037A1 true US20150279037A1 (en) 2015-10-01

Family

ID=54191113

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/595,176 Abandoned US20150286456A1 (en) 2014-01-11 2015-01-12 Method and System of Video Wall Setup and Adjustment Using GUI and Display Images
US14/594,590 Active US9911176B2 (en) 2014-01-11 2015-01-12 System and method of processing images into sub-image portions for output to a plurality of displays such as a network video wall
US14/595,203 Abandoned US20150279037A1 (en) 2014-01-11 2015-01-12 System and Method of Video Wall Setup and Adjustment Using Automated Image Analysis

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US14/595,176 Abandoned US20150286456A1 (en) 2014-01-11 2015-01-12 Method and System of Video Wall Setup and Adjustment Using GUI and Display Images
US14/594,590 Active US9911176B2 (en) 2014-01-11 2015-01-12 System and method of processing images into sub-image portions for output to a plurality of displays such as a network video wall

Country Status (1)

Country Link
US (3) US20150286456A1 (en)

TW200948088A (en) 2008-02-27 2009-11-16 Ncomputing Inc System and method for virtual 3D graphics acceleration and streaming multiple different video streams
US8395631B1 (en) * 2009-04-30 2013-03-12 Nvidia Corporation Method and system for sharing memory between multiple graphics processing units in a computer system
US20100328447A1 (en) * 2009-06-26 2010-12-30 Sony Computer Entertainment, Inc. Configuration of display and audio parameters for computer graphics rendering system having multiple displays
US8954872B2 (en) * 2009-08-24 2015-02-10 Ati Technologies Ulc Method and apparatus for configuring a plurality of displays into a single large surface display
US20110267328A1 (en) * 2010-04-28 2011-11-03 Narayanan Venkatasubramanian Failsafe interconnect for tiled wall display
CN102036043A (en) 2010-12-15 2011-04-27 成都市华为赛门铁克科技有限公司 Video data processing method and device as well as video monitoring system
US8531474B2 (en) * 2011-11-11 2013-09-10 Sharp Laboratories Of America, Inc. Methods, systems and apparatus for jointly calibrating multiple displays in a display ensemble
TWI493438B (en) * 2012-01-09 2015-07-21 Amtran Technology Co Ltd Touch control method
CN102902502B (en) 2012-09-28 2015-06-17 威盛电子股份有限公司 Display system and display method suitable for display wall
US9035969B2 (en) 2012-11-29 2015-05-19 Seiko Epson Corporation Method for multiple projector display using a GPU frame buffer
KR20140070120A (en) * 2012-11-30 2014-06-10 삼성전자주식회사 Display device color-calibration apparatus and method thereof
US10043234B2 (en) 2012-12-31 2018-08-07 Nvidia Corporation System and method for frame buffer decompression and/or compression
US20140184614A1 (en) 2013-01-03 2014-07-03 Ralink Technology Corp. Method and Apparatus for Image Capture in Transmitter of Wireless Communications System
US20150286456A1 (en) * 2014-01-11 2015-10-08 Userful Corporation Method and System of Video Wall Setup and Adjustment Using GUI and Display Images

Cited By (96)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150049117A1 (en) * 2012-02-16 2015-02-19 Seiko Epson Corporation Projector and method of controlling projector
US20160162240A1 (en) * 2013-07-29 2016-06-09 Samsung Electronics Co., Ltd. Method and apparatus for constructing multi-screen display
US9911176B2 (en) * 2014-01-11 2018-03-06 Userful Corporation System and method of processing images into sub-image portions for output to a plurality of displays such as a network video wall
US20160165226A1 (en) * 2014-12-04 2016-06-09 Spirent Communications, Inc. Video streaming and video telephony downlink performance analysis system
US9591300B2 (en) * 2014-12-04 2017-03-07 Spirent Communications, Inc. Video streaming and video telephony downlink performance analysis system
US20160165229A1 (en) * 2014-12-05 2016-06-09 Aten International Co., Ltd. Calibration system and method for multi-display system
US11861539B2 (en) 2015-01-23 2024-01-02 Sprinklr, Inc. Multi-dimensional command center
US11062251B2 (en) 2015-01-23 2021-07-13 Sprinklr, Inc. Multi-dimensional command center
US10042419B2 (en) * 2015-01-29 2018-08-07 Electronics And Telecommunications Research Institute Method and apparatus for providing additional information of digital signage content on a mobile terminal using a server
US20160224300A1 (en) * 2015-01-29 2016-08-04 Electronics And Telecommunications Research Institute Method of providing additional information of content, mobile terminal and content control server
US20160277244A1 (en) * 2015-03-18 2016-09-22 ThePlatform, LLC. Methods And Systems For Content Presentation Optimization
US20170195611A1 (en) * 2016-01-05 2017-07-06 Samsung Electronics Co., Ltd. Display system, display apparatus, and controlling method thereof
US10432886B2 (en) * 2016-01-05 2019-10-01 Samsung Electronics Co., Ltd. Display system, display apparatus, and controlling method thereof
CN107534788A (en) * 2016-01-05 2018-01-02 三星电子株式会社 Display system, display device and control method thereof
US10778927B2 (en) * 2016-01-05 2020-09-15 Samsung Electronics Co., Ltd. Display system, display apparatus, and controlling method thereof
US10203928B2 (en) * 2016-01-08 2019-02-12 Boe Technology Group Co., Ltd. Display device, method and device for adjusting information channels thereof
US20180113664A1 (en) * 2016-01-08 2018-04-26 Boe Technology Group Co., Ltd. Display device, method and device for adjusting information channels thereof
TWI588738B (en) * 2016-02-05 2017-06-21 韌硬軟機電股份有限公司 Display system for an array of video displays
US10489099B2 (en) * 2016-03-04 2019-11-26 Boe Technology Group Co., Ltd. Spliced panel and method and device for automatically allocating content to be display on spliced panel
US10120635B2 (en) 2016-03-09 2018-11-06 Samsung Electronics Co., Ltd. Configuration and operation of display devices including device management
US11853635B2 (en) 2016-03-09 2023-12-26 Samsung Electronics Co., Ltd. Configuration and operation of display devices including content curation
KR102327207B1 (en) * 2016-03-09 2021-11-17 삼성전자주식회사 Configuration and operation of display devices including device management
WO2017155237A1 (en) * 2016-03-09 2017-09-14 Samsung Electronics Co., Ltd. Configuration and operation of display devices including device management
KR102346632B1 (en) * 2016-03-09 2022-01-03 삼성전자주식회사 Configuration and operation of display devices including content curation
WO2017155236A1 (en) * 2016-03-09 2017-09-14 Samsung Electronics Co., Ltd. Configuration and operation of display devices including content curation
EP3424225A4 (en) * 2016-03-09 2019-03-20 Samsung Electronics Co., Ltd. Configuration and operation of display devices including device management
KR20170105445A (en) * 2016-03-09 2017-09-19 삼성전자주식회사 Configuration and operation of display devices including device management
KR20170105444A (en) * 2016-03-09 2017-09-19 삼성전자주식회사 Configuration and operation of display devices including content curation
EP3403414A4 (en) * 2016-03-09 2019-01-16 Samsung Electronics Co., Ltd. Configuration and operation of display devices including content curation
US20170295349A1 (en) * 2016-04-11 2017-10-12 Seiko Epson Corporation Projection apparatus, projector, and projector controlling method
US10080002B2 (en) * 2016-04-11 2018-09-18 Seiko Epson Corporation Projection apparatus, projector, and projector controlling method
US20170337028A1 (en) * 2016-05-17 2017-11-23 Qisda Corporation Method and system for modular display frame
US12061742B2 (en) * 2016-06-28 2024-08-13 Nikon Corporation Display device and control device
US11347465B2 (en) * 2016-07-21 2022-05-31 Disney Enterprises, Inc. Display device array
US20180024800A1 (en) * 2016-07-21 2018-01-25 Disney Enterprises, Inc. Display device array
US10353660B2 (en) * 2016-09-15 2019-07-16 Panasonic Intellectual Property Management Co., Ltd. Image display system
US20180107358A1 (en) * 2016-10-17 2018-04-19 International Business Machines Corporation Multiple-display unification system and method
US20190244561A1 (en) * 2016-11-17 2019-08-08 Xi'an Novastar Tech Co., Ltd. Pixel-by-pixel calibration method
US10726776B2 (en) * 2016-11-17 2020-07-28 Xi'an Novastar Tech Co., Ltd. Pixel-by-pixel calibration method
KR102665125B1 (en) 2016-12-21 2024-05-13 삼성전자주식회사 Display apparatus, electronic apparatus, display system comprising display apparatus and electronic apparatus, and control methods thereof
KR20180072337A (en) * 2016-12-21 2018-06-29 삼성전자주식회사 Display apparatus, electronic apparatus, display system comprising display apparatus and electronic apparatus, and control methods thereof
KR102681849B1 (en) * 2016-12-23 2024-07-04 삼성전자주식회사 Electronic device and Method for controlling the electronic device thereof
KR20180074405A (en) * 2016-12-23 2018-07-03 삼성전자주식회사 Electronic device and Method for controlling the electronic device thereof
US11494783B2 (en) * 2017-01-18 2022-11-08 International Business Machines Corporation Display and shelf space audit system
US20180204219A1 (en) * 2017-01-18 2018-07-19 International Business Machines Corporation Display and shelf space audit system
US10942697B2 (en) * 2017-03-07 2021-03-09 Sprinklr, Inc. System for discovering configuration of display wall
US20180260185A1 (en) * 2017-03-07 2018-09-13 Sprinklr, Inc. System for discovering configuration of display wall
US10445047B2 (en) * 2017-04-19 2019-10-15 International Business Machines Corporation Automatic real-time configuration of a multi-head display system
US20180307455A1 (en) * 2017-04-19 2018-10-25 International Business Machines Corporation Automatic real-time configuration of a multi-head display system
US10365876B2 (en) * 2017-04-19 2019-07-30 International Business Machines Corporation Automatic real-time configuration of a multi-head display system
US10394512B2 (en) * 2017-07-06 2019-08-27 Amzetta Technologies, Llc Multi-monitor alignment on a thin client
EP3659026B1 (en) * 2017-07-26 2025-03-19 Barco N.V. Calibration method and system for tiled displays
US11074028B2 (en) * 2017-07-26 2021-07-27 Barco N.V. Calibration method and system for tiled displays
US11301124B2 (en) 2017-08-18 2022-04-12 Microsoft Technology Licensing, Llc User interface modification using preview panel
US11237699B2 (en) 2017-08-18 2022-02-01 Microsoft Technology Licensing, Llc Proximal menu generation
US10417991B2 (en) * 2017-08-18 2019-09-17 Microsoft Technology Licensing, Llc Multi-display device user interface modification
US11163516B2 (en) * 2017-11-09 2021-11-02 Samsung Electronics Co., Ltd. Electronic apparatus, display apparatus, and multivision setting method
US20190278551A1 (en) * 2018-03-06 2019-09-12 Silicon Video Systems, Inc. Variable layout module
US20190369941A1 (en) * 2018-06-05 2019-12-05 Videri Inc. Systems and methods for mapping an orchestrated digital display system
US11467795B2 (en) * 2018-06-05 2022-10-11 Videri Inc. Systems and methods for mapping an orchestrated digital display system
US11244363B1 (en) 2018-10-25 2022-02-08 Sprinklr, Inc. Rating and review integration system
EP3830682A4 (en) * 2018-10-26 2021-11-17 Samsung Electronics Co., Ltd. Electronic device and control method thereof
WO2020085666A1 (en) 2018-10-26 2020-04-30 Samsung Electronics Co., Ltd. Electronic device and control method thereof
US11393411B2 (en) * 2018-12-07 2022-07-19 Sharp Nec Display Solutions, Ltd. Multi-display system and method for adjusting multi-display system
CN111385521A (en) * 2018-12-27 2020-07-07 浙江宇视科技有限公司 A method and decoding device for distributed display of user interface
WO2020159185A1 (en) * 2019-01-31 2020-08-06 Samsung Electronics Co., Ltd. Electronic device and control method thereof
US11150858B2 (en) 2019-01-31 2021-10-19 Samsung Electronics Co., Ltd. Electronic devices sharing image quality information and control method thereof
CN111694528A (en) * 2019-03-12 2020-09-22 纬创资通股份有限公司 Method for identifying typesetting of display wall and electronic device using same
US12192591B2 (en) 2019-04-08 2025-01-07 Spirent Communications, Inc. Training an encrypted video stream network scoring system with non-reference video scores
US11405695B2 (en) 2019-04-08 2022-08-02 Spirent Communications, Inc. Training an encrypted video stream network scoring system with non-reference video scores
US12335579B2 (en) 2019-04-08 2025-06-17 Spirent Communications, Inc. Cloud gaming benchmark testing
WO2020232537A1 (en) * 2019-05-17 2020-11-26 Fenoto Technologies Inc. Electronic paper display system
US11386178B2 (en) 2019-06-20 2022-07-12 Sprinklr, Inc. Enhanced notification system for real time control center
CN112433688A (en) * 2019-08-26 2021-03-02 杭州海康威视数字技术股份有限公司 Image display method and device and spliced screen
WO2021054511A1 (en) * 2019-09-17 2021-03-25 삼성전자주식회사 Electronic device and control method therefor
US11397923B1 (en) 2019-10-07 2022-07-26 Sprinklr, Inc. Dynamically adaptive organization mapping system
US12106697B2 (en) 2020-02-27 2024-10-01 Atmoph Inc. Image display device, system and method
CN115039412A (en) * 2020-02-27 2022-09-09 爱幕赴株式会社 Image display apparatus, system and method
RU2748176C1 (en) * 2020-05-18 2021-05-20 Общество с ограниченной ответственностью "Ай Ти Ви групп" System and method of adjustment of data display on video wall
CN111694534A (en) * 2020-06-15 2020-09-22 京东方科技集团股份有限公司 Method and device for acquiring display unit information of spliced screen
WO2021254324A1 (en) * 2020-06-15 2021-12-23 京东方科技集团股份有限公司 Method and apparatus for acquiring display element information of tiled screen
US11620965B2 (en) * 2020-08-19 2023-04-04 Beijing Institute Of Technology Video display method, video display system, electronic device, and storage medium
US20220059053A1 (en) * 2020-08-19 2022-02-24 Beijing Institute Of Technology Video display method, video display system, electronic device, and storage medium
TWI740623B (en) * 2020-08-26 2021-09-21 財團法人資訊工業策進會 Apparatus, method, and computer program product thereof for integrating videos
US20240036799A1 (en) * 2020-12-15 2024-02-01 Mankind Nv Method for displaying video images on screens and translation and rotation of the screens in a two-dimensional plane and its use
RU205445U1 (en) * 2021-03-18 2021-07-14 Общество с ограниченной ответственностью «Научно-Технический Центр ПРОТЕЙ» (ООО «НТЦ ПРОТЕЙ») Distributed Controller Video Wall
US11694604B2 (en) * 2021-04-23 2023-07-04 Netflix, Inc. Adjustable light-emissive elements in a display wall
US20220343834A1 (en) * 2021-04-23 2022-10-27 Netflix, Inc. Adjustable light-emissive elements in a display wall
US11696653B2 (en) 2021-09-03 2023-07-11 Renande Alteon Crib
US12446710B2 (en) 2021-09-03 2025-10-21 Renande Alteon Crib
US20220012000A1 (en) * 2021-09-24 2022-01-13 Intel Corporation Visually assisted multi-display configuration
WO2023048860A1 (en) * 2021-09-24 2023-03-30 Intel Corporation Visually assisted multi-display configuration
US12026418B1 (en) * 2023-01-15 2024-07-02 Angelina Yejin Kim Collective display and intelligent layout system and associated processes to automatically update and collectively synchronize multiple device screens as a single collective graphical output image
US11907462B1 (en) 2023-02-10 2024-02-20 Toshiba Global Commerce Solutions, Inc. Screen mapping and spatial arrangement system
EP4451250A1 (en) * 2023-04-21 2024-10-23 X-Rite, Inc. System of calibrating a display
US12431108B1 (en) 2024-03-26 2025-09-30 Toshiba Global Commerce Solutions, Inc. Display mapping methodology

Also Published As

Publication number Publication date
US9911176B2 (en) 2018-03-06
US20150286456A1 (en) 2015-10-08
US20160203579A1 (en) 2016-07-14

Similar Documents

Publication Publication Date Title
US20150279037A1 (en) System and Method of Video Wall Setup and Adjustment Using Automated Image Analysis
US8872924B1 (en) Imaging based auto display grid configuration system and method
US10706532B1 (en) Digital projection system for workpiece assembly and associated method
JP6911149B2 (en) Screen state auto-detection robots, methods and computer readable storage media
JP6524619B2 (en) Locus drawing apparatus, locus drawing method, locus drawing system, and program
AU2013246454B2 (en) Identifying and configuring controls on a control panel
US9906762B2 (en) Communication apparatus, method of controlling communication apparatus, non-transitory computer-readable storage medium
US10417742B2 (en) System and apparatus for editing preview images
EP2843625A1 (en) Method for synthesizing images and electronic device thereof
JP6423650B2 (en) Projector clustering method, management apparatus and management system using the same
CN113474750A (en) Screen display control method, device and system
CN105975236B (en) Automatic positioning method, automatic positioning display system and display device
CN110099224B (en) Pre-monitoring display method, device and system, computer equipment and storage medium
US20180033361A1 (en) Method and system for calibrating a display screen
CN104057455A (en) Robot system
US20120013523A1 (en) Sensor Driven Automatic Display Configuration System And Method
CN105959824B (en) Method and device for debugging reproduction rate of display equipment
TWI583967B (en) Detection system and method of stereoscopic display device.
CN113034585B (en) Offset state test method, test equipment and storage medium
US9881192B2 (en) Systems and methods for electronically pairing devices
CN105979152A (en) Smart shooting system
US10609305B2 (en) Electronic apparatus and operating method thereof
EP3525566A1 (en) Substrate inspection device and substrate distortion compensating method using same
CN115543247A (en) Screen configuration method and device, electronic equipment and readable storage medium
CN109816649A (en) A kind of method for testing software and software testing device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION