
WO2017131723A1 - Generating a test case for a recorded stream of events - Google Patents


Info

Publication number
WO2017131723A1
Authority
WO
WIPO (PCT)
Prior art keywords
events, stream, sequence, time, recorded
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2016/015512
Other languages
French (fr)
Inventor
Inbar Shani
Olga Kogan
Ilan Shufer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hewlett Packard Enterprise Development LP
Original Assignee
Hewlett Packard Enterprise Development LP
Application filed by Hewlett Packard Enterprise Development LP
Priority to PCT/US2016/015512
Publication of WO2017131723A1
Anticipated expiration
Current status: Ceased

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213: Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/63: Generating or modifying game content before or while executing the game program by the player, e.g. authoring using a level editor
    • A63F13/65: Generating or modifying game content before or while executing the game program automatically by game devices or servers from real world data, e.g. measurement in live racing competition


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

Example implementations relate to generating a test case for a recorded stream of events. For example, a system for generating a test case for a recorded stream of events may include a camera array including a plurality of multidimensional sensors to record a stream of events, wherein the stream of events occurs within an interactive computing environment. The system may further include an instruction recorder to record a sequence of logical events associated with the stream of events, a video analyzer to analyze the recorded stream of events and produce a time-based transcript of the stream of events, and a controller to correlate inputs from each of the camera array, the instruction recorder, and the video analyzer. The system, using the controller, may generate a test case for verifying the recorded stream of events based on the inputs.

Description

GENERATING A TEST CASE FOR A RECORDED STREAM OF EVENTS
Background
[0001] Mobile applications introduced new concepts of Human-Computer Interaction (HCI), such as touch screens, gestures, voice recognition, and face scans. It is possible that these directions will continue evolving, especially in new types of applications that are embedded in specific types of hardware. For example, gaming consoles may use body gestures and a dedicated remote controller to interact with the software. Future applications may use human movement as well as specific hardware elements, combining to form a very complex HCI.
Brief Description of the Drawings
[0002] Figure 1 is a block diagram of an example system for generating a test case for a recorded stream of events, consistent with the present disclosure.
[0003] Figure 2 is a block diagram of an example system for generating a test case for a recorded stream of events, consistent with the present disclosure.
[0004] Figure 3 is a block diagram of an example system for generating a test case for a recorded stream of events, consistent with the present disclosure.
[0005] Figure 4 is a block diagram of an example method for generating a test case for a recorded stream of events, consistent with the present disclosure.

Detailed Description
[0006] Embedded software may be tested with simulated hardware and scripted Input/Output (I/O) events. However, as interactions between a human and a computing device become more complex, the I/O event sequences describing those interactions may become too complex to simulate. Additionally, simulated I/O event sequences may not accurately describe the interaction from the perspective of the human interacting with the computing device. As a result, manual testing, input, and replication may be used to fully capture a Human-Computer Interaction (HCI).
[0007] User perspective testing on other embedded systems may be done manually or through hardware simulation, with scripts created from requirements. Manual testing may include creation of a video recording of the test, which may be viewed at a later time. These video recordings may be viewed later to assist in reproducing the test; however, they are often not searchable at a later date unless the video has been manually tagged. For example, testing of a system displaying a virtual tennis game may include adding "low backhand" as a tag to a video recording of the simulated tennis game.
[0008] In contrast, generating a test case for a recorded stream of events consistent with the present disclosure allows for generation of a test case describing an HCI sequence based on a recording of the interaction. Generating a test case for a recorded stream of events consistent with the present disclosure may include the use of three-dimensional (3D) sensing, video analysis, physical object recognition, and/or a built-in physical objects dictionary to 'record' a user interacting in an HCI, produce a script of the interaction, referred to herein as a "test case", and produce a coverage report based on one or more such recordings. The test case may be used as an asset to store and manage in a Quality Management (QM) system, and to reproduce the test as needed by others, such as a developer trying to fix a problem discovered by the test. The coverage report may be used to determine how much of the application has been tested, or to suggest possible HCI combinations that should be tested. Furthermore, as a video recording may be analyzed for particular movements, objects, and physical elements within a test environment, a test case generated in response to the video recording may be searchable without requiring comprehensive manual tagging and input.
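As a concrete illustration, the following is a minimal Python sketch of one possible shape for such a searchable test case asset; the class names, fields, and search method (TestCase, TranscriptEntry, video_uri) are illustrative assumptions, not structures defined by the disclosure:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TranscriptEntry:
    timestamp: float   # seconds from the start of the recording
    description: str   # e.g. "player performs low backhand"
    objects: List[str] = field(default_factory=list)  # recognized physical objects

@dataclass
class TestCase:
    name: str
    video_uri: str                     # the recorded stream of events
    logical_events: List[str]          # the recorded instruction sequence
    transcript: List[TranscriptEntry]  # the time-based transcript

    def search(self, term: str) -> List[TranscriptEntry]:
        """Search the transcript without manual video tagging."""
        return [e for e in self.transcript
                if term.lower() in e.description.lower()]
```

Because the transcript is produced by video analysis rather than manual tagging, a query such as search("backhand") can locate the relevant moments in the recording directly.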
[0009] Figure 1 is a block diagram of an example system 100 for generating a test case for a recorded stream of events, consistent with the present disclosure. System 100 may include a number of components, as illustrated in Figure 1.
System 100 may include a camera array 102. As used herein, a camera array refers to a plurality of multidimensional sensors that are capable of recording a stream of events. In some examples, the camera array 102 may be comprised of 3D sensors, although examples are not so limited and the camera array 102 may use other types of sensors. In some examples, camera array 102 may include moveable sensors. Put another way, camera array 102 may include sensors whose position may be altered consistent with the requirements of system 100. For example, in a system displaying a virtual tennis game, sensors in camera array 102 may be positioned to capture a front view of the player. As illustrated in Figure 1, camera array 102 may be coupled to a controller 110. As used herein, a controller refers to a hardware component that may facilitate interaction between camera array 102 and other components of system 100.
[0010] Further, as illustrated in Figure 1, the system 100 may include an instruction recorder 104. As used herein, an instruction recorder refers to a hardware component which records a sequence of logical events associated with the stream of events. As used herein, a logical event refers to a plurality of ordered instructions that collectively define a particular computing response. For example, if the stream of events includes "hitting" a virtual tennis ball, the associated logical event would include an ordered list of instructions executable by a processor to display a tennis ball being hit in response to a specified set of conditions. The instruction recorder 104 may be coupled to the camera array 102 via a wireless or a wired connection. Instruction recorder 104 may further execute instructions associated with the stream of events. For example, once the stream of events has recorded that a virtual tennis ball has been hit, instruction recorder 104 may execute instructions directing that the ball be returned to the player. In such cases, controller 110 may direct instruction recorder 104 to execute the instructions at a pre-determined time or in response to a pre-determined event.
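In code, a recorded logical event might be modeled as an ordered instruction list with a timestamp, per the definition above. This is a hypothetical sketch; the names (LogicalEvent, InstructionRecorder, replay) are assumptions:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class LogicalEvent:
    timestamp: float         # when the event was recorded
    name: str                # e.g. "return_ball"
    instructions: List[str]  # ordered steps defining the computing response

class InstructionRecorder:
    """Records a sequence of logical events and can replay them on demand."""
    def __init__(self) -> None:
        self.sequence: List[LogicalEvent] = []

    def record(self, event: LogicalEvent) -> None:
        self.sequence.append(event)

    def replay(self, event: LogicalEvent, run: Callable[[str], None]) -> None:
        # Execute the ordered instructions, e.g. to return the ball to the player
        for step in event.instructions:
            run(step)
```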
[0011] In some examples, the system 100 may include a video analyzer 106. As used herein, a video analyzer refers to a hardware component that analyzes a video recorded stream of events. For instance, video analyzer 106 may analyze a stream of events recorded by camera array 102. In some examples, video analyzer 106 may analyze the stream of events to search for physical objects and user movement. For example, in a simulated game of tennis, video analyzer 106 may search for strokes used by a player, such as a forehand, backhand, or serve. Video analyzer 106 may further generate a transcript of the analyzed stream of events. As used herein, a transcript refers to a written account of a series of events. The transcript generated by video analyzer 106 may be time-based. Put another way, video analyzer 106 may construct a descriptive transcript of the recorded stream of events where the description includes the times at which individual events occurred and/or the intervals between events. For example, in a simulated tennis game, video analyzer 106 may generate a time-based transcript of the match, noting when the game was started, when various strokes were used, and when the game ended.
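One way such a time-based transcript could be assembled from timestamped detections is sketched below; the input format and the transcript wording are illustrative assumptions:

```python
from typing import List, Optional, Tuple

def build_transcript(detections: List[Tuple[float, str]]) -> List[str]:
    """Render (timestamp, event) detections as a time-based transcript,
    noting when each event occurred and the interval between events."""
    lines: List[str] = []
    previous: Optional[float] = None
    for timestamp, label in sorted(detections):
        if previous is None:
            lines.append(f"[{timestamp:6.2f}s] {label}")
        else:
            lines.append(f"[{timestamp:6.2f}s] (+{timestamp - previous:.2f}s) {label}")
        previous = timestamp
    return lines

# Illustration with a simulated tennis game:
for line in build_transcript([(0.0, "game started"), (3.2, "serve"),
                              (4.9, "forehand return"), (61.5, "game ended")]):
    print(line)
```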
[0012] In some examples, the system 100 may include a visualization player 108. As used herein, visualization player 108 refers to a component within system 100 that generates a visual display of the time-based transcript generated by video analyzer 106. Put another way, visualization player 108 allows for display of the recorded stream of events correlated with a particular time-based transcript.
[0013] The system 100 may further include a controller 110. As described further herein, the controller 110 may perform a number of functions to generate a test case for a recorded stream of events. For instance, as illustrated in Figure 1, controller 110 may be coupled to the various components of system 100. As such, controller 110 may instruct any of the various components to initiate or stop a recording. Controller 110 may further receive the inputs of an individual component. Put another way, controller 110 may receive the video recorded stream of events from camera array 102, the recorded sequence of logical events from instruction recorder 104, the analysis of the recorded stream of events from video analyzer 106, and/or the visual display of the time-based transcript from visualization player 108.
[0014] Controller 110 may further correlate received inputs. As used herein, correlation refers to temporally aligning the time-based transcript, sequence of logical events, and stream of events, and merging the aligned time-based transcript, sequence of logical events, and stream of events into a single test case. In some cases, controller 110 may generate a test case corresponding to the received inputs. As used herein, a test case refers to a combination of a video recording and transcript that together describe a stream of events.

[0015] Figure 2 is a block diagram of an example system 200 for generating a test case for a recorded stream of events, consistent with the present disclosure. System 200 may include a number of components, as illustrated in Figure 2.
[0016] System 200 may include a camera array 202. Camera array 202 is analogous to camera array 102 shown in Figure 1. In some examples, the camera array 202 may be comprised of 3D sensors, although examples are not so limited and the camera array 202 may use other types of sensors. In some examples, camera array 202 may include moveable sensors. Camera array 202 may monitor test environment 212. Put another way, camera array 202 may record and/or monitor sequences of events occurring within test environment 212. As used herein, a test environment refers to the setting of a Human-Computer Interaction (HCI). A test environment may comprise a physical location within which a user may interact with system 200. However, examples are not so limited and test environment 212 may include any setting where HCI may occur.
[0017] Further, as illustrated in Figure 2, the system 200 may include an instruction recorder 204. Instruction recorder 204 is analogous to instruction recorder 104, shown in Figure 1. As described above, instruction recorder 204 may execute instructions associated with a stream of events. In such cases, controller 210 may direct instruction recorder 204 to execute instructions at a pre-determined time or in response to a pre-determined event. For example, in a simulated game of tennis, controller 210 may direct instruction recorder 204 to execute a forehand swing in response to a serve. Similarly, controller 210 may direct instruction recorder 204 to not execute a swing once five volleys have been completed. As illustrated in Figure 2, instruction recorder 204 may be located within test environment 212.

[0018] System 200 may further include a video analyzer 206. Video analyzer 206 is analogous to video analyzer 106, shown in Figure 1. Video analyzer 206 may analyze a stream of events recorded by camera array 202. In some examples, video analyzer 206 may analyze the stream of events to search for physical objects and user movement. Video analyzer 206 may further generate a transcript of the analyzed stream of events. As previously discussed, a transcript refers to a written account of the stream of events. The transcript may be time-based. Put another way, video analyzer 206 may construct a descriptive transcript of the recorded stream of events where the description includes the times at which individual events occurred and/or the intervals between events.
[0019] In some examples, video analyzer 206 may be coupled to a physical object directory 216. As used herein, a physical object directory refers to a database of physical elements that may be contained within an HCI environment. Physical object directory 216 may include descriptions and attributes of individual physical elements. During analysis, video analyzer 206 may determine that a physical object exists within the HCI environment. Video analyzer 206 may search physical object directory 216 to locate the detected physical object. For example, video analyzer 206 may detect a square object within the HCI environment. Upon detection of the square object, video analyzer 206 may connect with the physical object directory 216 to search for the object based on its attributes, such as its square shape. Physical object directory 216 may allow its database to be searched until the square-shaped physical object detected by video analyzer 206 is determined to be a table. Once the physical object is identified, video analyzer 206 may use the identification to virtually tag the object as a square table.
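A directory lookup of the kind described, matching a detected object's attributes (such as "square") against stored descriptions, might look like the following sketch; the directory contents and the attribute-overlap scoring rule are invented for illustration:

```python
from typing import Dict, Optional, Set

# Illustrative directory: object name -> descriptive attributes
PHYSICAL_OBJECT_DIRECTORY: Dict[str, Set[str]] = {
    "table": {"square", "flat-top", "four-legged"},
    "chair": {"square", "four-legged", "backed"},
    "tennis racket": {"oval", "strung", "handled"},
}

def identify(detected: Set[str]) -> Optional[str]:
    """Return the directory entry sharing the most attributes with the detection."""
    best, best_score = None, 0
    for name, attributes in PHYSICAL_OBJECT_DIRECTORY.items():
        score = len(detected & attributes)
        if score > best_score:
            best, best_score = name, score
    return best

print(identify({"square", "flat-top"}))  # -> "table", which can then be virtually tagged
```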
[0020] In some examples, video analyzer 206 may use a physical tag 214 located within test environment 212 to identify a physical object. As used herein, a physical tag refers to a material tag located within a test environment to identify a particular physical object or thing within the environment. For example, physical tag 214 may identify an object located within test environment 212 that is not included as part of physical object directory 216. In some examples, physical tag 214 may be used to identify specialized objects that are unique to a particular HCI.
[0021] The system 200 may further include a controller 210. Controller 210, shown in Figure 2, is analogous to controller 110, shown in Figure 1. Controller 210 may be coupled to other components of system 200, including camera array 202, instruction recorder 204, video analyzer 206, and/or visualization player 208. As such, controller 210 may instruct an individual component to initiate and/or stop a recording. Controller 210 may further receive the inputs of an individual component. Put another way, controller 210 may receive the video recorded stream of events from camera array 202, the recorded sequence of logical events from instruction recorder 204, the analysis of the recorded stream of events from video analyzer 206, and/or the visual display of the time-based transcript from visualization player 208.
[0022] Controller 210 may further correlate received inputs. As described herein, correlation may include temporally aligning the time-based transcript, the sequence of logical events, and the stream of events received, and merging the aligned time-based transcript, sequence of logical events, and stream of events into a single test case. Controller 210 may generate a test case corresponding to the received inputs. Put another way, controller 210 may combine video input received from camera array 202 and analyzed by video analyzer 206 with logic transcripts from instruction recorder 204 to create a test case.
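The correlation step, temporal alignment followed by a merge, can be pictured with the following sketch, which pairs each transcript entry with the nearest preceding logical event; the nearest-preceding-event rule and the clock-offset parameter are assumptions made for illustration:

```python
import bisect
from typing import List, Tuple

def correlate(
    transcript: List[Tuple[float, str]],      # (time, description) from the video analyzer
    logical_events: List[Tuple[float, str]],  # (time, name) from the instruction recorder
    clock_offset: float = 0.0,                # recorder clock minus analyzer clock
) -> List[Tuple[float, str, str]]:
    """Temporally align the two inputs and merge them into one event list."""
    aligned = sorted((t - clock_offset, name) for t, name in logical_events)
    times = [t for t, _ in aligned]
    merged = []
    for t, description in sorted(transcript):
        i = bisect.bisect_right(times, t) - 1
        nearest = aligned[i][1] if i >= 0 else "<no preceding event>"
        merged.append((t, description, nearest))
    return merged
```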
[0023] In some examples, the system 200 may include a visualization player 208. Visualization player 208 is analogous to visualization player 108 shown in Figure 1. Visualization player 208 may display the recorded stream of events in correlation with a particular time-based transcript. Visualization player 208 may further allow for editing of a series of inputs correlated by controller 210. For instance, visualization player 208 may allow sections of a recorded stream of events that are not relevant to the final test case to be deleted. Visualization player 208 may also allow fine-tuning of the correlation performed by controller 210. Put another way, visualization player 208 may enable a user to slightly alter the combined inputs to more accurately reflect the HCI that occurred.
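Editing operations like those just described, deleting an irrelevant section and nudging the alignment, reduce to simple transformations over the merged entries. This sketch assumes merged entries are (timestamp, description) pairs:

```python
from typing import List, Tuple

Entry = Tuple[float, str]  # (timestamp, description)

def delete_section(entries: List[Entry], start: float, end: float) -> List[Entry]:
    """Drop entries in [start, end), e.g. a stretch of the recording
    that is not relevant to the final test case."""
    return [e for e in entries if not (start <= e[0] < end)]

def fine_tune(entries: List[Entry], offset: float) -> List[Entry]:
    """Slightly shift timestamps to better align the combined inputs."""
    return [(t + offset, d) for t, d in entries]
```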
[0024] Figure 3 is a block diagram of an example system 300 for generating a test case for a recorded stream of events, consistent with the present disclosure. System 300 may include at least one computing device that is capable of communicating with at least one remote system. In the example of Figure 3, system 300 includes a processor 316 and a machine-readable storage medium 318.
Although the following descriptions refer to a single processor and a single machine-readable storage medium, the descriptions may also apply to a system with multiple processors and multiple machine-readable storage mediums. In such examples, the instructions may be distributed (e.g., stored) across multiple machine-readable storage mediums and may likewise be distributed (e.g., executed) across multiple processors.
[0025] Processor 316 may be one or more central processing units (CPUs), microprocessors, and/or other hardware devices suitable for retrieval and execution of instructions stored in machine-readable storage medium 318. In the particular example shown in Figure 3, processor 316 may receive, determine, and send instructions 320, 322, 324, 326, and 328 for generating a test case. As an alternative or in addition to retrieving and executing instructions, processor 316 may include one or more electronic circuits comprising a number of electronic components for performing the functionality of one or more of the instructions in machine-readable storage medium 318. With respect to the executable instruction representations (e.g., boxes) described and shown herein, it should be understood that part or all of the executable instructions and/or electronic circuits included within one box may, in alternate examples, be included in a different box shown in the figures or in a different box not shown.
[0026] Machine-readable storage medium 318 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions. Thus, machine-readable storage medium 318 may be, for example, Random Access Memory (RAM), an Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, and the like. Machine-readable storage medium 318 may be disposed within system 300, as shown in Figure 3. In this situation, the executable instructions may be "installed" on the system 300. Additionally and/or alternatively, machine-readable storage medium 318 may be a portable, external or remote storage medium, for example, that allows system 300 to download the instructions from the portable/external/remote storage medium. In this situation, the executable instructions may be part of an "installation package". As described herein, machine-readable storage medium 318 may be encoded with executable instructions for generating a test case for a recorded stream of events.
[0027] Referring to Figure 3, recording initiation instructions 320, when executed by a processor, such as processor 316, may cause system 300 to begin recording an HCI. For example, recording initiation instructions 320 may instruct the system 300 to initiate recording by the camera array. Recording initiation instructions 320 may further instruct the system 300 to begin recording instructions from the instruction recorder. In some examples, recording initiation instructions 320 may be relayed to individual components via the controller. In other examples, recording initiation instructions 320 may be relayed to individual components separately.
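Relaying the recording-initiation instruction through a controller could be as simple as the following sketch; the component interface (a start() method) is an assumption:

```python
from typing import List

class Component:
    """Stand-in for a recordable component such as a camera array."""
    def __init__(self, name: str) -> None:
        self.name = name

    def start(self) -> None:
        print(f"{self.name}: recording started")

class Controller:
    """Relays the start-recording instruction to each attached component."""
    def __init__(self, components: List[Component]) -> None:
        self.components = components

    def start_recording(self) -> None:
        for component in self.components:
            component.start()

Controller([Component("camera array"),
            Component("instruction recorder")]).start_recording()
```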
[0028] Event capture instructions 322, when executed by a processor, such as processor 316, may cause system 300 to capture the stream of events that are performed within an HCI. For example, event capture instructions 322 may instruct system 300 to capture a series of movements, wherein the movements are a result of an HCI. Event capture instructions 322 may further instruct system 300 to capture the stream of events in a particular format, such as video recording, or in a plurality of formats, such as video recording and instruction recording.
[0029] Receiving instructions 324, when executed by a processor, such as processor 316, may cause system 300 to receive the recorded stream of events and a sequence of logical events associated with the stream of events. Receiving instructions 324 may cause system 300 to receive the stream of events and sequence of logical events. In some examples, system 300 may receive the stream of events and sequence of logical events by the controller. Receiving instructions 324 may further cause system 300 to receive a time-based transcript of events based on the recorded stream of events. In some examples, receiving instructions 324 may instruct processor 316 to generate the time-based transcript of events. The time-based transcript of events may be generated by identifying a plurality of physical identification tags within a test environment, correlating each physical identification tag with a respective object, identifying physical objects without a physical identification tag by comparing the physical objects identified within the test environment with attributes of objects in the physical objects directory, and transcribing the recorded motions. A sketch of these transcript-generation steps appears after the following paragraph.

[0030] Correlation instructions 326, when executed by a processor, such as processor 316, may cause system 300 to correlate the recorded stream of events, the sequence of logical events, and a time-based transcript of events. Put another way, correlation instructions 326 may cause system 300 to combine the recorded stream of events, the sequence of logical events, and a time-based transcript of events. In some examples, correlation instructions 326 may instruct the controller to perform the correlation.
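The transcript-generation steps listed in paragraph [0029], identifying tagged objects, matching untagged objects against the physical objects directory, and transcribing recorded motions, might be sketched as follows; all data shapes here are illustrative assumptions:

```python
from typing import Dict, List, Set, Tuple

def generate_transcript(
    tag_sightings: Dict[str, str],     # physical tag id -> object it identifies
    untagged: List[Set[str]],          # attribute sets of untagged detections
    directory: Dict[str, Set[str]],    # physical objects directory
    motions: List[Tuple[float, str]],  # (timestamp, recorded motion)
) -> List[str]:
    lines: List[str] = []
    # Identify each physical identification tag and correlate it with its object.
    for tag_id, obj in tag_sightings.items():
        lines.append(f"tag {tag_id} identifies {obj}")
    # Identify untagged objects by comparing attributes against the directory.
    for attributes in untagged:
        match = max(directory, key=lambda name: len(attributes & directory[name]))
        lines.append(f"detected object identified as {match}")
    # Transcribe the recorded motions in time order.
    for timestamp, motion in sorted(motions):
        lines.append(f"[{timestamp:.2f}s] {motion}")
    return lines
```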
[0031] Test case generation instructions 328, when executed by a processor, such as processor 316, may cause system 300 to generate a test case of the HCI. In some examples, test case generation instructions 328 may instruct system 300 to generate a test case that may be used to verify the recorded stream of events based on the correlation performed in response to correlation instructions 326. In some examples, test case generation instructions 328 may further instruct system 300 to present an editing interface for editing and searching the generated test case.
[0032] Figure 4 illustrates an example method 430 for generating a test case consistent with the present disclosure. At 432, method 430 may include initiating a recording. The initiation of a recording at 432 may include initiating recording by an instruction recorder and/or a camera array, as discussed in relation to Figures 1-3.
[0033] At 434, method 430 may include recording a stream of events. As described above, the stream of events may be captured and recorded by a camera array. In some examples, recording a stream of events may include capturing physical objects located within the test environment in which the stream of events is occurring.
[0034] At 436, method 430 may include recording a sequence of logical events. As described above, the sequence of logical events may be recorded by an instruction recorder. In some examples, the stream of events and the sequence of logical events may be recorded simultaneously, such that the sequence of logical events corresponds to the stream of events.
[0035] At 438, method 430 may include generating a time-based transcript. The time-based transcript may correspond to the recorded stream of events and/or the recorded sequence of logical events. In some examples, the time-based transcript may be generated by a video analyzer. The video analyzer may analyze the stream of events recorded at 434 to generate the time-based transcript. In some examples, analysis of the stream of events by the video analyzer may include locating physical identification tags within the test environment recorded as part of the stream of events at 434.
[0036] At 440, method 430 may include receiving the time-based transcript, the sequence of logical events, and the stream of events. In some examples, the time-based transcript, the sequence of logical events, and the stream of events may be received by the controller. The controller may further correlate the time-based transcript, sequence of logical events, and stream of events received at 440. In some examples, correlation may include temporally aligning the time-based transcript, sequence of logical events, and stream of events received at 440 and merging the aligned time-based transcript, sequence of logical events, and stream of events into a single test case.
[0037] In the foregoing detailed description of the present disclosure, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration how examples of the disclosure may be practiced. These examples are described in sufficient detail to enable those of ordinary skill in the art to practice the examples of this disclosure, and it is to be understood that other examples may be utilized and that process, electrical, and/or structural changes may be made without departing from the scope of the present disclosure.
[0038] The figures herein follow a numbering convention in which the first digit corresponds to the drawing figure number and the remaining digits identify an element or component in the drawing. Elements shown in the various figures herein may be added, exchanged, and/or eliminated so as to provide a number of additional examples of the present disclosure. In addition, the proportion and the relative scale of the elements provided in the figures are intended to illustrate the examples of the present disclosure, and should not be taken in a limiting sense. Further, as used herein, "a number of" an element and/or feature can refer to one or more of such elements and/or features.
[0039] As used herein, "logic" is an alternative or additional processing resource to perform a particular action and/or function, etc., described herein, which includes hardware, e.g., various forms of transistor logic, application specific integrated circuits (ASICs), etc., as opposed to computer executable instructions, e.g., software, firmware, etc., stored in memory and executable by a processor.

Claims

What is claimed:
1. A system to generate a test case for a recorded stream of events, comprising:
a camera array including a plurality of multidimensional sensors to record a stream of events, wherein the stream of events occurs within an interactive computing environment;
an instruction recorder to record a sequence of logical events associated with the stream of events;
a video analyzer to analyze the recorded stream of events and produce a time-based transcript of the stream of events; and
a controller to correlate inputs from each of the camera array, the instruction recorder, and the video analyzer and generate a test case for verifying the recorded stream of events based on the inputs.
2. The system of claim 1, wherein the video analyzer analyzes the recorded stream of events for objects located within the interactive computing environment.
3. The system of claim 2, wherein the video analyzer uses a physical identification tag to locate an object within the interactive computing environment.
4. The system of claim 1, further comprising a visualization player to cause the generation of a visual display of the time-based transcript of events.
5. The system of claim 1, wherein the controller generates the test case by correlating the time-based transcript from the video analyzer, the sequence of logical events recorded by the instruction recorder, and the stream of events recorded by the camera array.
6. A non-transitory machine-readable medium storing instructions executable by a processor to:
initiate a recording within an interactive computing environment using a system controller, wherein the system controller is coupled to a camera array and an instruction recorder;
capture a stream of events using the camera array and the instruction recorder;
receive, by the controller, the stream of events and a sequence of logical events associated with the stream of events, wherein the sequence of logical events is received from the instruction recorder;
correlate, using the controller, the stream of events, the sequence of logical events, and a time-based transcript of events; and
generate a test case for verifying the recorded stream of events based on the correlation.
7. The non-transitory machine-readable medium of claim 6, further comprising instructions executable by the processor to generate the time-based transcript of events by analyzing the captured stream of events and identifying objects within the interactive computing environment.
8. The non-transitory machine-readable medium of claim 6, further comprising instructions executable by the processor to generate the time-based transcript of events by: identifying a plurality of physical identification tags within the interactive computing environment; and
correlating each of the plurality of physical identification tags with a different respective object identified within the interactive computing environment.
9. The non-transitory machine-readable medium of claim 6, further comprising instructions executable by the processor to generate the time-based transcript of events by transcribing a series of recorded motions within the stream of events.
10. The non-transitory machine-readable medium of claim 6, further comprising instructions executable by the processor to generate the time-based transcript of events by comparing objects in the stream of events with objects in a physical objects directory, wherein the physical objects directory contains descriptions and attributes of a plurality of physical objects.
11. The non-transitory machine-readable medium of claim 6, further comprising instructions executable by the processor to present an editing interface to edit and search the generated test case.
12. A method for generating a test case for a recorded stream of events, comprising:
initiating a recording using a system controller, wherein the system controller is coupled to a camera array and an instruction recorder;
recording a stream of events within an interactive computing environment, using the camera array;
recording a sequence of logical events associated with the stream of events using an instruction recorder;
generating a time-based transcript of events associated with the stream of events using a video analyzer; and
receiving, by the controller, the stream of events, the sequence of logical events, and the time-based transcript of events.
13. The method of claim 12, further comprising:
analyzing the stream of events using the video analyzer in response to receipt, by the controller, of the recorded stream of events and the sequence of logical events, wherein analyzing the stream of events includes locating a physical identification tag associated with an object within the interactive computing environment.
14. The method of claim 12, further comprising:
correlating, using the controller, the time-based transcript of events, the sequence of logical events, and the stream of events; and
generating a test case for verifying the recorded stream of events by correlating the stream of events, the sequence of logical events, and the time- based transcript of events.
15. The method of claim 14, wherein correlating the stream of events, the sequence of logical events, and the time-based transcript of events comprises: temporally aligning the time-based transcript of events, the sequence of logical events, and the video stream of events; and merging the stream of events, the sequence of logical events, and the time-based transcript of events into the test case.
PCT/US2016/015512 2016-01-29 2016-01-29 Generating a test case for a recorded stream of events Ceased WO2017131723A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2016/015512 WO2017131723A1 (en) 2016-01-29 2016-01-29 Generating a test case for a recorded stream of events

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2016/015512 WO2017131723A1 (en) 2016-01-29 2016-01-29 Generating a test case for a recorded stream of events

Publications (1)

Publication Number Publication Date
WO2017131723A1 (en)

Family

ID: 59398456

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/015512 Ceased WO2017131723A1 (en) 2016-01-29 2016-01-29 Generating a test case for a recorded stream of events

Country Status (1)

Country Link
WO (1) WO2017131723A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080183049A1 (en) * 2007-01-31 2008-07-31 Microsoft Corporation Remote management of captured image sequence
US20120042326A1 (en) * 2010-08-16 2012-02-16 Fujitsu Limited Identifying An Event Occurrence From Sensor Data Streams
US20140333775A1 (en) * 2013-05-10 2014-11-13 Robert Bosch Gmbh System And Method For Object And Event Identification Using Multiple Cameras
US20150154452A1 (en) * 2010-08-26 2015-06-04 Blast Motion Inc. Video and motion event integration system
US9137308B1 (en) * 2012-01-09 2015-09-15 Google Inc. Method and apparatus for enabling event-based media data capture


Similar Documents

Publication Publication Date Title
CN104794050B (en) The test method of application program, apparatus and system
US10162742B2 (en) System and method for end to end performance response time measurement based on graphic recognition
US20110304774A1 (en) Contextual tagging of recorded data
US20160171739A1 (en) Augmentation of stop-motion content
CN105955881A (en) Automated test step recording and playback method and apparatus
CN102419727A (en) Automatic testing method and device
CN104794032B (en) A method of realizing intelligent display hardware module automatic test
CN110175609A (en) Interface element detection method, device and equipment
RU2016139156A (en) AUTOMATED INTELLECTUAL DATA COLLECTION AND VERIFICATION
US12161942B2 (en) Videogame telemetry data and game asset tracker for session recordings
CN109388556A (en) A kind of analysis method and device of test process
US20220223067A1 (en) System and methods for learning and training using cognitive linguistic coding in a virtual reality environment
CN109766697A (en) Vulnerability scanning method, storage medium, equipment and system applied to linux system
CN112433948A (en) Simulation test system and method based on network data analysis
CN114546814A (en) Recording playback method, recording playback device and storage medium
US9229846B1 (en) Testing application code changes using a state assertion framework
CN105718353A (en) Testing method, apparatus and system for graphic interface application
Banerjee et al. Object tracking test automation using a robotic arm
CN114120382B (en) Face recognition system testing method and device, electronic equipment and medium
CN110569184B (en) Test method and terminal equipment
WO2017131723A1 (en) Generating a test case for a recorded stream of events
CN110361704B (en) A system and method for playback and analysis of sonar sea test data
WO2015042987A1 (en) Record and replay of operations on graphical objects
US20150039951A1 (en) Apparatus and method for acquiring data of fast fail memory
CN111290945B (en) Compatibility test system and method for vehicle-mounted application equipment to externally connected storage equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16888453

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16888453

Country of ref document: EP

Kind code of ref document: A1