
WO2016015220A1 - Executable code abnormality detection - Google Patents

Executable code abnormality detection

Info

Publication number
WO2016015220A1
Authority
WO
WIPO (PCT)
Prior art keywords
test
code
executable
executable code
test code
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2014/083218
Other languages
English (en)
Inventor
Yiqun REN
Haiying Liu
Bo Wang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hewlett Packard Development Co LP
Original Assignee
Hewlett Packard Development Co LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett Packard Development Co LP filed Critical Hewlett Packard Development Co LP
Priority to US15/326,326 priority Critical patent/US10459830B2/en
Priority to PCT/CN2014/083218 priority patent/WO2016015220A1/fr
Publication of WO2016015220A1 publication Critical patent/WO2016015220A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Prevention of errors by analysis, debugging or testing of software
    • G06F11/3668Testing of software
    • G06F11/3672Test management
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Prevention of errors by analysis, debugging or testing of software
    • G06F11/3668Testing of software
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Prevention of errors by analysis, debugging or testing of software
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Prevention of errors by analysis, debugging or testing of software
    • G06F11/3668Testing of software
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Definitions

  • tests are run against the executable code.
  • the testing of the executable code can be performed manually by a user.
  • one type of test that can be performed by the user is a monkey test, where the user can randomly operate controls of the executable code to determine whether the executable code exhibits some abnormality, e.g., a crash of the executable code or other abnormal behavior.
  • Figure 1 illustrates a diagram of an example of a system for executable code abnormality detection consistent with the present disclosure.
  • Figure 2 illustrates a diagram of an example computing device consistent with the present disclosure.
  • Figure 3 is a block diagram of an example test system consistent with the present disclosure.
  • Figure 4 illustrates a diagram of an example user interface (UI) screen consistent with the present disclosure.
  • Figure 5 illustrates an example flow chart of a method for executable code abnormality detection consistent with the present disclosure.
  • Figure 6 illustrates an example flow chart of a method for executable code abnormality detection consistent with the present disclosure.
  • Figure 7 illustrates a diagram of an example UI screen for executable code abnormality detection consistent with the present disclosure.
  • Figure 8 illustrates a diagram of an example UI screen with an abnormal UI consistent with the present disclosure.
  • An "abnormality" of executable code can refer to a crash of the executable code, a fault of the executable code, or other unexpected behavior exhibited by the executable code during a test.
  • a monkey test can be relatively time consuming, especially if the monkey test is performed manually by a user.
  • a monkey test is a particular test of a particular type that, when executed by a processing resource, performs a test on executable code.
  • a monkey test may be capable of being performed by a test application.
  • a test application for testing executable code includes various random actions that can be performed with respect to features of an executable code that is under test.
  • Executable code can be an executable file or program that causes a computer to perform indicated tasks according to encoded instructions. Examples of executable code can include applications, operatingsystems, device drivers, firmware, or other machine-readable instructions.
  • the features of the executable code that may be capable of being controlled in a test application can include user interface (UI) control elements that are activatable (e.g., may be capable of being selected) by a user.
  • UI control elements that are activatable by a user are referred to as user-activatable control elements.
  • the UI control elements can be graphical user interface (GUI) control elements, or any other control elements presented by the executable code for user activation.
  • the user-activatable control elements may be capable of determining an execution sequence of the executable code.
  • the UI control elements can include buttons that are clickable by users, text boxes in which users can enter text, and drop-down menus activatable by a user to cause presentation of a menu including various control items that are selectable by a user, for example.
  • the combination of a drop-down menu and a text box is referred to as a combo box; that is, an item that can be selected in one of two ways: from the drop-down menu, or by entering the item directly in the text box.
  • a text box is referred to as an edit box.
  • the executable code may be capable of being operated in an unplanned manner to identify abnormalities.
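The unplanned, random operation described above can be sketched as a simple loop: repeatedly pick a random user-activatable control, operate it, and treat any raised exception as an abnormality. This is an illustrative sketch, not the disclosure's implementation; the `controls` list and `crash` callback are hypothetical stand-ins for real UI automation hooks.

```python
import random

def monkey_test(controls, steps=500, seed=None):
    """Randomly operate user-activatable controls for a fixed number of
    steps (a simple stopping criterion). Any exception raised by the code
    under test is treated as an abnormality, and the action log is kept
    so the failing sequence can be replayed later."""
    rng = random.Random(seed)
    log = []
    for _ in range(steps):
        name, action = rng.choice(controls)   # pick a random control
        log.append(name)
        try:
            action()                          # operate the control
        except Exception as exc:              # crash or other abnormality
            return {"abnormal": True, "log": log, "cause": repr(exc)}
    return {"abnormal": False, "log": log}

def crash():
    raise RuntimeError("simulated crash")

# Hypothetical controls: one well-behaved button and one that crashes.
controls = [("ok_button", lambda: None), ("bad_button", crash)]
result = monkey_test(controls, steps=200, seed=1)
```

With the crashing control present, the loop returns as soon as the abnormality occurs, and `result["log"]` records the action sequence that triggered it.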
  • Figure 1 illustrates a diagram of an example of a system 100 for executable code abnormality detection consistent with the present disclosure.
  • the system 100 can include a database 101 accessible by and in communication with a plurality of executable code abnormality detection engines 102.
  • the executable code abnormality detection engines 102 can include a test code engine 103, and a reporting engine 104.
  • the system 100 can include additional or fewer engines than illustrated to perform the various functions described herein, and embodiments are not limited to the example shown in Figure 1.
  • the system 100 can include hardware, firmware, and programming, which in cooperation can form a computing device as discussed in connection with Figure 2.
  • hardware can be in the form of transistor logic and/or application-specific integrated circuits (ASICs).
  • programming can be in the form of machine-readable and executable instructions (e.g., program instructions or programming) stored in a machine-readable medium (MRM).
  • the plurality of engines can include a combination of hardware and programming(e.g., program instructions), but at least includes hardware, that is configured to perform particular functions, tasks and/or actions.
  • the engines shown in Figure 1 may be capable of being used to detect user-activatable control elements of an executable code to be tested, generate a first test code based on the user-activatable control elements, and generate a second test code based on a script for the executable code to be tested.
  • the test code engine 103 can include hardware and/or a combination of hardware and program instructions to detect user-activatable control elements of an executable code to be tested, generate a first test code based on the user-activatable control elements, and generate a second test code based on a script for the executable code to be tested, the first and the second test codes containing instructions executable by a test application to test the executable code.
  • the test code engine may be capable of generating the second test code by randomly inserting additional commands into the script, as discussed further herein.
  • the test code engine may be capable of generating the second test code by disordering commands in the script.
  • the test code engine 103 can include hardware and/or a combination of hardware and program instructions to execute the first test code and the second test code using a test application.
  • the reporting engine 104 can include hardware and/or a combination of hardware and program instructions to report a detected abnormality of the executable code.
  • Examples of the present disclosure are not limited to the example engines shown in Figure 1 and one or more engines described may be combined or be a sub-engine of another engine. Further, the engines shown may be remote from one another in a distributed computing environment, cloud computing environment, etc.
  • Figure 2 illustrates a diagram of an example computing device 208 consistent with the present disclosure.
  • the computing device 208 can utilize hardware, programming (e.g., program instructions), firmware, and/or logic to perform a number of functions described herein.
  • the computing device 208 can be any combination of hardware and program instructions configured to share information.
  • the hardware can include a processing resource 209 and/or a memory resource 211 (e.g., a computer- or machine-readable medium).
  • a processing resource 209 can include one or more processors capable of executing instructions stored by a memory resource 211.
  • the processing resource 209 may be implemented in a single device or distributed across multiple devices.
  • the program instructions, e.g., computer- or machine-readable instructions (CRI/MRI), can be stored on the memory resource 211.
  • the memory resource 211 can include instructions that when executed by the processing resource 209, may be capable of executing a first test code and a second test code using a test application.
  • the memory resource 211 can be a non-transitory machine-readable medium, including one or more memory components capable of storing instructions that may be capable of being executed by processing resource 209 and may be integrated in a single device or distributed across multiple devices. Further, memory resource 211 may be fully or partially integrated in the same device as processing resource 209 or it may be separate but accessible to that device and processing resource 209. Thus, it is noted that computing device 208 may be implemented on a participant device, on a server device, on a collection of server devices, and/or a combination of a participant (e.g., user) device and one or more server devices as part of a distributed computing environment, cloud computing environment, etc.
  • the memory resource 211 can be in communication with the processing resource 209 via a communication link (e.g., a path) 210.
  • the communication link 210 may be capable of providing a wired and/or wireless connection between the processing resource 209 and the memory resource 211.
  • the memory resource 211 can include a test code module 213, and a reporting module 214.
  • a “module” can include hardware and programming (e.g., program instructions) but includes at least program instructions that can be executed by a processing resource, such as processing resource 209, to perform a particular task, function and/or action.
  • the plurality of modules 213, 214 can be independent modules or sub-modules of other modules.
  • the test code module 213, and the reporting module 214 can be individual modules located on one memory resource or can be located at separate and distinct memory resource locations, such as in a distributed computing environment, cloud computing environment, etc.
  • Each of the plurality of modules 213, 214 can include instructions that when executed by the processing resource 209 may be capable of functioning as a corresponding engine as described in connection with Figure 1.
  • the test code module 213 can include instructions that when executed by the processing resource 209 may be capable of functioning as the test code engine 103 shown in Figure 1.
  • the reporting module 214 can include instructions that when executed by the processing resource 209 may be capable of functioning as the reporting engine 104 shown in Figure 1.
  • the test code module 213 may be capable of detecting user-activatable control elements of an executable code to be tested. Further, the test code module 213 may be capable of generating a first test code based on the detected user-activatable control elements, and generating a second test code based on a script for the executable code to be tested. In some examples, the test code module 213 may be capable of generating the first test code based on at least one test rule, as discussed further herein.
  • the at least one test rule can include priorities assigned to the respective user-activatable control elements and implementation probabilities assigned to the respective user-activatable elements.
  • the test code module 213 may be capable of executing the first and the second test codes using a test application.
  • the test application is thereby capable of performing a test of the executable code, and may be capable of detecting an abnormality of the executable code during the test.
  • the test code module 213 may be capable of iterating the generation of the first and the second test codes and the execution of the first and second test codes until a stopping criterion is satisfied.
  • the test code module 213 may be capable of using a translation string map to detect untranslated strings in the executable code.
  • a translation string map may be capable of mapping a string written in a specific language to the string in a different language and vice versa.
  • for example, the translation string map may be capable of mapping a string written in English to a string written in Japanese.
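As a small, hypothetical sketch of this idea, a translation string map can be scanned against the strings an executable presents: any UI string that still matches a source-language entry in a supposedly localized build is flagged as untranslated. The map contents and UI strings below are invented for illustration.

```python
# Hypothetical English -> Japanese translation string map.
TRANSLATION_MAP = {
    "File": "ファイル",
    "Edit": "編集",
    "Help": "ヘルプ",
}

def find_untranslated(ui_strings, translation_map):
    """Return the UI strings that still appear in their source-language
    form, i.e. strings that were never translated in the localized build."""
    sources = set(translation_map)            # source-language strings
    return [s for s in ui_strings if s in sources]

# A localized UI where "Help" was left untranslated.
ui_strings = ["ファイル", "編集", "Help"]
```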
  • executing the first test code causes performance of actions including automatic operation of elements of the executable code.
  • the reporting module 214 may be capable of, in response to the test code engine detecting an abnormality in the executable code, saving instructions of the first and the second test codes to allow a replay of the first and the second test codes to identify a cause of the detected abnormality.
  • Embodiments are not limited to the example modules shown in Figure 2 and in some cases a number of modules can operate together to function as a particular engine. Further, the engines and/or modules of Figures 1 and 2 can be located in a single system and/or computing device or reside in separate and distinct locations in a distributed network, computing environment, cloud computing environment, etc.
  • Figure 3 is a block diagram of an example test system 320 consistent with the present disclosure.
  • the test system 320 includes a test application 321, which may be capable of performing a test by applying a test code 322 with respect to an executable code under test 323.
  • the test application 321 can include an interface to the executable code under test 323 that allows the test application 321 to identify and operate the UI control elements of the executable code under test 323.
  • the test application 321 may be capable of detecting UI control elements generated by the executable code under test 323.
  • the executable code under test 323 can generate one or multiple windows, where each window can include one or multiple UI control elements that may be capable of being operated by the test application 321.
  • detecting a UI control element by the test application 321 can include identifying UI control elements not previously known to a user of the test application 321.
  • the detected UI control elements may be capable of being stored as detected UI control elements 324 in a storage medium 325.
  • the test system 320 further includes a test generator 326, which, using the test code engine 103 and/or the test code module 213, may be capable of generating the test code 322 based on the detected UI control elements 324.
  • test code refers to a collection of machine-readable program instructions that are executable to perform a particular test.
  • the test code 322 can be provided as an input to the test application 321.
  • although the test application 321 and test generator 326 are depicted as being separate in Figure 3, it is noted that the test generator 326 may be capable of being part of the test application 321 in other examples.
  • test rule information 327 is used in generating the test code 322 based on the detected UI control elements 324, and the test generator 326 may be capable of generating a test code that includes random actions to be performed in the test application.
  • test rule information 327 specifies the actions that are to be performed with respect to the detected UI control elements 324.
  • the test rule information 327 can include one or multiple test rules.
  • each test rule may be capable of specifying a specific action or type of action that is to be taken, and the corresponding relative frequency of operation.
  • the relative frequency of operation of each test rule can be expressed as a percentage or as another value.
  • a first test rule can specify that a first action is to be performed with a 90% relative frequency of operation.
  • a second rule can specify that a second action is to be performed with a different relative frequency of operation.
  • the first action can involve a first task or some combination of tasks to be performed with respect to one or multiple UI control elements.
  • the second action can specify a second task or some combination of tasks to be performed with respect to another one or multiple UI control elements.
  • the test rule information 327 may be capable of being modified, such as by adding more test rules for additionally identified UI control elements, or modifying existing test rules to account for the additionally identified UI control elements.
  • the following is an example of a test rule consistent with the present disclosure, which may be capable of being used to generate a first test code.
  • the test rule set forth below specifies both probabilities and priorities associated with different types of UI control elements:
  • a higher priority number indicates a higher priority associated with the particular type of UI control element relative to the priority associated with other UI control elements.
  • the priority of "9" is the highest priority and is associated with the TextBox control element type.
  • the priority of "1" is the lowest priority and is associated with the button, and button text in the cancel and help button UI control element type.
  • the priority associated with the TextBox UI control element type is higher than the priority associated with the button and button text UI control element type.
  • the first column of the foregoing table specifies the implementation probability for the respective UI control element type.
  • the second column specifies the corresponding priority.
  • the third column identifies the UI control element type.
  • the first row of the foregoing table relates to a first group of buttons, including an "OK" button, a "Submit" button, an "Add to Cart" button, among other buttons.
  • the last row of the foregoing table relates to a second group of buttons, including the "Cancel" button, the "Help" button, and so forth.
  • the implementation probability in the first row is different from the implementation probability in the last row for the different groups of buttons.
  • the other rows of the foregoing table relate to other types of control elements.
  • test code can be generated for a specific window, such as the window of a web page or other type of window.
  • the process of generating test code for a specific window can be repeated for each window of an executable code under test 323.
  • the test rule(s) in the test rule information 327 may be capable of being expressed in a non-program format. In other words, each test rule is not expressed in a program code, such as a block of program instructions.
  • the test rules of the test rule information 327 may be capable of being in the form of a list of parameters and respective assigned values, where the parameters and respective assigned values may be capable of indicating corresponding actions to be performed with respect to the UI control elements of the executable code under test.
  • the test rules can be provided in a table, for example.
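The non-program, table-like representation described above might look like the following sketch: a plain data structure of implementation probabilities and priorities keyed by control element type. The specific probability values are hypothetical, since the disclosure's own table is not reproduced in this text; only the TextBox priority of 9 and the Cancel/Help-button priority of 1 are taken from the surrounding description.

```python
# Hypothetical test-rule table: implementation probability and priority per
# UI control element type. Only the priorities 9 (TextBox) and 1
# (Cancel/Help buttons) come from the text; the rest are illustrative.
TEST_RULES = {
    "textbox":            {"probability": 1.0, "priority": 9},
    "combobox":           {"probability": 0.6, "priority": 4},
    "button_ok_submit":   {"probability": 0.8, "priority": 5},
    "button_cancel_help": {"probability": 0.2, "priority": 1},
}

def control_types_by_priority(rules):
    """Order control element types by descending priority, the order in
    which first-test-code generation iterates the control list."""
    return sorted(rules, key=lambda t: rules[t]["priority"], reverse=True)
```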
  • the test generator 326 may be capable of generating a second test code 322 based on a script for the executable code to be tested.
  • the script can be an ordered list of commands to be executed by the executable code to be tested 323.
  • the script can be an ordered list of commands to be executed by an application.
  • An example is illustrated in Figure 4.
  • a script can be a pre-defined script, where the ordered list of commands is previously defined and written in a programming language executable by a processor.
  • Figure 4 illustrates a diagram of an example UI screen consistent with the present disclosure.
  • a script for the page can require that the user enter their name in a name field 433, enter their email in an email field 434, re-enter their email in a second email field 435, enter their mobile number in a mobile phone number field 436, enter a new
  • the test generator 326 may be capable of generating test code 322 to randomly insert additional steps into these pre-defined scripts. For example, the test generator 326 may be capable of randomly inserting a step to visit the link "conditions of use” 441 before completing the information in fields 433-436. In another example, the test generator 326 may be capable of inserting a step to visit the link "learn more" 442 before completing the information in fields 437 and 438.
  • the test generator 326 may be capable of generating a test code 322 by randomly inserting additional steps into a pre-defined script.
  • for example, the test generator 326 may be capable of randomly inserting a step to open the "Conditions of Use" link 440, the "Privacy Notice" link 441, or the "Learn more" link 442, into the script for registering a user. If, in response to opening links 440, 441, and/or 442, an additional window opens on the UI of the executable code under test, then the test code 322 may be capable of including instructions to close the additional window.
  • test code 322 may be capable of including instructions to revert to the prior URL.
  • the test application 321 may be capable of emulating the behavior of a user additionally clicking on links on the user-registry web page 432.
  • the test generator 326 using the test code engine 103 and/or the test code module 213, may be capable of generating a test code 322 by disordering the ordered list of commands in the script.
  • the test code 322 may be capable of including instructions to complete fields 437 and 438, then field 436.
  • the test code 322 may be capable of including instructions to click the "create account" link 439 before fields 434 and 435.
  • the test application 321 may be capable of emulating the behavior of a user not completing processes in the correct order.
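The two mutations described above, randomly inserting extra steps and disordering the command sequence, can be sketched as follows. The registration script is a hypothetical stand-in for the web page of Figure 4; this is an illustrative sketch, not the disclosure's implementation.

```python
import random

def insert_random_steps(script, extra_steps, rng):
    """Return a copy of the ordered script with extra steps (e.g. opening
    the "Learn more" link) spliced in at random positions."""
    mutated = list(script)
    for step in extra_steps:
        mutated.insert(rng.randrange(len(mutated) + 1), step)
    return mutated

def disorder(script, rng):
    """Return a copy of the script with its commands shuffled, emulating
    a user who does not complete the steps in the intended order."""
    mutated = list(script)
    rng.shuffle(mutated)
    return mutated

# Hypothetical pre-defined registration script.
script = ["enter name", "enter email", "re-enter email",
          "enter mobile number", "click create account"]
rng = random.Random(7)
with_extra = insert_random_steps(script, ["open 'Learn more' link"], rng)
shuffled = disorder(script, rng)
```

Both helpers copy the script, so the original pre-defined script is kept intact for the next test iteration.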
  • the test application 321 may be capable of using the test code 322 to test the executable code under test 323.
  • the test code 322 may be capable of specifying tasks of the test that include activation of UI control elements of the executable code under test 323 in a random or ad-hoc manner.
  • the executable code under test 323 may be capable of providing an output, which may be capable of being monitored by the test application 321 to determine whether the executable code under test 323 is exhibiting expected or abnormal behavior.
  • the test application 321 may be capable of recording test instructions (such as instructions of the test code 322) that relate to actions of the test that caused the abnormality. For example, if the test performed based on the test code 322 produces an abnormality of the executable code under test 323, then the test code 322 can be copied to a persistent file (e.g., 328) for later access to replay the test.
  • the recorded test instructions can be stored in a persistent file 328 in the storage medium 325.
  • the recorded test instructions can include a subset (e.g., less than all) or all of the instructions in the test code 322. The recorded test instructions can be used later to replay a test that caused the abnormality of the executable code under test 323.
  • the test code 322 that is created by the test generator 326 may be capable of being stored in a temporary file. If the test based on the test code 322 produces an expected behavior of the executable code under test 323, then the temporary file including the test code 322 can be discarded.
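The temporary-file flow described above, recording instructions while the test runs and promoting them to a persistent file only when an abnormality is detected, can be sketched like this. `run_test` is a hypothetical callback that reports whether the executable code under test behaved abnormally.

```python
import os
import shutil
import tempfile

def run_and_record(test_instructions, run_test, persist_path):
    """Write the test instructions to a temporary file, run the test, and
    copy the instructions to a persistent file for later replay only if an
    abnormality is detected. The temporary file is always discarded."""
    fd, tmp_path = tempfile.mkstemp(suffix=".test")
    try:
        with os.fdopen(fd, "w") as tmp:
            tmp.write("\n".join(test_instructions))
        if run_test(test_instructions):           # abnormality detected
            shutil.copy(tmp_path, persist_path)   # keep for replay
            return persist_path
        return None                               # expected behavior
    finally:
        os.remove(tmp_path)                       # discard temporary file
```

The persistent copy plays the role of file 328: it holds the exact instruction sequence needed to replay the failing test.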
  • the test generator 326, test code 322, executable code under test 323, and test application 321 can be machine-readable instructions executable on one or multiple processors 329.
  • the processor(s) 329 can be coupled to the storage medium 325.
  • the test system 320 can include one computer or multiple computers.
  • the test code 322 may be capable of being generated in a first computer, and sent to a second computer to test an executable code under test.
  • Figure 5 illustrates an example flow chart of a method 545 for executable code abnormality detection consistent with the present disclosure.
  • the method 545 can include generating a first test code based on detected user-activatable control elements of the executable code to be tested, using the test code engine 103 and/or the test code module 213.
  • the method 545 can include generating a second test code based on a script for the executable code to be tested, using the test code engine 103 and/or the test code module 213.
  • the script can be a predefined script, having an ordered list of commands to be executed by the executable code under test. Generating the second test code can include randomly inserting additional commands to the script, as described in relation to Figure 4. Similarly, generating the second test code can include disordering the ordered list of commands in the script, also as described in relation to Figure 4.
  • the method 545 includes executing the first and the second test codes to test the executable code under test 323, using the test code engine 103 and/or the test code module 213. Executing the first test code causes performance of actions including automatic operation of elements of the executable code.
  • Executing the second test code causes additional steps to be inserted into the pre-defined script, and/or the steps in the pre-defined script to be disordered.
  • the first and the second test codes can be executed by the test application 321 in the alternative, or together. That is, either one of the first or second test codes can be executed by the test application 321, or both the first and the second test codes can be executed by the test application 321.
  • a translation string map can also be used to detect abnormalities in the executable code to be tested, as discussed further in relation to Figure 2.
  • the method 545 can further include receiving output from the executable code under test 323 during the test (not shown in Figure 5).
  • the output can indicate that the executable code under test 323 is either behaving as expected or is exhibiting an abnormality.
  • the method 545 includes saving (at 549) instructions from the executed first test code and the executed second test code to allow replay of the test and to identify a cause of the abnormality, using the reporting engine 104 and/or the reporting module 214.
  • Figure 6 illustrates an example flow chart of a method 645 for executable code abnormality detection consistent with the present disclosure.
  • the method 645 includes starting the test application 321 (Figure 3).
  • the test application 321 may be capable of presenting a user with a list of different executable code for testing. From the list, the user can select an executable code to test.
  • the test application 321 may be capable of launching the selected executable code under test.
  • the method 645 can include selecting the respective test rule information (for the executable code under test). There can be different test rule information for different executable codes under test.
  • the test application 321 may be capable of detecting the UI control elements of a window (or windows) of the selected executable code under test.
  • the detection process can be a continual process that is performed during the test of the selected executable code under test. As new windows are opened during the test, the test application 321 may be capable of detecting the UI control elements of the new windows.
  • the test generator 326 may be capable of generating a first test code based on the selected test rule information and the detected UI control elements, and a second test code based on a script for the executable code to be tested.
  • the second test code may be capable of being generated by inserting additional commands into the script at 659, and disordering commands in the script at 660.
  • the first test code and/or the second test code may be capable of being used by the test application 321 to perform a test of the selected executable code under test.
  • the various random operations of the test may be capable of being stored into a temporary file at 663.
  • the instructions of the first test code and/or the second test code may be capable of being recorded into the temporary file.
  • the test application 321 may be capable of performing a test using the first test code and the second test code (e.g., executing the first and the second test codes to test the executable code under test 323), as discussed in relation to Figures 1-5.
  • the method 645 can further include receiving output from the executable code under test 323 during the test (not shown in Figure 6).
  • the test application 321 can receive output from the executable code under test 323 during the test.
  • the output may be capable of indicating that the executable code under test 323 is either behaving as expected or is exhibiting an abnormality.
  • the method 645 includes recording instructions from the executed first test code and the executed second test code to allow replay of the test and to identify a cause of the abnormality.
  • the test application 321 may be capable of converting the recorded operations in the temporary file into a formal persistent test file at 665.
  • the test application 321 may be capable of notifying a user of the detected abnormality.
  • the notification can include an identification of the formal persistent test file, such as a pathname or uniform resource locator (URL) of the formal persistent test file.
  • the user may be capable of then using the formal persistent test file to replay the test to allow identification of a cause of the detected abnormality.
  • the method 645 can include determining if a stopping criterion has been satisfied at 667.
  • a stopping criterion can specify that the test is to perform a specified number (e.g., 500, 1000, 5000, etc.) of steps.
  • the stopping criterion can specify that the test is to be performed for a specified amount of time. If the stopping criterion is satisfied (e.g., the specified number of steps has been performed or the specified amount of time has passed), the test application 321 stops the test at 668, and discards the temporary file at 669.
  • if the stopping criterion is not satisfied, the method 645 continues; for instance, if a new window of the selected executable code under test is opened, the UI control elements of the new window are identified at 658, and a further test code may be capable of being generated at 661 for use in further phases of the test.
  • Figure 7 illustrates a diagram of an example UI screen 780 for executable code abnormality detection consistent with the present disclosure.
  • Ul screen 780 hasvarious Ul control elements of an executable code under test, including a Usernametext box 781 (in which a user name can be entered), a
  • the Ul screen 780 can be a Login Ul screen in which a user may be capable of logging into the executable code under test. In other examples, different Ul screens may be capable of being used.
  • generating a first test code can include detecting all of the control elements in the respective window.
  • the detected control elements may be capable of being sorted according to the specified priorities of the control elements, such as the priorities listed in the rule set forth above.
  • the sorted control elements may be capable of being provided in a control list.
  • the sorting can be in descending order, for example.
  • Generating the first test code can include iterating through each control element in the control list, in an order of the sorting. As long as there is a further control element in the control list to process, the first test code generation process generates a random number.
  • The first test code generation process determines whether the generated random number is less than or equal to the implementation probability (multiplied by 100) that corresponds to the current control element that is being processed. If so, a respective test instruction is generated for the current control element and added to a temporary script file (which corresponds to the first test code that is being generated). However, if the random number is not less than or equal to the implementation probability (multiplied by 100), a test instruction is not generated.
  • a test instruction is programming that instructs the test application, using the test code engine 103 and/or the test code module 213, to perform a test.
  • the first test code generation process iterates to the next control element in the control list. If the end of the control list is reached, the first test code generation process returns.
  • the output of the first test code generation process is the temporary script file (including various test instructions generated for respective control elements) that provides the first test code that may be capable of being executed by the test application.
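The generation process just described can be sketched as follows; the control types, priorities, and implementation probabilities in `CONTROL_RULES` are made-up examples standing in for the user-configured rule, not values from the disclosure:

```python
import random

# Illustrative priorities and implementation probabilities per control type;
# in the described process these values come from a configured rule set,
# not from hard-coded numbers.
CONTROL_RULES = {
    # type: (priority, implementation_probability)
    "button":   (3, 0.9),
    "textbox":  (2, 0.7),
    "checkbox": (1, 0.5),
}

def generate_first_test_code(controls, rules=CONTROL_RULES, rng=random):
    """Sort detected controls by priority (descending) and emit one test
    instruction per control whose random draw falls within its probability."""
    # sort the detected control elements into a control list
    control_list = sorted(
        controls, key=lambda c: rules[c["type"]][0], reverse=True)
    script = []  # temporary script file contents (the first test code)
    for control in control_list:
        probability = rules[control["type"]][1]
        # draw a number in 1..100 and compare with probability * 100
        if rng.randint(1, 100) <= probability * 100:
            script.append(f"exercise {control['type']} '{control['name']}'")
    return script
```

A control with implementation probability 0.9 thus receives a test instruction in roughly 90% of generation runs, which is how the rule biases the random test toward high-priority controls.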
  • a user may be capable of specifying which of multiple abnormalities are to be monitored, and may be capable of specifying respective different actions to take to handle the corresponding abnormal statuses.
  • a first abnormal status can be a crash of the executable code under test. The action specified to handle such a crash can be to stop the test and to collect useful information in a dump file.
  • Another abnormality can be an error
  • a recovery operation may be capable of being run, and the error may be capable of being ignored following successful performance of the recovery operation.
  • the abnormality can include a globalization defect such as an untranslated string, a wrong data format, or a truncation issue.
  • an untranslated string may be capable of being detected using a translation string-map.
  • a wrong data format may be capable of being identified based on a pre-defined data-format table.
  • a truncation issue is an abnormality in the Ul of the executable code under test that erroneously truncates a portion of a string of text.
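For the untranslated-string case, detection against a translation string-map might be sketched like this; the map layout (source string mapped to translated string) is an assumption for illustration:

```python
def find_untranslated(ui_strings, translation_map):
    """Return UI strings that still equal a source-language entry in the
    translation string-map even though a different translation exists,
    i.e. strings that were apparently never translated."""
    return [s for s in ui_strings
            if s in translation_map and translation_map[s] != s]
```

A wrong data format could be flagged analogously by matching displayed values against entries of a pre-defined data-format table.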
  • the abnormality can include an abnormal Ul.
  • components of the Ul may incorrectly overlap, as illustrated in Figure 8.
  • Figure 8 illustrates a diagram of an example Ul screen 880 with an abnormal Ul consistent with the present disclosure.
  • the OK button 883 overlaps the airplane illustration 886, indicating an abnormal Ul.
  • An abnormal status rule is a rule that provides information used in identifying at least one abnormality.
  • the abnormal status rule in the table above specifies four abnormal cases.
  • the four columns of the table containing the abnormal status rule include a Target column (which identifies the element of the executable code under test 323).
  • the second column identifies a property of the target element
  • the thirdcolumn specifies a value of the property
  • the fourth column includes a commentexplaining the respective abnormal status.
  • the target is a process(which is a process of the executable code under test 323).
  • the test application 321 may be capable of checking the abnormal status rule to determine whether an abnormality is present. For example, the test application 321 may check whether the PID of the process of the executable code under test 323 has disappeared (e.g., has the value of
  • the test application 321 may be capable of determining whether a title of a popup window (which has popped up due to execution of the executable code under test 323) is the text "<Process Name>" and the content of the popup window contains the text "<Process Name> has stopped working" (which correspond to the second row of the table containing the abnormal status rule).
  • the foregoing match indicates that an abnormality has occurred.
  • the test application 321 may be capable of determining whether a title of a popup window (which has popped up due to execution of the executable code under test 323) is the text "Unexpected Error" (which corresponds to the third row of the table containing the abnormal status rule). The foregoing match indicates that an abnormality has occurred.
  • the fourth abnormal case (corresponding to the last row of the table containing the abnormal status rule) relates to a web browser not being able to display a webpage, such as due to a uniform resource locator (URL) not being available.
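A simplified sketch of evaluating such a rule table follows; the rule tuples paraphrase the four cases above (the crash-popup case is reduced to a title match), and the `status` layout is an assumed observation format, not the patent's data structure:

```python
# Each rule is (target, property, expected value, comment), mirroring the
# four abnormal cases described above; the concrete values are illustrative.
ABNORMAL_STATUS_RULES = [
    ("process", "pid", None, "process of the code under test disappeared"),
    ("popup", "title", "<Process Name>", "crash popup"),
    ("popup", "title", "Unexpected Error", "unexpected error popup"),
    ("browser", "page", "This page can't be displayed", "URL not available"),
]

def check_abnormal_status(status, process_name, rules=ABNORMAL_STATUS_RULES):
    """Return the comment of the first matching abnormal status rule, or None.

    `status` maps a target to its observed properties, e.g.
    {"process": {"pid": None}, "popup": {"title": "..."}}.
    """
    for target, prop, expected, comment in rules:
        observed = status.get(target, {}).get(prop, "absent")
        if expected is not None:
            # substitute the placeholder with the monitored process name
            expected = expected.replace("<Process Name>", process_name)
        if observed == expected:
            return comment
    return None
```

When a rule matches, the test application would then apply the user-specified action for that abnormality, such as stopping the test and persisting the recorded operations.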
  • logic is an alternative or additional processing resource to perform a particular action and/or function, etc., described herein, which includes hardware, e.g., various forms of transistor logic, application specific integrated circuits (ASICs), etc., as opposed to computer executable instructions, e.g., firmware, etc., stored in memory and executable by a processor.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

Executable code abnormality detection can include detecting user-actuatable control elements of an executable code under test, generating a first test code based on the detected user-actuatable control elements, and generating a second test code based on a script for the executable code under test, where the first and second test codes contain instructions executable by a test application to test the executable code. Executable code abnormality detection can further include executing the first test code and the second test code using the test application and, in response to detection of an abnormality by the test application, saving the instructions of the executed first test code and the executed second test code in a persistent file to allow replay of the abnormality.
PCT/CN2014/083218 2014-07-29 2014-07-29 Executable code abnormality detection Ceased WO2016015220A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/326,326 US10459830B2 (en) 2014-07-29 2014-07-29 Executable code abnormality detection
PCT/CN2014/083218 WO2016015220A1 (fr) 2014-07-29 2014-07-29 Executable code abnormality detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/083218 WO2016015220A1 (fr) 2014-07-29 2014-07-29 Executable code abnormality detection

Publications (1)

Publication Number Publication Date
WO2016015220A1 true WO2016015220A1 (fr) 2016-02-04

Family

ID=55216587

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/083218 Ceased WO2016015220A1 (fr) 2014-07-29 2014-07-29 Executable code abnormality detection

Country Status (2)

Country Link
US (1) US10459830B2 (fr)
WO (1) WO2016015220A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107908554A (zh) * 2017-11-06 2018-04-13 珠海金山网络游戏科技有限公司 Method and system for motion capture, recording, and replay of a mobile game
WO2018233037A1 (fr) * 2017-06-20 2018-12-27 平安科技(深圳)有限公司 Database integration test method and device, server, and storage medium
CN109359027A (zh) * 2018-08-15 2019-02-19 中国平安人寿保险股份有限公司 Monkey testing method and apparatus, electronic device, and computer-readable storage medium
CN111078532A (zh) * 2019-11-25 2020-04-28 北京云测信息技术有限公司 Test method, apparatus, and system for a terminal device

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10459830B2 (en) * 2014-07-29 2019-10-29 Micro Focus Llc Executable code abnormality detection
GB201804904D0 (en) * 2018-03-27 2018-05-09 Palantir Technologies Inc Code correction
CN110389878B (zh) * 2019-07-08 2023-10-27 东软集团股份有限公司 Program monitoring method and apparatus, system-on-chip, and storage medium
US11442848B1 (en) * 2020-06-18 2022-09-13 Appceler8, LLC System and method for automated patch compatibility of applications
CN112882933B (zh) * 2021-02-09 2024-08-16 京东科技信息技术有限公司 Script recording method, apparatus, device, and storage medium
EP4650967A1 (fr) * 2024-05-15 2025-11-19 Melexis Technologies SA Electronic circuit with firmware anomaly detection
CN118227511A (zh) * 2024-05-23 2024-06-21 永联科技(常熟)有限公司 Code anomaly detection method and apparatus, electronic device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1983209A (zh) * 2005-12-14 2007-06-20 中兴通讯股份有限公司 Automated software unit testing system and method
US20070234121A1 (en) * 2006-03-31 2007-10-04 Sap Ag Method and system for automated testing of a graphic-based programming tool
US20080127045A1 (en) * 2006-09-27 2008-05-29 David Pratt Multiple-developer architecture for facilitating the localization of software applications
CN103365772A (zh) * 2012-04-06 2013-10-23 株式会社日立制作所 Automatic software test evaluation apparatus and method
WO2014117320A1 (fr) * 2013-01-29 2014-08-07 Hewlett-Packard Development Company, L.P. Generating test code to test executable code

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6378088B1 (en) * 1998-07-14 2002-04-23 Discreet Logic Inc. Automated test generator
US20060265475A9 (en) * 2001-03-19 2006-11-23 Thomas Mayberry Testing web services as components
US7810071B2 (en) * 2006-07-18 2010-10-05 Sap Ag Automated error analysis
US9098635B2 (en) * 2008-06-20 2015-08-04 Cadence Design Systems, Inc. Method and system for testing and analyzing user interfaces
US8423962B2 (en) 2009-10-08 2013-04-16 International Business Machines Corporation Automated test execution plan generation
US8627295B2 (en) * 2009-10-09 2014-01-07 General Electric Company Methods and apparatus for testing user interfaces
US8904358B1 (en) * 2010-06-08 2014-12-02 Cadence Design Systems, Inc. Methods, systems, and articles of manufacture for synchronizing software verification flows
AU2012100128A4 (en) 2011-02-22 2012-03-08 Zensar Technologies Ltd A computer implemented system and method for indexing and optionally annotating use cases and generating test scenarios therefrom
US9117028B2 (en) 2011-12-15 2015-08-25 The Boeing Company Automated framework for dynamically creating test scripts for software testing
US9037913B2 (en) * 2012-04-30 2015-05-19 Microsoft Technology Licensing, Llc Dynamic event generation for user interface control
US20130339930A1 (en) * 2012-06-18 2013-12-19 South Dakota Board Of Regents Model-based test code generation for software testing
US8918760B2 (en) 2012-12-07 2014-12-23 Sugarcrm Inc. Test script generation for application image validation
US11003570B2 (en) 2014-04-30 2021-05-11 Micro Focus Llc Performing a mirror test for localization testing
US10459830B2 (en) * 2014-07-29 2019-10-29 Micro Focus Llc Executable code abnormality detection


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018233037A1 (fr) * 2017-06-20 2018-12-27 平安科技(深圳)有限公司 Database integration test method and device, server, and storage medium
CN107908554A (zh) * 2017-11-06 2018-04-13 珠海金山网络游戏科技有限公司 Method and system for motion capture, recording, and replay of a mobile game
CN109359027A (zh) * 2018-08-15 2019-02-19 中国平安人寿保险股份有限公司 Monkey testing method and apparatus, electronic device, and computer-readable storage medium
CN109359027B (zh) * 2018-08-15 2023-06-20 中国平安人寿保险股份有限公司 Monkey testing method and apparatus, electronic device, and computer-readable storage medium
CN111078532A (zh) * 2019-11-25 2020-04-28 北京云测信息技术有限公司 Test method, apparatus, and system for a terminal device

Also Published As

Publication number Publication date
US20170206155A1 (en) 2017-07-20
US10459830B2 (en) 2019-10-29

Similar Documents

Publication Publication Date Title
US10459830B2 (en) Executable code abnormality detection
US12153512B2 (en) System and method for automated intelligent mobile application testing
US20240037020A1 (en) System and Method for Automated Software Testing
AU2017258963B2 (en) Simultaneous multi-platform testing
US11693762B2 (en) User interface test coverage
US9710366B2 (en) Generating test code to test executable code
US7810070B2 (en) System and method for software testing
CN104123219B (zh) Method and device for testing software
EP3333712B1 (fr) Simultaneous multi-platform testing
CN108595343A (zh) Application testing method and apparatus
US10437717B2 (en) Defect reporting in application testing
US10719384B2 (en) Determining relationships between components in a computing environment to facilitate root-cause analysis
US10365995B2 (en) Composing future application tests including test action data
US20160275000A1 (en) System and method of automated application screen flow generation for detecting aberration in mobile application
US8578353B2 (en) Tool for analyzing siebel escripts
US8397114B2 (en) Automated regression testing intermediary
CN105653455B (zh) Program vulnerability detection method and detection system
CN107430590A (zh) Data comparison
EP3091453A1 (fr) Designing a longevity test for a smart television
US20160283355A1 (en) Identifying a configuration element value as a potential cause of a testing operation failure
US11886330B1 (en) System, method, and computer program for generating a context visualization during test automation
US12174723B2 (en) Approach for analysis of logs from a complex physical equipment
Ahmed Test automation for graphical user interfaces: A review
Gao et al. Selecting test cases by cluster analysis of GUI states

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14898660

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15326326

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14898660

Country of ref document: EP

Kind code of ref document: A1