
USRE46849E1 - Lifecycle management of automated testing - Google Patents

Lifecycle management of automated testing

Info

Publication number
USRE46849E1
Authority
US
United States
Prior art keywords
automated testing
test
reusable
scripts
test scripts
Prior art date
Legal status
Active, expires
Application number
US14/801,025
Inventor
Pandiyan Adiyapatham
Vivek Devarajan
Sfoorti Singhania
Rooth Joshwa
Mukil Krishna M.
Shashank Murthy S.N.
Shashank Shripad Welankar
Upasana Gupta
Current Assignee
Wipro Ltd
Original Assignee
Wipro Ltd
Priority date
Filing date
Publication date
Application filed by Wipro Ltd
Priority to US14/801,025
Assigned to WIPRO LIMITED. Assignors: WELANKAR, SHASHANK SHRIPAD; GUPTA, UPASANA; KRISHNA M., MUKIL; ADIYAPATHAM, PANDIYAN; DEVARAJAN, VIVEK; JOSHWA, ROOTH; MURTHY S.N., SHASHANK; SINGHANIA, SFOORTI
Application granted
Publication of USRE46849E1
Legal status: Active; adjusted expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Prevention of errors by analysis, debugging or testing of software
    • G06F 11/3668 Testing of software
    • G06F 11/3672 Test management

Abstract

Systems and methods for lifecycle management of automated testing are disclosed. In one embodiment, a method includes processing multiple manual test cases for an application under test, associating a set of reusable test scripts to the manual test cases, where the set of reusable test scripts is selected from a library of reusable test scripts, and executing the set of reusable test scripts for the application under test using an automated testing tool associated with the set of reusable test scripts.

Description

This application is a reissue application of U.S. Pat. No. 8,347,147, issued on Jan. 1, 2013, from U.S. patent application Ser. No. 12/399,982, filed on Mar. 9, 2009, all of which are incorporated herein by reference in their entirety.
FIELD OF TECHNOLOGY
Embodiments of the present invention relate to the field of automated testing. More particularly, embodiments of the present invention relate to lifecycle management of automated testing.
BACKGROUND
Software needs to be tested before it is delivered. For example, several features of an application under test (AUT) may be manually executed, and the results may be compared with their expected outcomes. Although the application can be tested manually, the recent trend is moving toward automated testing. Commonly, automated testing involves automating a manual process already in place that uses a formalized testing process. Record and playback features of conventional automated testing tools (e.g., HP QTP®, IBM RFT®, etc.) may capture and record the user's actions or manual testing steps to generate test scripts, which may be used later for automated testing of the application under test.
However, as an application changes to adopt new features or functionalities, the test scripts may need to be updated or rewritten from scratch, resulting in high maintenance costs. Beyond the maintenance of the test scripts, it has become harder to find test automation experts due to growing demand and the increasing complexity of applications in general. Furthermore, as test automation projects for an organization or company spread across different geographical locations, business units, and/or various domains, it has become even more difficult to manage the test automation projects using the conventional automated testing tools.
SUMMARY
Systems and methods of lifecycle management of automated testing are disclosed. In one aspect, a method for lifecycle management of automated testing comprises processing multiple manual test cases for an application under test, associating a set of reusable test scripts to the manual test cases, where the set of reusable test scripts is selected from a library of reusable test scripts, and executing the set of reusable test scripts for the application under test using an automated testing tool associated with the set of reusable test scripts.
In another aspect, a method for lifecycle management of automated testing comprises presenting a guideline to generate multiple manual test cases for an application under test, accessing a library of reusable test scripts for an automated testing tool to select a set of reusable test scripts which correspond to the manual test cases from the library when the automated testing tool is selected from a number of licensed automated testing tools, and setting respective parameters for the set of reusable test scripts, and executing the set of reusable test scripts for the application under test using the automated testing tool.
The methods disclosed herein may be implemented in any means for achieving various aspects, and may be executed in a form of a machine readable medium embodying a set of instructions that, when executed by a machine, cause the machine to perform any of the operations disclosed herein. Other features will be apparent from the accompanying drawings and from the detailed description that follows.
BRIEF DESCRIPTION OF THE DRAWINGS
Example embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
FIG. 1 is a block diagram which illustrates an exemplary lifecycle management of automated testing, according to one embodiment.
FIG. 2 is an exemplary graphical user interface of the requirement module of FIG. 1, according to one embodiment.
FIG. 3 is an exemplary graphical user interface for the analysis module of FIG. 1, according to one embodiment.
FIG. 4 illustrates exemplary libraries of reusable test scripts associated with the design module of FIG. 1, according to one embodiment.
FIG. 5 illustrates an exemplary process for creating new test cases, according to one embodiment.
FIG. 6 illustrates an exemplary graphical user interface for the development module of FIG. 1, according to one embodiment.
FIG. 7 illustrates an exemplary view of a return on investment (ROI) report, according to one embodiment.
FIG. 8 illustrates an exemplary lifecycle management system for automated testing, according to one embodiment.
FIG. 9 is a process flow chart of an exemplary method for lifecycle management of automated testing, according to one embodiment.
FIG. 10 is a process flow chart of another exemplary method for lifecycle management of automated testing, according to one embodiment.
Other features of the present embodiments will be apparent from the accompanying drawings and from the detailed description that follows.
DETAILED DESCRIPTION
Systems and methods of lifecycle management of automated testing are disclosed. Embodiments of the present invention include test automation lifecycle management features and a guided engineering technique at each stage of test automation using reusable test scripts. Thus, the systems and methods maximize return on investment, ensure cross project reusability, empower non-automation experts to create automation test suites, ensure uniformity in automation approach across the organization, and/or provide a set of guidelines and best practices for the test automation.
In the following detailed description of the embodiments of the invention, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
FIG. 1 illustrates an exemplary system 100 for lifecycle management of automated testing, according to one embodiment. In FIG. 1, the system 100 comprises an administration module 102, a report module 104, a project management module 106, reusable test scripts 108, a requirement module 110, an analysis module 112, a design module 114, a development module 116, and an execution module 118.
The administration module 102 may be used to perform administrative functions. For example, it can be used to maintain records for employees, control the scheduling of automated testing and resources, and/or produce management and operational reports. The report module 104 may be used to report the quality of ongoing or completed automated testing projects and/or the quality of applications under test (AUT). The project management module 106 may be used to centrally allocate available resources (e.g., experts, licensed automated testing tools, etc.) to one or more automated testing projects and manage their progress. The reusable test scripts 108 may include component codes which can be readily used or need some modifications to create an automated test scenario of multiple test cases. It is appreciated that a test case describes a test that needs to be run on the application under test to verify that the application under test runs as expected.
The system 100 also comes with in-built guidance at every stage of automation which is realized by the requirement module 110, the analysis module 112, the design module 114, the development module 116, and the execution module 118. The guideline may offer the best practices of test automation. The requirement module 110 may be used to capture various types of technical requirements needed for test automation. The analysis module 112 may be used to analyze a user's responses to the technical requirements from the requirement module 110. The design module 114 may be used to maintain reusable test scripts used for automated testing. The development module 116 may be used to build a test scenario by associating some of the reusable test scripts 108 to test cases used to build the test scenario. The execution module 118 may be used to execute the test scenario using a licensed automated testing tool (e.g., HP QTP®, IBM RFT®, etc.).
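For illustration only, the following Python sketch shows one way the stage modules described above could be wired together; all class and method names here (RequirementModule, build_scenario, and so on) are hypothetical and are not taken from the patent.

```python
# Hypothetical sketch of the system 100 composition; names and behavior are illustrative only.

class RequirementModule:
    def capture(self, answers: dict) -> dict:
        """Record the user's responses to the technical requirements."""
        return {"requirements": answers}


class AnalysisModule:
    def analyze(self, requirements: dict) -> dict:
        """Derive a design/development/execution strategy from the captured responses."""
        return {"strategy": "keyword-driven", "inputs": requirements}


class DesignModule:
    def __init__(self):
        self.library = {}  # library of reusable test scripts, keyed by keyword


class DevelopmentModule:
    def build_scenario(self, test_cases, library):
        """Associate reusable scripts from the library with manual test cases."""
        return [(case, library.get(case["keyword"])) for case in test_cases]


class ExecutionModule:
    def execute(self, scenario, tool_name):
        """Run the associated scripts with the selected licensed testing tool."""
        for case, script in scenario:
            print(f"[{tool_name}] {case['id']} -> {script}")


class LifecycleManagementSystem:
    """Ties the stage modules of FIG. 1 together; the administration, report,
    and project management modules are omitted for brevity."""

    def __init__(self):
        self.requirement = RequirementModule()
        self.analysis = AnalysisModule()
        self.design = DesignModule()
        self.development = DevelopmentModule()
        self.execution = ExecutionModule()
```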
In one embodiment, a method for lifecycle management of automated testing comprises presenting a guideline across the automated testing modules which includes the requirement module 110, the analysis module 112, the design module 114, the development module 116, and the execution module 118, to generate multiple manual test cases for an application under test. The method also comprises accessing a library of reusable test scripts for an automated testing tool to select a set of reusable test scripts which correspond to the manual test cases when the automated testing tool is selected from a number of licensed automated testing tools. The method further comprises setting respective parameters for the set of reusable test scripts, and executing the set of reusable test scripts for the application under test using the automated testing tool.
FIG. 2 is an exemplary graphical user interface (GUI) 200 of the requirement module 110 of FIG. 1, according to one embodiment. The GUI 200 may be used to capture test automation requirements of an application under test. In one exemplary implementation, the requirement module 110 may solicit a user's input to technical requirements 202 of the application under test. The technical requirements 202 may comprise an environment 204, a standard 206, a time/resource 208, an integration 210, and/or others 212.
One or more questionnaire items in the environment 204 solicit the user's responses on the environment (e.g., platform support requirements 214, OS/browser/application type requirements 216, lingual support requirements 218, etc.) in which the test is expected to run. The standard 206 solicits the user's responses on the requirements for maintenance (e.g., maintain from multiple places) and execution (e.g., execute locally or from a remote location). The time/resources 208 may solicit the user's responses on available licenses, skill levels of people, and/or organization of the test automation project. The integration 210 may solicit the user's responses on the project's integration with various entities. It is appreciated that there could be more items (e.g., others 212) on the technical requirements. As will be illustrated in detail in FIG. 5, a test case generation 220 may be used to generate new manual test cases or import from existing manual test cases.
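As a rough illustration of how the requirement categories captured by GUI 200 might be stored, here is a small Python sketch; the field names mirror FIG. 2, but the data structure itself and the sample values are assumptions.

```python
from dataclasses import dataclass, field


@dataclass
class TechnicalRequirements:
    """Illustrative container for the requirement categories of FIG. 2."""
    environment: dict = field(default_factory=dict)     # platform, OS/browser/application type, lingual support
    standard: dict = field(default_factory=dict)        # maintenance and execution requirements
    time_resources: dict = field(default_factory=dict)  # licenses, skill levels, project organization
    integration: dict = field(default_factory=dict)     # integration with other entities
    others: dict = field(default_factory=dict)


# Example of responses a user might enter through GUI 200 (values are made up):
reqs = TechnicalRequirements(
    environment={"platform": "Windows", "browser": "any", "languages": ["en"]},
    standard={"maintain_from_multiple_places": True, "remote_execution": True},
    time_resources={"licensed_tools": ["HP QTP"], "skill_level": "intermediate"},
    integration={"entities": ["defect tracker"]},
)
```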
FIG. 3 is an exemplary graphical user interface (GUI) 300 of the analysis module 112 of FIG. 1, according to one embodiment. The GUI 300 may be used to analyze the user's responses to the requirement module 110 of FIG. 1. This is done by presenting a pre-built check list of automation analysis parameters that enables creation of a suitable design, development, and execution strategy for the automation project. As illustrated in FIG. 3, the GUI 300 may present a menu which includes a general 302, analyze test cases 304, analyze integration requirements 306, analyze application 308, analyze time/resources 310, and/or analyze reporting/execution-error log requirements 312.
FIG. 4 illustrates exemplary libraries 400 of reusable test scripts associated with the design module 114 of FIG. 1, according to one embodiment. The design module 114 may be used to create the libraries where the components or reusable test scripts may be created using one or more automation tools (HP QTP®, IBM RFT®, etc.). Then, the contents of the entire libraries 400 can be exported into a single file, or the components may be exported into individual files. When the components or reusable test scripts are uploaded to the system 100 of FIG. 1, their codes may be checked for errors. In an alternative embodiment, as the code base or script comprising functions is uploaded into the system 100, the libraries or the reusability design may be created from the script.
In one exemplary implementation, the libraries 400 may be in multiple levels or layers. For example, the libraries 400 may comprise five levels: core framework functions 404, standard window functions 406, windows specific functions 408, entity functions 410, and application level functions 412. As illustrated in FIG. 4, the application level functions 412, the entity functions 410, the windows specific functions 408, the standard window functions 406, and the core framework functions 404 include reusable components or test scripts that depend on each other. Thus, manageusers 414 of the application level functions 412 includes createuser 416 of the entity functions 410. In addition, the createuser 416 of the entity functions 410 includes login 418 and user 420 of the windows specific functions 408. Then, the login 418 of the windows specific functions 408 includes login_invoke 422, login_setvalues 424, and login_accept 426. Also, the login_invoke 422, the login_setvalues 424, and the login_accept 426 of the standard window functions 406 include web_launchURL 428, web_settext 430 and 432, and web_click 434, respectively. Furthermore, the web_launchURL 428 of the standard window functions 406 depends on web_launchURL 436, the web_settext 430 and 432 depend on web_settext 438, and the web_click 434 depends on web_click 440 of the core framework functions 404.
With the layered structure of the libraries 400, the user may identify the layer or level where a modification to a particular reusable test script needs to be made. For instance, if more fundamental changes to test scripts need to be made, reusable test scripts at the core framework functions 404 may be modified. Additionally, the reusable test scripts may be divided into two sections, one that does not require any GUI application (e.g., business flow testing operations) and the other that requires one or more GUI applications (e.g., functions that interact with the GUI). This design approach may minimize changes or modifications to the reusable test scripts for automated testing projects, thus reducing the maintenance efforts and costs of the testing.
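To make the layering concrete, here is a simplified Python sketch in which each higher-level function delegates to the layer below, mirroring the manageusers, createuser, login, login_invoke, and web_launchURL chain of FIG. 4; the function bodies, parameters, and credentials are illustrative assumptions, not the patent's actual scripts.

```python
# Core framework functions (bottom layer); hypothetical stand-ins for tool-specific calls.
def web_launchurl(url):
    print(f"launch {url}")

def web_settext(field_name, value):
    print(f"set {field_name} = {value}")

def web_click(button):
    print(f"click {button}")

# Standard window functions delegate to the core framework layer.
def login_invoke(url):
    web_launchurl(url)

def login_setvalues(user, pwd):
    web_settext("username", user)
    web_settext("password", pwd)

def login_accept():
    web_click("OK")

# Window-specific function.
def login(url, user, pwd):
    login_invoke(url)
    login_setvalues(user, pwd)
    login_accept()

# Entity function.
def createuser(url, user, pwd):
    login(url, "admin", "admin")        # illustrative credentials
    web_settext("new user name", user)
    web_settext("new user password", pwd)
    web_click("create")

# Application level function.
def manageusers(url):
    createuser(url, "jdoe", "secret")   # made-up values

manageusers("http://example.test/app")
```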
FIG. 5 illustrates an exemplary process 500 for creating new test cases, according to one embodiment. In FIG. 5, a test case manager 502 may be a graphical user interface for creating one or more test cases (e.g., manual test cases). A test repository 504 may be used to store the test cases. From a drop down menu, the user may click “create new test cases” 506. Then, a dialog window may be displayed requesting the user's input, such as name 508 (e.g., “login”), ID 510 (e.g., “tc001”), details 512 (e.g., “test case to check login functionality”), expected behaviour 514 (e.g., “valid user should be able to login”), complexity 516 (e.g., “simple”), priority 518 (e.g., “medium”), and criticality 520 (e.g., “important”). Once the test case is created, it is stored in a folder of the test repository 504. In an alternative embodiment, one or more test cases may be imported from existing test cases.
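The fields collected by this dialog map naturally onto a simple record; the Python dataclass below is an assumed illustration of how such a test case might be stored in the test repository 504, populated with the example values from the text.

```python
from dataclasses import dataclass


@dataclass
class ManualTestCase:
    """Illustrative record for a manual test case created via the test case manager 502."""
    name: str
    case_id: str
    details: str
    expected_behaviour: str
    complexity: str   # e.g., "simple"
    priority: str     # e.g., "medium"
    criticality: str  # e.g., "important"


login_case = ManualTestCase(
    name="login",
    case_id="tc001",
    details="test case to check login functionality",
    expected_behaviour="valid user should be able to login",
    complexity="simple",
    priority="medium",
    criticality="important",
)
```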
FIG. 6 illustrates an exemplary graphical user interface (GUI) 600 for the development module 116 of FIG. 1, according to one embodiment. The GUI 600 is used to create a test scenario 602 based on multiple test cases 604. Each of the test cases 604 may include several test steps 606. Then each step 606 may be matched with a reusable test script. Once the process is completed, the test scenario 602 may be ready for execution.
In FIG. 6, the test scenario 602, “notepad_type_close,” includes two test cases, TC1 and TC2. TC1 is based on two test steps 606 with step numbers 608 TS1 and TS2, whereas TC2 is based on six test steps with step numbers 608 TS1-TS6. Each of the steps is described with a step description 610, and its expected result 612 is listed as well. Furthermore, a keyword 614 is used to match a particular reusable test script to a test case (e.g., or its test step). Then, param 1 616 and param 2 618 may be used to set parameter values. Additionally, an edit test step dialog 620 may be used to edit one or more of the test steps 606. In the dialog box, TS1 of TC2 is associated with the reusable test script “web_invoke” from the core framework functions of the libraries 400 using the select library 622 and select function 626 menus. Then, parameter values 624 associated with the test step may be set.
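To illustrate how a keyword 614 and its parameters could drive execution, here is a hedged Python sketch of a keyword-driven runner; the dispatch table, the web_close function, and the parameter values are assumptions rather than the patent's implementation.

```python
# Hypothetical keyword-driven execution of a test scenario such as "notepad_type_close".
# Each test step carries a keyword naming a reusable script plus up to two parameters.

def web_invoke(param1=None, param2=None):
    print(f"invoke application: {param1}")

def web_settext(param1=None, param2=None):
    print(f"type '{param2}' into {param1}")

def web_close(param1=None, param2=None):
    print(f"close {param1}")

LIBRARY = {"web_invoke": web_invoke, "web_settext": web_settext, "web_close": web_close}

scenario = [  # (test case, step number, keyword, param 1, param 2) -- illustrative values
    ("TC1", "TS1", "web_invoke", "notepad.exe", None),
    ("TC1", "TS2", "web_settext", "editor", "hello"),
    ("TC2", "TS1", "web_invoke", "notepad.exe", None),
    ("TC2", "TS2", "web_close", "notepad", None),
]

for case, step, keyword, p1, p2 in scenario:
    LIBRARY[keyword](p1, p2)  # match the step's keyword to a reusable script and run it
```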
FIG. 7 illustrates an exemplary view of a return on investment (ROI) report 714, according to one embodiment. In FIG. 7, an ROI report request 702 may be used to process a user's request for the ROI report 714. For the request 702, the user is requested to fill out the user's organization unit 704, manual effort billing rate 706 associated with the test automation project, start date 708, end date 710, and e-mail address 712. Then, the ROI report 714 is forwarded to the user in response to the ROI report request 702 to inform the user about the usage of the automated testing scripts using different automated testing tools, where the report includes a project name 716, month 718, effort saved in hours 720, manual testing rate 722, amount saved 724, amount invested 726, and return on investment in percentage 728.
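The report columns suggest a straightforward calculation; the formula below is an assumption (the patent does not spell one out) and is shown only to illustrate how the listed fields could relate.

```python
def roi_report_row(project, month, effort_saved_hours, manual_rate, amount_invested):
    """Illustrative ROI row: amount saved = effort saved x manual billing rate,
    ROI % = (amount saved - amount invested) / amount invested x 100 (assumed formula)."""
    amount_saved = effort_saved_hours * manual_rate
    roi_percent = (amount_saved - amount_invested) / amount_invested * 100
    return {
        "project name": project,
        "month": month,
        "effort saved in hours": effort_saved_hours,
        "manual testing rate": manual_rate,
        "amount saved": amount_saved,
        "amount invested": amount_invested,
        "return on investment (%)": round(roi_percent, 1),
    }


print(roi_report_row("demo project", "2015-07", 120, 50, 4000))  # made-up numbers
```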
FIG. 8 illustrates an exemplary lifecycle management system 800 for automated testing, according to one embodiment. The lifecycle management system 800 allows the distribution of tasks on the basis of expertise of its people 806. Thus, design of the automation workflow (e.g., via activities 808 which include study & analyze requirements 814, architect/design solution 818, script functions 822, develop test cases 830, maintain & execute 834, and report 840) may be assigned more methodically.
For example, a chief information officer (CIO) 802 may appoint managers 804 (e.g., automation manager 1—division A, automation manager 2—division B, automation manager 3—division C) for multiple test automation tasks. Each manager then assigns tasks for the project to one or more of the people 806 according to their expertise. In FIG. 8, the task of the study & analyze requirements 814 is assigned to an automation consultant 816 (e.g., using the requirement module 110 and the analysis module 112 of FIG. 1), the architect/design solution 818 to an automation architect 820 (e.g., using the design module 114), and the script functions 822 to automation engineers 824 (e.g., using the design module 114 & tools 810). Once the script functions 822 are performed using the tools 810, code 812 may be generated. As illustrated in FIG. 8, scripts/test cases/frameworks 828A may be generated by processing the script functions 822 using automation scripting tool 1 826A, scripts/test cases/frameworks 828B using automation scripting tool 2 826B, and scripts/test cases/frameworks 828C using automation scripting tool 3 826C.
Then, the task of the develop test cases 830 is assigned to a manual tester/business user 832, and a manual tester 836 may be assigned to perform the task of the maintain & execute 834 using an automation scripting tool 838. The same tester 836 or another tester 842 may be assigned to perform a task of the report 840. Thus, the lifecycle management system 800 may define ongoing automation projects in a single console in a hierarchical tree view. In addition, it may provide a standard list of automation tasks that can be assigned, tracked, or reassigned by project managers for single or multiple projects. Furthermore, the lifecycle management system 800 may encompass a facility to assign specific users to certain tasks, and define their role requirements and privileges. Moreover, the lifecycle management system 800 may have in-built reports for test automation status and application quality reports which can be viewed from any location.
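As an illustration of the hierarchical, console-style view of projects and role-based task assignment described above, the following Python sketch builds and prints a small project tree; the tree shape and role names echo FIG. 8, but the structure itself is an assumption.

```python
# Hypothetical hierarchical view of automation projects and task assignments (cf. FIG. 8).
org = {
    "CIO": {
        "automation manager 1 - division A": {
            "study & analyze requirements": "automation consultant",
            "architect/design solution": "automation architect",
            "script functions": "automation engineers",
            "develop test cases": "manual tester/business user",
            "maintain & execute": "manual tester",
            "report": "manual tester",
        },
        "automation manager 2 - division B": {},
        "automation manager 3 - division C": {},
    }
}


def print_tree(node, indent=0):
    """Render the project/task hierarchy as a simple console tree."""
    for key, value in node.items():
        print("  " * indent + str(key))
        if isinstance(value, dict):
            print_tree(value, indent + 1)
        else:
            print("  " * (indent + 1) + f"assigned to: {value}")


print_tree(org)
```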
FIG. 9 is a process flow chart 900 of an exemplary method for lifecycle management of automated testing, according to one embodiment. In operation 902, multiple manual test cases for an application under test are processed. In one embodiment, a guideline for generating the multiple test cases may be presented during the operation, where the guideline may include a list of technical requirements for testing the application under test and a list of questionnaires for analyzing a user response to the list of technical requirements. The manual test cases may be imported from existing manual test cases, or they may be newly created.
In operation 904, a set of reusable test scripts is associated to the manual test cases, where the set of reusable test scripts is selected from a library of reusable test scripts. In one embodiment, the library of reusable test scripts for the automated testing tool may be accessed when the automated testing tool is selected from a number of licensed automated testing tools. In addition, respective parameters for the set of reusable test scripts may be set. Furthermore, the library of reusable test scripts may be in multiple levels for easy maintenance of the reusable test scripts, where the multiple levels may include a field level, a window level, a module level, and an application functionality level.
In operation 906, the set of reusable test scripts (e.g., or a test scenario) for the application under test is executed using an automated testing tool associated with the set of reusable test scripts, if the set of reusable test scripts at the field level is generated using the automated testing tool.
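Pulling operations 902, 904, and 906 together, an end-to-end sketch might look like the following; the function names, the single-entry library, and the tool selection are illustrative assumptions rather than the patented implementation.

```python
def process_manual_test_cases(imported=None):
    """Operation 902: collect newly created or imported manual test cases (sample data)."""
    return imported or [{"id": "tc001", "keyword": "web_invoke", "params": ["notepad.exe"]}]


def associate_reusable_scripts(test_cases, library):
    """Operation 904: select matching reusable scripts from the library and set parameters."""
    return [(case, library[case["keyword"]], case["params"]) for case in test_cases]


def execute_scenario(scenario, tool="licensed automated testing tool"):
    """Operation 906: execute the associated scripts with the selected tool."""
    for case, script, params in scenario:
        print(f"[{tool}] {case['id']}: ", end="")
        script(*params)


library = {"web_invoke": lambda app: print(f"invoke {app}")}  # stand-in reusable script
execute_scenario(associate_reusable_scripts(process_manual_test_cases(), library))
```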
It is appreciated that the methods disclosed herein may be executed in a form of a machine readable medium embodying a set of instructions that, when executed by a machine, cause the machine to perform any of the operations disclosed herein.
FIG. 10 is a process flow chart 1000 of another exemplary method for lifecycle management of automated testing, according to one embodiment. In operation 1002, a guideline is presented to generate multiple manual test cases for an application under test. In operation 1004, a library of reusable test scripts for an automated testing tool is accessed to select a set of reusable test scripts which correspond to the manual test cases when the automated testing tool is selected from a number of licensed automated testing tools. In operation 1006, respective parameters for the set of reusable test scripts are set. In operation 1008, the set of reusable test scripts is executed for the application under test using the automated testing tool.
It is appreciated that the methods disclosed herein may be executed in a form of a machine readable medium embodying a set of instructions that, when executed by a machine, cause the machine to perform any of the operations disclosed herein.
Although the present embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the various embodiments. For example, the various devices, modules, analyzers, generators, etc. described herein may be enabled and operated using hardware circuitry (e.g., CMOS based logic circuitry), firmware, software and/or any combination of hardware, firmware, and/or software (e.g., embodied in a machine readable medium). For example, the various electrical structures and methods may be embodied using transistors, logic gates, and electrical circuits (e.g., application specific integrated circuit (ASIC) circuitry).

Claims (25)

What is claimed is:
1. A method for lifecycle management of automated testing, comprising:
processing a plurality of manual test cases for an application under test;
associating a set of reusable test scripts to the plurality of manual test cases, wherein the set of reusable test scripts is selected from a library of reusable test scripts, wherein the library of reusable test scripts is accessed for an automated testing tool when the automated testing tool is selected from a number of licensed automated testing tools;
executing the set of reusable test scripts for the application under test using the automated testing tool associated with the set of reusable test scripts;
displaying automated testing projects which include the automated testing of the application under test; and
displaying a return on investment (ROI) for each of the automated testing projects.
2. The method of claim 1, wherein the processing the plurality of manual test cases comprises presenting a guideline for generating the plurality of manual test cases.
3. The method of claim 2, wherein the guideline comprises a list of technical requirements for testing the application under test.
4. The method of claim 3, wherein the guideline comprises a list of questionnaires for analyzing a user response to the list of technical requirements.
5. The method of claim 1, wherein the associating the set of reusable test scripts further comprises setting respective parameters for the set of reusable test scripts.
6. The method of claim 1, wherein at least one of the plurality of manual test cases is imported from existing manual test cases.
7. The method of claim 1, wherein at least one of the plurality of manual test cases is newly created.
8. The method of claim 1, wherein the automated testing projects which include the automated testing of the application under test are displayed in a hierarchical tree view.
9. The method of claim 8, wherein the lifecycle management of the automated testing comprises displaying a standard list of automation tasks controlled by at least one project manager for the automated testing projects.
10. The method of claim 8, wherein the lifecycle management of the automated testing comprises defining specific users and their role requirements and privileges for the automated testing projects.
11. The method of claim 8, wherein the lifecycle management of the automated testing comprises forwarding a status report for each automated testing project and a quality report for each application under test.
12. A method for lifecycle management of automated testing, comprising:
processing a plurality of manual test cases for an application under test;
associating a set of reusable test scripts to the plurality of manual test cases, wherein the set of reusable test scripts is selected from a library of reusable test scripts, wherein the library of reusable test scripts is in multiple levels for easy maintenance of the reusable test scripts; and
executing the set of reusable test scripts for the application under test using an automated testing tool associated with the set of reusable test scripts.
13. The method of claim 12, wherein the multiple levels comprise a field level, a window level, a module level, and an application functionality level.
14. The method of claim 13, wherein each reusable test script at the field level is generated using the automated testing tool.
15. A method for lifecycle management of automated testing, comprising:
processing a plurality of manual test cases for an application under test;
associating a set of reusable test scripts to the plurality of manual test cases, wherein the set of reusable test scripts is selected from a library of reusable test scripts;
executing the set of reusable test scripts for the application under test using an automated testing tool associated with the set of reusable test scripts;
displaying automated testing projects which include the automated testing of the application under test in a hierarchical tree view; and
displaying a return on investment (ROI) for each automated testing project.
16. A method for lifecycle management of automated testing, comprising:
presenting a guideline to generate a plurality of manual test cases for an application under test;
accessing a library of reusable test scripts for an automated testing tool to select a set of reusable test scripts which correspond to the plurality of manual test cases when the automated testing tool is selected from a number of licensed automated testing tools, wherein the library of reusable test scripts is in multiple levels for easy maintenance of the reusable test scripts;
setting respective parameters for the set of reusable test scripts; and
executing the set of reusable test scripts for the application under test using the automated testing tool.
17. The method of claim 16, wherein at least one of the plurality of manual test cases is newly created or imported from existing manual test cases.
18. A non-transitory computer readable medium for lifecycle management of automated testing having instructions that, when executed by a computer, cause the computer to perform a method comprising:
processing a plurality of manual test cases for an application under test;
associating a set of reusable test scripts to the plurality of manual test cases, wherein the set of reusable test scripts is selected from a library of reusable test scripts, wherein the library of reusable test scripts is in multiple levels for easy maintenance of the reusable test scripts; and
executing the set of reusable test scripts for the application under test using the automated testing tool associated with the set of reusable test scripts.
19. A system embodied on a non-transitory computer-readable storage medium for lifecycle management of automated testing, the system comprising:
an administration module for performing administrative functions;
a report module for reporting on quality of automated testing projects and application under tests (AUT);
a project management module for centrally allocating available resources to the automated testing projects and managing progress of the automated testing projects;
a set of reusable test scripts comprising component codes used to create an automated test scenario of a plurality of manual test cases, wherein the set of reusable test scripts is selected from a library of reusable test scripts, and wherein the library of reusable test scripts is in multiple levels for easy maintenance of the reusable test scripts;
a requirement module for capturing technical requirements needed for the automated testing;
an analysis module for analyzing a user's responses to the technical requirements from the requirement module;
a design module for maintaining the set of reusable test scripts used for the automated testing;
a development module for building a test scenario for an application under test by associating the set of reusable test scripts to the plurality of manual test cases; and
an execution module for executing the test scenario for the application under test using a licensed automated testing tool associated with the set of reusable test scripts.
20. The method of claim 1, wherein displaying a return on investment (ROI) for each of the automated testing projects comprises:
displaying the return on investment (ROI) for a plurality of the automated testing projects in a single report.
21. The method of claim 20, wherein the return on investment (ROI) for the plurality of the automated testing projects are displayed via a graphical user interface on a display screen.
22. The method of claim 21, wherein the plurality of manual test cases for the application under test are determined using a graphical user interface user input.
23. The method of claim 15, wherein displaying a return on investment (ROI) for each of the automated testing projects comprises:
displaying the return on investment (ROI) for a plurality of the automated testing projects in a single report.
24. The method of claim 23, wherein the return on investment (ROI) for the plurality of the automated testing projects are displayed via a graphical user interface on a display screen.
25. The method of claim 24, wherein the plurality of manual test cases for the application under test are determined using a graphical user interface user input.
US 14/801,025 | Priority date 2009-03-09 | Filing date 2015-07-16 | Lifecycle management of automated testing | Active (anticipated expiration 2030-09-22) | USRE46849E1 (en)

Priority Applications (1)

Application Number | Publication | Priority Date | Filing Date | Title
US 14/801,025 | USRE46849E1 (en) | 2009-03-09 | 2015-07-16 | Lifecycle management of automated testing

Applications Claiming Priority (2)

Application Number | Publication | Priority Date | Filing Date | Title
US 12/399,982 | US8347147B2 (en) | 2009-03-09 | 2009-03-09 | Lifecycle management of automated testing
US 14/801,025 | USRE46849E1 (en) | 2009-03-09 | 2015-07-16 | Lifecycle management of automated testing

Related Parent Applications (1)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
US 12/399,982 | Reissue | US8347147B2 (en) | 2009-03-09 | 2009-03-09 | Lifecycle management of automated testing

Publications (1)

Publication Number | Publication Date
USRE46849E1 | 2018-05-15

Family

ID=42679371

Family Applications (2)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US 12/399,982 | Ceased | US8347147B2 (en) | 2009-03-09 | 2009-03-09 | Lifecycle management of automated testing
US 14/801,025 | Active, expires 2030-09-22 | USRE46849E1 (en) | 2009-03-09 | 2015-07-16 | Lifecycle management of automated testing

Family Applications Before (1)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US 12/399,982 | Ceased | US8347147B2 (en) | 2009-03-09 | 2009-03-09 | Lifecycle management of automated testing

Country Status (1)

Country Link
US (2) US8347147B2 (en)

Families Citing this family (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120005055A1 (en) * 2010-06-30 2012-01-05 International Business Machines Corporation Dynamic computation of roi for test automation
US9519476B2 (en) 2011-03-29 2016-12-13 Hewlett Packard Enterprise Development Lp Methods, apparatus and articles of manufacture to autocomplete application lifecycle management entities
US8719795B2 (en) * 2011-04-12 2014-05-06 Miami International Security Exchange, Llc System and method for automating testing of computers
US9448915B2 (en) * 2011-04-13 2016-09-20 Accenture Global Services Limited Modular script designer for next generation testing system
US9183124B2 (en) * 2011-04-18 2015-11-10 Accenture Global Services Limited Automation controller for next generation testing system
US9104811B1 (en) * 2011-05-08 2015-08-11 Panaya Ltd. Utilizing testing data collected from different organizations to generate test scenario templates that suit a user profile
US9038026B2 (en) * 2011-10-17 2015-05-19 International Business Machines Corporation System and method for automating test automation
CN103106138B * 2011-11-15 2016-03-09 阿里巴巴集团控股有限公司 Method and device for synchronizing test cases with test scripts
TW201322125A (en) * 2011-11-18 2013-06-01 Primax Electronics Ltd Automatic testing method
CN103136077A (en) * 2011-11-25 2013-06-05 致伸科技股份有限公司 Automatic test method
EP2738672B1 (en) * 2012-11-30 2016-09-14 Accenture Global Services Limited Communications network, computer architecture, computer-implemented method and computer program product for development and management of femtocell-based applications
GB2513404A (en) * 2013-04-26 2014-10-29 Ibm Generating test scripts through application integration
CN104123219B (en) 2013-04-28 2017-05-24 国际商业机器公司 Method and device for testing software
US9785543B2 (en) * 2013-10-10 2017-10-10 Oracle International Corporation Dual tagging between test and pods
WO2015065364A1 (en) * 2013-10-30 2015-05-07 Hewlett-Packard Development Company, L.P. Recording an application test
US20150135164A1 (en) * 2013-11-08 2015-05-14 Halliburton Energy Services, Inc. Integrated Software Testing Management
CN104850490B (en) 2014-02-18 2017-11-24 国际商业机器公司 Method and system for software test
US9753842B2 (en) * 2014-05-09 2017-09-05 Wipro Limited System and method for creating universal test script for testing variants of software application
EP3021225B1 (en) * 2014-11-14 2020-07-01 Mastercard International, Inc. Automated configuration code based selection of test cases for payment terminals
CN104461908B * 2014-12-31 2017-04-05 中国科学院软件研究所 A regression test case reuse method based on software combination testing
CN104657141B * 2015-02-12 2017-09-26 金光 A GUI software wrapper based on computer vision and its implementation
US9892015B2 (en) 2015-03-16 2018-02-13 Microsoft Technology Licensing, Llc Integration and automation of build and test in a customized computing system
US9858090B2 (en) * 2015-06-02 2018-01-02 International Business Machines Corporation Generating customized on-demand videos from automated test scripts
IN2015CH04673A (en) * 2015-09-03 2015-09-11 Wipro Ltd
US10289534B1 (en) * 2015-10-29 2019-05-14 Amdocs Development Limited System, method, and computer program for efficiently automating business flow testing
US10095482B2 (en) * 2015-11-18 2018-10-09 Mastercard International Incorporated Systems, methods, and media for graphical task creation
CN105930260B * 2015-12-23 2018-12-28 中国银联股份有限公司 A system availability test method and device
US10229038B2 (en) 2016-03-15 2019-03-12 International Business Machines Corporation Generating reusable testing procedures
US10672013B2 (en) * 2016-07-14 2020-06-02 Accenture Global Solutions Limited Product test orchestration
GB2553896B (en) 2016-07-14 2019-09-25 Accenture Global Solutions Ltd Product test orchestration
US10776251B1 (en) * 2016-07-22 2020-09-15 Amdocs Development Limited System, method, and computer program for automatically converting manual test cases to automated test structures in a software testing project
US10613966B2 (en) * 2017-02-09 2020-04-07 Wipro Limited Method of controlling automation of testing applications and a system therefor
US10204033B2 (en) 2017-03-11 2019-02-12 Wipro Limited Method and system for semantic test suite reduction
CN107301130A * 2017-06-28 2017-10-27 歌尔科技有限公司 A testing tool for VR all-in-one devices and its method
CN108287788A * 2017-12-26 2018-07-17 广东睿江云计算股份有限公司 A use-case step matching method and system based on test cases
US11383377B2 (en) * 2018-10-09 2022-07-12 Jpmorgan Chase Bank, N.A. System and method for bot automation lifecycle management
CN111309586B (en) * 2018-12-12 2024-07-19 迈普通信技术股份有限公司 Command testing method and device and storage medium thereof
US11074162B2 (en) * 2019-04-15 2021-07-27 Cognizant Technology Solutions India Pvt. Ltd. System and a method for automated script generation for application testing
US11113762B2 (en) * 2019-10-25 2021-09-07 Raisin Technology Europe, S.L. System and method for creating on-demand user-customized deposit strategies using data extracted from one or more independent systems
CN110795000B * 2019-10-28 2021-03-12 珠海格力电器股份有限公司 Automatic control method and device based on interface segmentation, and terminal
CN113032240A (en) * 2019-12-09 2021-06-25 中车时代电动汽车股份有限公司 Software test automation management system and method
US11392485B2 (en) 2020-06-01 2022-07-19 Cerner Innovation, Inc. Auto test generator
CN113656309A * 2021-08-19 2021-11-16 蔚来汽车科技(安徽)有限公司 Test case life cycle iteration management method, system, and medium
CN114116521B (en) * 2021-12-17 2025-06-06 北京数码大方科技股份有限公司 Automated testing method and device for management software
US20240330157A1 (en) * 2023-03-31 2024-10-03 Infosys Limited Method and system for calculation of network test automation feasibility and maturity indices

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6694509B1 (en) * 1999-12-28 2004-02-17 Ge Medical Systems Global Technology Company Llc Automated regression testing of workstation software
US20080222608A1 (en) * 2001-10-05 2008-09-11 Jason Michael Gartner Method and system for managing software testing
US20040034543A1 (en) * 2002-01-15 2004-02-19 Koninklijke Ahold Nv Methodology to design, construct, and implement human resources business procedures and processes
US7296188B2 (en) * 2002-07-11 2007-11-13 International Business Machines Corporation Formal test case definitions
US20040107415A1 (en) * 2002-12-03 2004-06-03 Konstantin Melamed Web-interactive software testing management method and computer system including an integrated test case authoring tool
US20050166094A1 (en) * 2003-11-04 2005-07-28 Blackwell Barry M. Testing tool comprising an automated multidimensional traceability matrix for implementing and validating complex software systems
US20060253742A1 (en) * 2004-07-16 2006-11-09 International Business Machines Corporation Automating modular manual tests including framework for test automation
US8615738B2 (en) * 2004-07-16 2013-12-24 International Business Machines Corporation System and method for software product test modularization
US20060123389A1 (en) * 2004-11-18 2006-06-08 Kolawa Adam K System and method for global group reporting
US7493521B1 (en) * 2005-06-23 2009-02-17 Netapp, Inc. Apparatus and method for estimating the testing proficiency of a software test according to EMS messages extracted from a code base
US20070006038A1 (en) * 2005-06-29 2007-01-04 Zhengrong Zhou Methods and apparatus using a hierarchical test development tree to specify devices and their test setups
US7895565B1 (en) * 2006-03-15 2011-02-22 Jp Morgan Chase Bank, N.A. Integrated system and method for validating the functionality and performance of software applications
US20080222454A1 (en) * 2007-03-08 2008-09-11 Tim Kelso Program test system
US20090249297A1 (en) * 2008-03-25 2009-10-01 Lehman Brothers Inc. Method and System for Automated Testing of Computer Applications

Also Published As

Publication number Publication date
US20100229155A1 (en) 2010-09-09
US8347147B2 (en) 2013-01-01

Similar Documents

Publication Publication Date Title
USRE46849E1 (en) Lifecycle management of automated testing
US11126543B2 (en) Software test automation system and method
Alegroth et al. JAutomate: A tool for system- and acceptance-test automation
US8677320B2 (en) Software testing supporting high reuse of test data
US11138097B2 (en) Automated web testing framework for generating and maintaining test scripts
US20100180260A1 (en) Method and system for performing an automated quality assurance testing
US20090193389A1 (en) Realtime creation of datasets in model based testing
US20120254829A1 (en) Method and system to produce secure software applications
US10380526B2 (en) System and method for providing a process player for use with a business process design environment
Strauch et al. Decision support for the migration of the application database layer to the cloud
EP2913757A1 (en) Method, system, and computer software product for test automation
Grieskamp et al. Model-based quality assurance of Windows protocol documentation
Cai et al. Analysis for cloud testing of web application
Uğur-Tuncer et al. Intelligent test automation for improved software quality assurance
Atar Hands-on test management with Jira: end-to-end test management with Zephyr, synapseRT, and Jenkins in Jira
CN116089270A (en) A layered testing system, method and medium
Mathrani Quality assurance strategy for distributed software development using managed test lab model
Scherma Design and implementation of an integrated DevOps framework for Digital Twins as a Service software platform
Zielińska Framework for Extensible Application Testing
Dobrzyński et al. Tracing project development in Scrum model
Jibin et al. Integrate Cucumber with Selenium WebDriver for website automation
KK Test Management Using Azure DevOps
US20150227860A1 (en) Defect turnaround time analytics engine
CN119557209A (en) Integrated method for interface and interface automation testing based on deep learning
Board Standard glossary of terms used in Software Testing

Legal Events

Date Code Title Description
AS Assignment

Owner name: WIPRO LIMITED, INDIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ADIYAPATHAM, PANDIYAN;DEVARAJAN, VIVEK;SINGHANIA, SFOORTI;AND OTHERS;SIGNING DATES FROM 20090205 TO 20150317;REEL/FRAME:045171/0469

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12