US20240420090A1 - System and method for ai-based matching of employment candidates - Google Patents
- Publication number
- US20240420090A1 (application US 18/210,273)
- Authority
- US
- United States
- Prior art keywords
- employment
- data
- candidate
- node
- processor
- Prior art date
- Legal status
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/105—Human resources
- G06Q10/1053—Employment or hiring
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q10/06311—Scheduling, planning or task assignment for a person or group
- G06Q10/063112—Skill-based matching of a person or a group to a task
Definitions
- the present disclosure generally relates to determination and placement of qualified employment candidates, and more particularly, to an AI-based automated system for real-time matching of candidates with employment requirements based on predictive analytics of job candidate-related historical heuristic data.
- the process of finding and hiring employees to fill an open position can be inefficient for the employer. This process becomes even more inefficient when emergency and medical personnel need to be deployed to the site of an emergency situation such as a disaster recovery.
- Employers are often tasked with finding qualified personnel to fill a temporary, contract, and/or full-time position(s) using online job boards, or more traditional means.
- various tasks exist, including gathering resumes, vetting candidates for relevant skills and experience, and determining whether an applicant holds the appropriate licenses.
- One embodiment of the present disclosure provides a system for automated matching of employment candidates to an employer, including a processor of a matching server node configured to host a machine learning (ML) module and a memory on which are stored machine-readable instructions that, when executed by the processor, cause the processor to: receive employment request data from the employer entity node; parse the employment request data to derive a plurality of features; query a local candidates database to retrieve local historical candidate-related data collected at a location of previous employment based on the plurality of features; generate at least one feature vector based on the plurality of features and the historical candidate-related data; and provide the at least one feature vector to the ML module for generating a predictive model configured to produce at least one employment parameter for generation of an employment-related notification to the at least one candidate entity node.
- Another embodiment of the present disclosure provides a method that includes one or more of: receiving employment request data from the employer entity node; parsing the employment request data to derive a plurality of features; querying a local candidates database to retrieve local historical candidate-related data collected at a location of previous employment based on the plurality of features; generating at least one feature vector based on the plurality of features and the historical candidate-related data; and providing the at least one feature vector to the ML module for generating a predictive model configured to produce at least one employment parameter for generation of an employment-related notification to the at least one candidate entity node.
- Another embodiment of the present disclosure provides a computer-readable medium including instructions for receiving employment request data from the employer entity node; parsing the employment request data to derive a plurality of features; querying a matched candidates database to retrieve historical candidate-related data collected from previous job engagements based on the plurality of features; generating at least one feature vector based on the plurality of features and the historical candidate-related data; and providing the at least one feature vector to the ML module for generating a predictive model configured to produce at least one job engagement parameter for generation of an employment-related notification to the at least one candidate entity node.
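- For illustration only, the following Python sketch outlines the claimed pipeline (receive employment request data, parse features, query a local candidates database, build a feature vector, and hand it to an ML module). All class and function names (e.g., EmploymentRequest, LocalCandidatesDB, MLModule) are hypothetical stand-ins and are not part of the disclosure; the ML module is reduced to a trivial placeholder.

```python
# Illustrative sketch only; names and structures are hypothetical and are not
# part of the claimed system. The database and ML module are trivial stand-ins.
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class EmploymentRequest:
    employer_id: str
    description: str
    hashtags: List[str] = field(default_factory=list)
    offered_pay: Optional[float] = None  # None = employer left the pay blank


def parse_features(request: EmploymentRequest) -> Dict[str, float]:
    """Parse the employment request data to derive a plurality of features."""
    features = {f"tag:{tag.lower()}": 1.0 for tag in request.hashtags}
    if request.offered_pay is not None:
        features["offered_pay"] = request.offered_pay
    return features


class LocalCandidatesDB:
    """Stand-in for the local candidates database of historical data."""

    def __init__(self, records: List[Dict]):
        self._records = records

    def query(self, features: Dict[str, float]) -> List[Dict]:
        wanted = {k for k in features if k.startswith("tag:")}
        return [r for r in self._records if wanted & r["tags"]]


def build_feature_vector(features: Dict[str, float], history: List[Dict]) -> List[float]:
    """Combine request features with retrieved historical candidate-related data."""
    avg_success = sum(r.get("success_score", 0.0) for r in history) / max(len(history), 1)
    return [float(len(features)), float(len(history)), avg_success]


class MLModule:
    """Trivial stand-in for the AI/ML module producing employment parameters."""

    def predict_employment_parameters(self, vector: List[float]) -> Dict[str, float]:
        # A real module would apply a trained predictive model here.
        return {"match_confidence": min(1.0, 0.2 * vector[1] + 0.5 * vector[2])}


if __name__ == "__main__":
    db = LocalCandidatesDB([
        {"candidate": "c-1", "tags": {"tag:#spanish", "tag:#ivinfusion"}, "success_score": 0.9},
        {"candidate": "c-2", "tags": {"tag:#homecare"}, "success_score": 0.7},
    ])
    request = EmploymentRequest("emp-1", "Home infusion nurse", ["#Spanish", "#IVinfusion"])
    feats = parse_features(request)
    vector = build_feature_vector(feats, db.query(feats))
    print(MLModule().predict_employment_parameters(vector))
```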
- drawings may contain text or captions that may explain certain embodiments of the present disclosure. This text is included for illustrative, non-limiting, explanatory purposes of certain embodiments detailed in the present disclosure.
- FIG. 1 A illustrates a network diagram of a system for an AI-based automated matching of employment candidates to an employment request and providing notifications to the selected employment candidates, consistent with the present disclosure
- FIG. 1 B illustrates a network diagram of a system for an AI-based automated matching of employment candidates to an employment request and providing notifications to the selected employment candidates and receiving employment consensus over a blockchain consistent with the present disclosure
- FIG. 2 illustrates a network diagram of a system including detailed features of a matching server (MS) node consistent with the present disclosure
- FIG. 3 A illustrates a flowchart of a method for an AI-based automated matching of employment candidates to an employment request consistent with the present disclosure
- FIG. 3 B illustrates a further flow chart of a method for the automated matching of the employment candidates to the employment request consistent with the present disclosure
- FIG. 4 illustrates deployment of a machine learning model for prediction of employment parameters using blockchain assets consistent with the present disclosure
- FIG. 5 illustrates a block diagram of a system including a computing device for performing the method of FIGS. 3 A and 3 B .
- any embodiment may incorporate only one or a plurality of the above-disclosed aspects of the disclosure and may further incorporate only one or a plurality of the above-disclosed features.
- any embodiment discussed and identified as being “preferred” is considered to be part of a best mode contemplated for carrying out the embodiments of the present disclosure.
- Other embodiments also may be discussed for additional illustrative purposes in providing a full and enabling disclosure.
- many embodiments, such as adaptations, variations, modifications, and equivalent arrangements, will be implicitly disclosed by the embodiments described herein and fall within the scope of the present disclosure.
- any sequence(s) and/or temporal order of steps of various processes or methods that are described herein are illustrative and not restrictive. Accordingly, it should be understood that, although steps of various processes or methods may be shown and described as being in a sequence or temporal order, the steps of any such processes or methods are not limited to being carried out in any particular sequence or order, absent an indication otherwise. Indeed, the steps in such processes or methods generally may be carried out in various different sequences and orders while still falling within the scope of the present invention. Accordingly, it is intended that the scope of patent protection is to be defined by the issued claim(s) rather than the description set forth herein.
- the present disclosure includes many aspects and features. Moreover, while many aspects and features relate to, and are described in, the context of processing job applicants, embodiments of the present disclosure are not limited to use only in this context.
- the present disclosure provides a system, method and computer-readable medium for an AI-based automated matching of employment candidates to an employment request and providing notifications to the selected employment candidates.
- the system provides for AI and machine learning (ML)-generated list of employment parameters to be used for analysis and generation of employment-related notifications.
- an automated decision model may be generated to provide for employment parameters associated with an employment candidate's current status and past employment-related behavior based on the candidate's current qualifications (licenses, diplomas, certificates, etc.) and/or special knowledge (e.g., Spanish, sign language, crisis management, risk assessment, DOT examination, etc.) or skills (e.g., joint injection, infusion therapy, translation, suturing, etc.), previous employment, employees' feedback, reports, reviews, social media accounts, etc.
- the automated notification decision model may use historical employment candidates' data collected at the current locations (i.e., a hospital or other emergency jobs site) or work setting, and at other medical/emergency facilities of the same type located within a certain range from the current location or even located globally.
- the relevant employment candidate's data may include data related to other employment candidates having the same parameters such as specialty, age, race, gender, preferred employment conditions or locations, etc.
- the AI/ML technology may be combined with blockchain technology for secure use of the employment (job engagement) candidate-related data.
- the disclosed embodiment may produce a detailed safety or success-rate score reflecting the likelihood of successful employment for a given employment candidate based on the candidate's collected behavioral data. This allows for direct reporting on a trust level of the given employment candidate to the hiring authorities/employing entities (i.e., physicians, hospitals, clinics, emergency services, other patient care organizations, etc.).
- the employing entities may be connected to the matching server (MS) node over a blockchain network to achieve a consensus prior to executing a transaction to release the employment candidate or the employment conditions for the candidate based on the employment parameters.
- the employment parameters may be defined by hashtags.
- the system utilizes descriptive hashtags to aid in matching candidates with job opportunities in which they are interested and for which they qualify, based on being on-boarded to the system via a blockchain network.
- the disclosed system relates to a SaaS platform that matches potential employees with employers.
- the platform may employ a bidding system that incorporates the use of hashtags by employers and potential employees.
- Employers and potential employees can use four distinct hashtag categories: language, special skill, special knowledge, and “other.”
- employers can post jobs or generate employment requests using the unique hashtags and set the amount of pay for each job.
- Potential employees may be able to set up profiles using the unique hashtags and the price they expect to be paid for completing jobs.
- the platform matches potential employees to employers based on the hashtag and pay parameters that are processed through an AI machine-learning module that may automatically choose the candidate that best fits the employment requirements.
- the platform also allows for employers to leave the pay for each job blank which allows potential employees to bid on jobs.
- Employers can receive AI-generated recommendations for candidates based on their hashtags and bid amounts.
- the matching server may automatically choose the candidate(s) that best fits the employment parameters derived from the initial employment requirements.
- the disclosed embodiments provide an online peer-to-peer job-to-employee matching platform that allows employers to be in direct contact with potential employees, utilizing a system of hashtags that represent skills, language, special knowledge and “other” to allow the employers to quickly and accurately vet potential employees without the need to read resumes or negotiate pay.
- One embodiment is directed to a technology-driven bid/counterbid marketplace process where employers of any size may efficiently and quickly get directly matched with one or any number of vetted, on-demand independent contractors (skilled labor) for a gig job or assignment which requires unique or specific skills, knowledge or level of experience.
- an employer and contractor may select from the pre-populated approved list of descriptive hashtags or create their own unique hashtag to enable customized matching based on specific criteria or demand.
- the employer may provide the hashtags as part of the employment request.
- the hashtags may be grouped into four distinct categories: Language, Special Skill, Special Knowledge, and “Other.”
- employers may choose any type or number of hashtag combinations to be used in the employment request to get the best desired matches via AI-based recommendations.
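- A minimal sketch of the four hashtag categories and of grouping an employer's chosen hashtag combination by category is shown below; the approved hashtag lists and helper names are illustrative assumptions only.

```python
# Illustrative sketch of the four hashtag categories described above.
# The approved lists and helper names are hypothetical examples.
from enum import Enum
from typing import Dict, Set


class HashtagCategory(Enum):
    LANGUAGE = "language"
    SPECIAL_SKILL = "special_skill"
    SPECIAL_KNOWLEDGE = "special_knowledge"
    OTHER = "other"


# Pre-populated approved hashtags, grouped by category (examples only).
APPROVED_HASHTAGS: Dict[HashtagCategory, Set[str]] = {
    HashtagCategory.LANGUAGE: {"#Spanish", "#ASL"},
    HashtagCategory.SPECIAL_SKILL: {"#IVinfusion", "#suturing", "#jointinjection"},
    HashtagCategory.SPECIAL_KNOWLEDGE: {"#crisismanagement", "#riskassessment", "#DOTexam"},
    HashtagCategory.OTHER: {"#homecare"},
}


def categorize(tag: str) -> HashtagCategory:
    """Map a hashtag to its category; unknown (custom) tags fall into OTHER."""
    for category, tags in APPROVED_HASHTAGS.items():
        if tag in tags:
            return category
    return HashtagCategory.OTHER


def build_request_tags(selected: Set[str]) -> Dict[HashtagCategory, Set[str]]:
    """Group an employer's chosen hashtag combination by category."""
    grouped: Dict[HashtagCategory, Set[str]] = {c: set() for c in HashtagCategory}
    for tag in selected:
        grouped[categorize(tag)].add(tag)
    return grouped


if __name__ == "__main__":
    print(build_request_tags({"#homecare", "#Spanish", "#IVinfusion", "#nightshift"}))
```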
- Contractors, in addition to required identity verification documents, demographics, profession type, and license or certificate (if applicable), may add to their personal profiles any number of optional hashtags which highlight their unique skills, knowledge, and experience. This can be implemented at on-boarding into the SaaS platform, which may be implemented on the underlying blockchain network.
- the hashtags may be parsed out or generated based on the employment request used as part of the matching algorithm implemented by the AI module.
- the employer has an option to review profiles and pick or choose which matched contractor should get notified, or to send notifications to any number of matched contractors simultaneously.
- Contractor may be notified of the matched available gig over a blockchain network.
- the employer is notified and may execute a blockchain transaction for signing the contractor to the assignment or the gig job.
- the employer has an option to list the opening remuneration bid amount for the gig or leave the amount blank for matched contractors to bid on. Leaving the amount blank ensures that the employer does not pay more than the contractor is willing to be paid.
- the disclosed process, advantageously, eliminates the need for agencies, recruiters, and traditional job boards by enabling the employer and contractor to be automatically matched directly on a granular level based on the AI-based predictive analysis and recommendations.
- This process includes a transparent pricing mechanism coupled with a secure communications chat channel (implemented over a blockchain network) which allows both parties to set, negotiate, and agree on the price and terms of an engagement with each other using a dynamic bid/counterbid function.
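- The following Python sketch illustrates one possible shape of the dynamic bid/counterbid function, including the case where the employer leaves the opening remuneration blank so that matched contractors bid; the Negotiation class and its fields are hypothetical and not prescribed by the disclosure.

```python
# Illustrative bid/counterbid sketch; class and field names are hypothetical.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Bid:
    party: str      # "employer" or "contractor"
    amount: float


@dataclass
class Negotiation:
    opening_amount: Optional[float] = None   # None = employer left the pay blank
    history: List[Bid] = field(default_factory=list)
    agreed_amount: Optional[float] = None

    def place_bid(self, party: str, amount: float) -> None:
        self.history.append(Bid(party, amount))

    def accept_last_bid(self, employer_budget: Optional[float]) -> bool:
        """Employer accepts the latest contractor bid if it is within budget
        (or if no budget was posted, in which case the bid sets the price)."""
        if not self.history:
            return False
        last = self.history[-1]
        if last.party != "contractor":
            return False
        if employer_budget is None or last.amount <= employer_budget:
            self.agreed_amount = last.amount
            return True
        return False


if __name__ == "__main__":
    n = Negotiation(opening_amount=None)        # pay left blank; contractors bid
    n.place_bid("contractor", 350.0)
    n.place_bid("employer", 300.0)              # counterbid
    n.place_bid("contractor", 320.0)
    print(n.accept_last_bid(employer_budget=330.0), n.agreed_amount)
```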
- a major hurricane hits Southern Florida.
- a facility (especially rural) needs a large number of diverse and skilled labor to assist with the situation. Their current work pool is unavailable due to the hurricane aftermath.
- the facility may quickly post any number of gigs with any combination of desired descriptive hashtags in addition to location, time and duration needed for the skilled labor, and/or post a price offered to be paid.
- the facility may simultaneously engage with any number of matched workers, bypassing recruiters, job boards and staffing agencies, to quickly achieve best desired outcome.
- a Spanish-speaking patient has been discharged from the hospital and requires a 1-week home-based antibiotic therapy.
- a nursing agency needs to quickly find and dispatch a nurse to the patient within 12 hours to fix an IV line that has stopped working.
- an agency may post a gig on the marketplace requiring a match with an active nurse in the nearby geographical area, and add three descriptive hashtags into the employment request to maximize best outcome (#homecare, #Spanish, #IVinfusion).
- the matching server receives a recommendation from the AI module and shows three matched nurses, already onboarded into the platform, who meet the exact hashtag criteria.
- the employer receives price bids indicating the amount for which each nurse is willing to perform the gig.
- the employer may then review each profile, decide on one, and accept that nurse's bid amount because it is at or below the price the employer was willing to pay a contractor.
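- Applying the same hashtag-matching and bid-acceptance logic to the home-infusion scenario above might look like the following sketch; the nurse records, bid amounts, and budget are invented for illustration.

```python
# Worked example of the scenario above; the candidate data and budget are invented.
from typing import Dict, List, Optional, Set

REQUIRED_TAGS: Set[str] = {"#homecare", "#Spanish", "#IVinfusion"}

ONBOARDED_NURSES: List[Dict] = [
    {"name": "nurse-A", "tags": {"#homecare", "#Spanish", "#IVinfusion"}, "bid": 310.0},
    {"name": "nurse-B", "tags": {"#homecare", "#Spanish", "#IVinfusion"}, "bid": 295.0},
    {"name": "nurse-C", "tags": {"#homecare", "#Spanish", "#IVinfusion"}, "bid": 340.0},
    {"name": "nurse-D", "tags": {"#homecare", "#IVinfusion"}, "bid": 250.0},  # no #Spanish
]


def exact_hashtag_matches(candidates: List[Dict], required: Set[str]) -> List[Dict]:
    """Return only candidates whose profiles carry every required hashtag."""
    return [c for c in candidates if required <= c["tags"]]


def accept_best_bid(matches: List[Dict], budget: float) -> Optional[Dict]:
    """Pick the lowest bid that is at or below the employer's budget."""
    affordable = [c for c in matches if c["bid"] <= budget]
    return min(affordable, key=lambda c: c["bid"]) if affordable else None


if __name__ == "__main__":
    matched = exact_hashtag_matches(ONBOARDED_NURSES, REQUIRED_TAGS)
    print([c["name"] for c in matched])            # three exact-match nurses
    print(accept_best_bid(matched, budget=320.0))  # nurse-B at 295.0
```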
- the disclosed embodiments streamline, for the employer, a more accurate on-demand gig-worker match based on specific requirements, especially during an emergency or mass-casualty event when a large pool of diverse workers is urgently needed. They save valuable time sourcing candidates, minimize administrative hassle, save money on job-post advertising, and reduce overhead labor cost.
- the disclosed system increases remuneration by eliminating middlemen (job boards, recruiters, agencies); improves flexibility and control over engagement terms; improves work/life balance; and avoids lengthy contracts.
- the system reduces the employer's risk of engaging with an unknown individual who may have the right credentials on the surface but lacks the specific type of skill, knowledge or experience required to successfully complete the job. It enables the contractor to choose to engage in the type of work they most desire, are experienced in, or enjoy doing.
- the proposed employment candidates' matching system may be, advantageously, used for the following non-limiting use cases: short term gig assignment, emergency or disaster recovery situation on site deployment, long term full-time employment, long term contractual employment, one time consulting assignment, etc.
- FIG. 1 A illustrates a network diagram of a system for an AI-based automated matching of employment candidates to an employment request and providing notifications to the selected employment candidates, consistent with the present disclosure.
- the example network 100 includes the matching server (MS) node 102 connected to a cloud server node(s) 105 over a network.
- the MS node 102 is configured to host an AI/ML module 107 .
- the MS node 102 may receive employment request data from an employer entity 111 .
- the employment request data may have hashtags representing the employment parameters.
- the employment request data may be processed by the MS node 102 to parse out the hashtags or to generate hashtags from the job description.
- the MS node 102 may query a local candidates' database for the historical local candidates' data 103 associated with the current employment request data.
- the MS node 102 may acquire relevant remote candidates' data 106 from a remote database residing on a cloud server 105 .
- the remote candidates' data 106 may be collected from other employer entities (e.g., medical facilities).
- the remote candidates' data 106 may be collected from employment candidates that had the same (or similar) qualifications, age, gender, race, etc. as the local candidates who are associated with the current employment request data.
- the MS node 102 may generate a feature vector or classifier data based on the employment request data and the collected candidates' data (i.e., pre-stored local data 103 and remote data 106 ).
- the MS node 102 may ingest the feature vector data into an AI/ML module 107 .
- the AI/ML module 107 may generate a predictive model(s) 108 based on the feature vector data to predict employment parameters for automatically generating a notification(s) to be provided to employment candidates 113 (e.g., nurses, doctor(s), care providers, etc.).
- the employment parameters may be further analyzed by the MS node 102 prior to generation of the notification(s).
- the employment parameters may be used for adjustment of the hiring schedule based on availability of the selected (i.e., matched) employment candidates.
- FIG. 1 B illustrates a network diagram of a system for an AI-based automated matching of employment candidates to an employment request and providing notifications to the selected employment candidates and receiving employment consensus over a blockchain consistent with the present disclosure.
- the example network 100 ′ includes the matching server (MS) node 102 connected to a cloud server node(s) 105 over a network.
- the MS node 102 is configured to host an AI/ML module 107 .
- the MS node 102 may receive employment request data from an employer entity 111 .
- the employment request data may have hashtags representing the employment parameters.
- the employment request data may be processed by the MS node 102 to parse out the hashtags or to generate hashtags from the job description.
- the MS node 102 may query a local candidates' database for the historical local candidates' data 103 associated with the current employment request data.
- the MS node 102 may acquire relevant remote candidates' data 106 from a remote database residing on a cloud server 105 .
- the remote candidates' data 106 may be collected from other employer entities (e.g., medical facilities).
- the remote candidates' data 106 may be collected from employment candidates that had the same (or similar) qualifications, age, gender, race, etc. as the local candidates who are associated with the current employment request data.
- the MS node 102 may generate a feature vector or classifier data based on the employment request data and the collected candidates' data (i.e., pre-stored local data 103 and remote data 106 ).
- the MS node 102 may ingest the feature vector data into an AI/ML module 107 .
- the AI/ML module 107 may generate a predictive model(s) 108 based on the feature vector data to predict employment parameters for automatically generating a notification(s) to be provided to employment candidates 113 (e.g., nurses, doctor(s), care providers, etc.).
- the employment parameters may be further analyzed by the MS node 102 prior to generation of the notification(s).
- the employment parameters may be used for adjustment of the hiring schedule based on availability of the selected (i.e., matched) employment candidates.
- the MS node 102 may receive the predicted employment parameters from a permissioned blockchain 110 ledger 109 based on a consensus from the employment candidates' devices 113 confirming, for example, dates and compensation for employment. Additionally, confidential historical candidate-related information and previous candidate-related employment parameters may also be acquired from the permissioned blockchain 110 . The newly acquired candidate-related data with corresponding predicted employment parameters data may also be recorded on the ledger 109 of the blockchain 110 so it can be used as training data for the predictive model(s) 108 .
- the MS node 102 , the cloud server 105 , the employment candidate devices 113 and employer entities(s) 111 may serve as blockchain 110 peer nodes.
- local candidates' data 103 and remote candidates' data 106 may be duplicated on the blockchain ledger 109 for higher security of storage.
- the AI/ML module 107 may generate a predictive model(s) 108 to predict the employment parameters for the employment candidates in response to the specific relevant pre-stored candidates'-related data acquired from the blockchain 110 ledger 109 .
- the current employment parameters may be predicted based not only on the current employment request-related data and current candidates'-related data, but also based on the previously collected heuristics and candidates'-related data associated with the given employment request data or current employment request parameters derived from the employment request data.
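- As one possible illustration of recording predicted employment parameters on an append-only ledger and later re-reading them as training data, a toy hash-chained ledger is sketched below; it is framework-agnostic, and the SimpleLedger structure is an assumption for illustration, not the claimed blockchain 110.

```python
# Minimal append-only ledger sketch; this is a toy model, not a specific
# blockchain framework. Block structure and field names are hypothetical.
import hashlib
import json
import time
from typing import Dict, List


class SimpleLedger:
    """Hash-chained, append-only record of employment parameters."""

    def __init__(self) -> None:
        self.blocks: List[Dict] = []

    def append(self, payload: Dict) -> Dict:
        prev_hash = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        block = {"timestamp": time.time(), "payload": payload, "prev_hash": prev_hash}
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()
        ).hexdigest()
        self.blocks.append(block)
        return block

    def training_records(self) -> List[Dict]:
        """Re-read recorded candidate data and predicted parameters as training data."""
        return [b["payload"] for b in self.blocks]


if __name__ == "__main__":
    ledger = SimpleLedger()
    ledger.append({"candidate": "c-1", "employment_parameters": {"match_confidence": 0.82}})
    ledger.append({"candidate": "c-2", "employment_parameters": {"match_confidence": 0.41}})
    print(len(ledger.training_records()), ledger.blocks[-1]["hash"][:12])
```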
- FIG. 2 illustrates a network diagram of a system including detailed features of a matching server (MS) node consistent with the present disclosure.
- the example network 200 includes the MS node 102 connected to employer device(s) to receive employment request data 201 .
- the MS node 102 is configured to host an AI/ML module 107 .
- the MS node 102 may receive employment request data provided by the employer entities 101 ( FIG. 1 A ) and pre-stored candidates' data retrieved from local and remote databases.
- the pre-stored candidates' data may be retrieved from the ledger 109 of the blockchain 110 .
- the AI/ML module 107 may generate a predictive model(s) 108 based on the received employment request data 201 and the candidates'-related data provided by the MS node 102 . As discussed above, the AI/ML module 107 may provide predictive outputs data in a form of employment parameters for automatic generation of notifications for the employment candidates or for adjusting the employment schedule for the candidates. The MS node 102 may process the predictive outputs data received from the AI/ML module 107 to generate the notification of a current risk assessment ranking pertaining to a particular matched employment candidate.
- the MS node 102 may acquire employment request data from the employer entities periodically in order to check if new notifications need to be generated or the hiring schedule needs to be reset.
- the MS node 102 may continually monitor candidates'-related data acquired from the databases/blockchain ledger and may detect a parameter that deviates from a previously recorded parameter (or from a median reading value) by a margin that exceeds a threshold value pre-set for this particular parameter. For example, if a candidate's declared employment duration changes, this may cause a drastic change in this candidate's employment parameters. As another non-limiting example, a significant increase in a candidate's desired compensation may also cause critical changes in the candidate's employment possibilities.
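- The deviation check described above could be sketched as follows; the monitored parameter names and threshold values are hypothetical examples, not values taken from the disclosure.

```python
# Illustrative deviation check; parameter names and threshold values are examples.
from typing import Dict

# Pre-set thresholds per monitored parameter (hypothetical values).
DEVIATION_THRESHOLDS: Dict[str, float] = {
    "declared_employment_duration_weeks": 2.0,
    "desired_compensation": 500.0,
}


def deviates(previous: Dict[str, float], current: Dict[str, float]) -> Dict[str, float]:
    """Return parameters whose change from the previously recorded value exceeds
    the pre-set threshold for that parameter."""
    flagged = {}
    for name, threshold in DEVIATION_THRESHOLDS.items():
        if name in previous and name in current:
            margin = abs(current[name] - previous[name])
            if margin > threshold:
                flagged[name] = margin
    return flagged


if __name__ == "__main__":
    prev = {"declared_employment_duration_weeks": 12, "desired_compensation": 4000}
    curr = {"declared_employment_duration_weeks": 4, "desired_compensation": 4200}
    print(deviates(prev, curr))   # duration change of 8 weeks exceeds the threshold
```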
- the MS node 102 may provide the currently acquired employment candidate parameter to the AI/ML module 107 to generate a list of updated employment parameters based on the current candidate's employment conditions and requirements.
- the MS node 102 may be a computing device or a server computer, or the like, and may include a processor 204 , which may be a semiconductor-based microprocessor, a central processing unit (CPU), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and/or another hardware device. Although a single processor 204 is depicted, it should be understood that the MS node 102 may include multiple processors, multiple cores, or the like, without departing from the scope of the MS node 102 system.
- the MS node 102 may also include a non-transitory computer readable medium 212 that may have stored thereon machine-readable instructions executable by the processor 204 . Examples of the machine-readable instructions are shown as 214 - 222 and are further discussed below. Examples of the non-transitory computer readable medium 212 may include an electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. For example, the non-transitory computer readable medium 212 may be a Random-Access memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a hard disk, an optical disc, or other type of storage device.
- the processor 204 may fetch, decode, and execute the machine-readable instructions 214 to receive employment request data from the employer entity node 111 .
- the processor 204 may fetch, decode, and execute the machine-readable instructions 216 to parse the employment request data to derive a plurality of features.
- the processor 204 may fetch, decode, and execute the machine-readable instructions 218 to query a local candidates database 103 to retrieve local historical candidate-related data collected at a location of previous employment based on the plurality of features.
- the processor 204 may fetch, decode, and execute the machine-readable instructions 220 to generate at least one feature vector based on the plurality of features and the historical candidate-related data.
- the processor 204 may fetch, decode, and execute the machine-readable instructions 222 to provide the at least one feature vector to the ML module for generating a predictive model configured to produce at least one employment parameter for generation of an employment-related notification to the at least one candidate entity node.
- the permissioned blockchain 110 may be configured to use one or more smart contracts that manage transactions for multiple participating nodes and for recording the transactions on the ledger 109 .
- FIG. 3 A illustrates a flowchart of a method for an AI-based automated matching of employment candidates to an employment request consistent with the present disclosure.
- FIG. 3 A illustrates a flow chart of an example method executed by the MS 102 (see FIG. 2 ). It should be understood that method 300 depicted in FIG. 3 A may include additional operations and that some of the operations described therein may be removed and/or modified without departing from the scope of the method 300 . The description of the method 300 is also made with reference to the features depicted in FIG. 2 for purposes of illustration. Particularly, the processor 204 of the MS node 102 may execute some or all of the operations included in the method 300 .
- the processor 204 may receive employment request data from the employer entity node.
- the processor 204 may parse the employment request data to derive a plurality of features.
- the processor 204 may query a local candidates database to retrieve local historical candidate-related data collected at a location of previous employment based on the plurality of features.
- the processor 204 may generate at least one feature vector based on the plurality of features and the historical candidate-related data.
- the processor 204 may provide the at least one feature vector to the ML module for generating a predictive model configured to produce at least one employment parameter for generation of an employment-related notification to the at least one candidate entity node.
- FIG. 3 B illustrates a further flow chart of a method for the automated matching of the employment candidates to the employment request consistent with the present disclosure.
- the method 300 ′ may include one or more of the steps described below.
- FIG. 3 B illustrates a flow chart of an example method executed by the MS 102 (see FIG. 2 ). It should be understood that method 300 ′ depicted in FIG. 3 B may include additional operations and that some of the operations described therein may be removed and/or modified without departing from the scope of the method 300 ′. The description of the method 300 ′ is also made with reference to the features depicted in FIG. 2 for purposes of illustration. Particularly, the processor 204 of the MS 102 may execute some or all of the operations included in the method 300 ′.
- the processor 204 may generate at least one rescheduling parameter for resetting a hiring schedule associated with the employment request data based on the at least one employment parameter.
- the processor 204 may retrieve remote historical candidate-related data from at least one remote candidates' database based on the local historical candidate-related data, wherein the remote historical candidate-related data is collected at locations associated with a plurality of employer entities affiliated with the employer entity node.
- the processor 204 may generate the at least one feature vector based on the plurality of features and the local historical candidate-related data combined with the remote historical candidate-related data.
- the processor 204 may parse the employment request data to generate a plurality of hashtags.
- the processor 204 may generate the plurality of features based on the plurality of hashtags.
- the processor 204 may continuously monitor incoming employment request data to determine if at least one value of employment request parameters deviates from a previous value of a previous employment request data by a margin exceeding a pre-set threshold value.
- the processor 204 may, responsive to the at least one value deviating from the previous value by the margin exceeding the pre-set threshold value, generate an updated feature vector based on the incoming employment request data and generate the notification based on the at least one employment parameter produced by the predictive model in response to the updated feature vector.
- the processor 204 may record the at least one employment parameter on a blockchain ledger along with the employment request data.
- the processor 204 may retrieve the at least one employment parameter from the blockchain responsive to a consensus among the employer entity node and the at least one candidate entity node.
- the processor 204 may execute a smart contract to record data reflecting rescheduling of hiring of a candidate associated with the at least one candidate entity node on the blockchain for future audits.
- the employment parameters' model may be generated by the AI/ML module 107 that may use training data sets to improve accuracy of the prediction of the employment parameters for the candidate 113 ( FIG. 1 A ).
- the employment parameters used in training data sets may be stored in a centralized local database (such as one used for storing local candidates' data 103 depicted in FIG. 1 A ).
- a neural network may be used in the AI/ML module 107 for employment parameters modeling and hiring rescheduling predictions.
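- As an illustrative, non-limiting example of such a neural network, the sketch below trains a small multilayer perceptron on made-up feature vectors to predict an employment parameter; scikit-learn is used here only as one convenient implementation choice and is not mandated by the disclosure, and all feature and target values are invented.

```python
# Sketch of training a small neural network on hypothetical feature vectors
# to predict an employment parameter such as a match-confidence score.
import numpy as np
from sklearn.neural_network import MLPRegressor

# Each row: [num_matching_hashtags, years_experience, bid_vs_budget_ratio]
X_train = np.array([
    [3, 5.0, 0.90],
    [3, 2.0, 1.10],
    [2, 8.0, 0.80],
    [1, 1.0, 1.30],
    [3, 6.0, 0.95],
    [0, 3.0, 1.00],
])
# Target: historical success score of the resulting engagement (made-up values).
y_train = np.array([0.92, 0.60, 0.75, 0.20, 0.90, 0.10])

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X_train, y_train)

# Predict an employment parameter for a new candidate/request pairing.
print(model.predict(np.array([[3, 4.0, 0.92]])))
```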
- the AI/ML module 107 may use a decentralized storage such as a blockchain 110 (see FIG. 1 B ) that is a distributed storage system, which includes multiple nodes that communicate with each other.
- the decentralized storage includes an append-only immutable data structure resembling a distributed ledger capable of maintaining records between mutually untrusted parties.
- the untrusted parties are referred to herein as peers or peer nodes.
- Each peer maintains a copy of the parameter(s) records and no single peer can modify the records without a consensus being reached among the distributed peers.
- the peers 101 , 113 and 102 may execute a consensus protocol to validate blockchain 110 storage transactions, group the storage transactions into blocks, and build a hash chain over the blocks.
- a permissioned and/or a permissionless blockchain can be used.
- in a public or permissionless blockchain, anyone can participate without a specific identity.
- Public blockchains can involve assets and use consensus based on various protocols such as Proof of Work (PoW).
- a permissioned blockchain provides secure interactions among a group of entities which share a common goal, such as storing employment parameters for efficient matching of candidates, but which do not fully trust one another.
- This application utilizes a permissioned (private) blockchain that operates arbitrary, programmable logic, tailored to a decentralized storage scheme and referred to as “smart contracts” or “chaincodes.”
- chaincodes may exist for management functions and parameters which are referred to as system chaincodes.
- the application can further utilize smart contracts that are trusted distributed applications which leverage tamper-proof properties of the blockchain database and an underlying agreement between nodes, which is referred to as an endorsement or endorsement policy.
- Blockchain transactions associated with this application can be “endorsed” before being committed to the blockchain, while transactions that are not endorsed are disregarded.
- An endorsement policy allows chaincodes to specify endorsers for a transaction in the form of a set of peer nodes that are necessary for endorsement.
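- A framework-agnostic toy model of an endorsement policy check is sketched below; peer names and the transaction structure are hypothetical, and the sketch is not tied to any particular chaincode runtime.

```python
# Toy model of an endorsement policy: a transaction is committed only if it
# carries endorsements from every peer named in the policy. Framework-agnostic;
# peer names and structures are hypothetical.
from typing import Dict, List, Set


def is_endorsed(transaction: Dict, policy: Set[str]) -> bool:
    """True if all required endorsing peers have signed the transaction."""
    return policy <= set(transaction.get("endorsements", []))


def commit_endorsed(transactions: List[Dict], policy: Set[str]) -> List[Dict]:
    """Commit endorsed transactions; non-endorsed transactions are disregarded."""
    return [tx for tx in transactions if is_endorsed(tx, policy)]


if __name__ == "__main__":
    policy = {"MS-node", "employer-node"}
    txs = [
        {"id": "tx-1", "endorsements": ["MS-node", "employer-node", "candidate-node"]},
        {"id": "tx-2", "endorsements": ["candidate-node"]},   # disregarded
    ]
    print([tx["id"] for tx in commit_endorsed(txs, policy)])  # ['tx-1']
```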
- a host platform 420 (such as the MS node 102 ) builds and deploys a machine learning model for predictive monitoring of assets 430 .
- the host platform 420 may be a cloud platform, an industrial server, a web server, a personal computer, a user device, and the like.
- Assets 430 can represent notifications or candidates' employment parameters.
- the blockchain 110 can be used to significantly improve both a training process 402 of the machine learning model and the employment parameters' predictive process 405 based on a trained machine learning model.
- historical data may be stored by the assets 430 themselves (or through an intermediary, not shown) on the blockchain 110 .
- data can be directly and reliably transferred straight from its place of origin (e.g., from the employment entities or from candidates' database) to the blockchain 110 .
- smart contracts may directly send the data from the assets to the entities that use the data for building a machine learning model. This allows for sharing of data among the assets 430 .
- the collected data may be stored in the blockchain 110 based on a consensus mechanism.
- the consensus mechanism pulls in permissioned nodes to ensure that the data being recorded is verified and accurate.
- the data recorded is time-stamped, cryptographically signed, and immutable. It is therefore auditable, transparent, and secure.
- training of the machine learning model on the collected data may take rounds of refinement and testing by the host platform 420 . Each round may be based on additional data or data that was not previously considered to help expand the knowledge of the machine learning model.
- the different training and testing steps (and the data associated therewith) may be stored on the blockchain 110 by the host platform 420 .
- Each refinement of the machine learning model (e.g., changes in variables, weights, etc.) may be stored on the blockchain 110 . This provides verifiable proof of how the model was trained and what data was used to train the model.
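- One way to make each training refinement verifiable is to record a fingerprint of the model state and a reference to the data used in that round on a hash-chained log, as in the toy sketch below; the entry fields and function names are assumptions for illustration only.

```python
# Sketch of recording each training refinement (e.g., changed weights) on a
# hash-chained log so the training history can be verified later. Toy model;
# not tied to any particular blockchain implementation.
import hashlib
import json
from typing import Dict, List


def fingerprint(model_state: Dict) -> str:
    """Deterministic hash of the model's variables/weights after a refinement."""
    return hashlib.sha256(json.dumps(model_state, sort_keys=True).encode()).hexdigest()


def record_refinement(chain: List[Dict], round_no: int, model_state: Dict, data_ref: str) -> None:
    prev = chain[-1]["entry_hash"] if chain else "0" * 64
    entry = {
        "round": round_no,
        "model_fingerprint": fingerprint(model_state),
        "training_data_ref": data_ref,   # pointer to the data used this round
        "prev_hash": prev,
    }
    entry["entry_hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    chain.append(entry)


if __name__ == "__main__":
    chain: List[Dict] = []
    record_refinement(chain, 1, {"w": [0.1, 0.2]}, "batch-2023-06-01")
    record_refinement(chain, 2, {"w": [0.12, 0.18]}, "batch-2023-06-02")
    # Verify the hash chain links every refinement to the previous one.
    print(all(chain[i]["prev_hash"] == chain[i - 1]["entry_hash"] for i in range(1, len(chain))))
```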
- after the host platform 420 has achieved a final trained model, the resulting model itself may be stored on the blockchain 110 .
- after the model has been trained, it may be deployed to a live environment where it can make employment-related predictions/decisions based on the execution of the final trained machine learning model using the employment parameters.
- data fed back from the asset 430 may be input into the machine learning model and may be used to make event predictions such as optimal candidate employment and scheduling parameters for re-setting the employment contracts for the given employment request.
- Determinations made by the execution of the machine learning model (e.g., notification or rescheduling parameters, etc.) at the host platform 420 may be stored on the blockchain 110 to provide auditable/verifiable proof.
- the machine learning model may predict a future change of a part of the asset 430 (the employment parameters, i.e., assessment of risk of employment).
- the data behind this decision may be stored by the host platform 420 on the blockchain 110 .
- the features and/or the actions described and/or depicted herein can occur on or with respect to the blockchain 110 .
- the above embodiments of the present disclosure may be implemented in hardware, in computer-readable instructions executed by a processor, in firmware, or in a combination of the above.
- the computer-readable instructions may be embodied on a computer-readable medium, such as a storage medium.
- the computer-readable instructions may reside in random access memory (“RAM”), flash memory, read-only memory (“ROM”), erasable programmable read-only memory (“EPROM”), electrically erasable programmable read-only memory (“EEPROM”), registers, a hard disk, a removable disk, a compact disk read-only memory (“CD-ROM”), or any other form of storage medium known in the art.
- An exemplary storage medium may be coupled to the processor such that the processor may read information from, and write information to, the storage medium.
- the storage medium may be integral to the processor.
- the processor and the storage medium may reside in an application specific integrated circuit (“ASIC”).
- the processor and the storage medium may reside as discrete components.
- FIG. 5 illustrates an example computing device (e.g., a server node) 500 , which may represent or be integrated in any of the above-described components, etc.
- FIG. 5 illustrates a block diagram of a system including computing device 500 .
- the computing device 500 may comprise, but not be limited to the following:
- a mobile computing device such as, but not limited to, a laptop, a tablet, a smartphone, a drone, a wearable, an embedded device, a handheld device, an IoT device, an industrial device, or a remotely operable recording device;
- a supercomputer, wherein the supercomputer comprises, but is not limited to, an exa-scale supercomputer, a mainframe, or a quantum computer;
- a minicomputer, wherein the minicomputer computing device comprises, but is not limited to, an IBM AS/400/iSeries/System i, a DEC VAX/PDP, an HP 3000, a Honeywell-Bull DPS, a Texas Instruments TI-990, or a Wang Laboratories VS Series;
- a microcomputer, wherein the microcomputer computing device comprises, but is not limited to, a server (which may be rack mounted), a workstation, an industrial device, a Raspberry Pi, a desktop, or an embedded device;
- the MS node 102 may be hosted on a centralized server or on a cloud computing service. Although method 300 has been described to be performed by the MS node 102 implemented on a computing device 500, it should be understood that, in some embodiments, different operations may be performed by a plurality of the computing devices 500 in operative communication over at least one network.
- Embodiments of the present disclosure may comprise a computing device having a central processing unit (CPU) 520 , a bus 530 , a memory unit 550 , a power supply unit (PSU) 550 , and one or more Input/Output (I/O) units.
- the CPU 520 is coupled to the memory unit 550 and the plurality of I/O units 560 via the bus 530, all of which are powered by the PSU 550.
- each disclosed unit may actually be a plurality of such units for the purposes of redundancy, high availability, and/or performance.
- the combination of the presently disclosed units is configured to perform the stages of any method disclosed herein.
- the aforementioned CPU 520 , the bus 530 , the memory unit 550 , a PSU 550 , and the plurality of I/O units 560 may be implemented in a computing device, such as computing device 500 . Any suitable combination of hardware, software, or firmware may be used to implement the aforementioned units.
- the CPU 520 , the bus 530 , and the memory unit 550 may be implemented with computing device 500 or any of other computing devices 500 , in combination with computing device 500 .
- the aforementioned system, device, and components are examples and other systems, devices, and components may comprise the aforementioned CPU 520 , the bus 530 , the memory unit 550 , consistent with embodiments of the disclosure.
- At least one computing device 500 may be embodied as any of the computing elements illustrated in all of the attached figures, including the matching server (MS) node 102 ( FIG. 2 ).
- a computing device 500 does not need to be electronic, nor even have a CPU 520 , nor bus 530 , nor memory unit 550 .
- the definition of the computing device 500 to a person having ordinary skill in the art is “A device that computes, especially a programmable [usually] electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information.” Any device which processes information qualifies as a computing device 500 , especially if the processing is purposeful.
- a system consistent with an embodiment of the disclosure may include a computing device, such as computing device 500 .
- computing device 500 may include at least one clock module 510 , at least one CPU 520 , at least one bus 530 , and at least one memory unit 550 , at least one PSU 550 , and at least one I/O 560 module, wherein I/O module may be comprised of, but not limited to a non-volatile storage sub-module 561 , a communication sub-module 562 , a sensors sub-module 563 , and a peripherals sub-module 565 .
- the computing device 500 may include the clock module 510 , which may be known to a person having ordinary skill in the art as a clock generator, which produces clock signals.
- A clock signal is a particular type of signal that oscillates between a high and a low state and is used like a metronome to coordinate actions of digital circuits.
- Most integrated circuits (ICs) of sufficient complexity use a clock signal in order to synchronize different parts of the circuit, cycling at a rate slower than the worst-case internal propagation delays.
- the preeminent example of the aforementioned integrated circuit is the CPU 520 , the central component of modern computers, which relies on a clock. The only exceptions are asynchronous circuits such as asynchronous CPUs.
- the clock 510 can comprise a plurality of embodiments, such as, but not limited to, a single-phase clock which transmits all clock signals on effectively 1 wire, a two-phase clock which distributes clock signals on two wires, each with non-overlapping pulses, and a four-phase clock which distributes clock signals on 4 wires.
- some embodiments of the clock 510 may include a clock multiplier which multiplies a lower-frequency external clock to the appropriate clock rate of the CPU 520 . This allows the CPU 520 to operate at a much higher frequency than the rest of the computer, which affords performance gains in situations where the CPU 520 does not need to wait on an external factor (like memory 550 or input/output 560 ).
- Some embodiments of the clock 510 may include dynamic frequency change, where, the time between clock edges can vary widely from one edge to the next and back again.
- the computing device 500 may include the CPU unit 520 comprising at least one CPU Core 521 .
- a plurality of CPU cores 521 may comprise identical CPU cores 521 , such as, but not limited to, homogeneous multi-core systems. It is also possible for the plurality of CPU cores 521 to comprise different CPU cores 521 , such as, but not limited to, heterogeneous multi-core systems, big.LITTLE systems and some AMD accelerated processing units (APU).
- the CPU unit 520 reads and executes program instructions which may be used across many application domains, for example, but not limited to, general purpose computing, embedded computing, network computing, digital signal processing (DSP), and graphics processing (GPU).
- the CPU unit 520 may run multiple instructions on separate CPU cores 521 at the same time.
- the CPU unit 520 may be integrated into at least one of a single integrated circuit die and multiple dies in a single chip package.
- the single integrated circuit die and multiple dies in a single chip package may contain a plurality of other aspects of the computing device 500 , for example, but not limited to, the clock 510 , the CPU 520 , the bus 530 , the memory 550 , and I/O 560 .
- the CPU unit 520 may contain cache 522 such as, but not limited to, a level 1 cache, level 2 cache, level 3 cache or combination thereof.
- the aforementioned cache 522 may or may not be shared amongst a plurality of CPU cores 521 .
- where the cache 522 is shared, at least one of message passing and inter-core communication methods may be used for the at least one CPU core 521 to communicate with the cache 522 .
- the inter-core communication methods may comprise, but are not limited to, bus, ring, two-dimensional mesh, and crossbar.
- the aforementioned CPU unit 520 may employ symmetric multiprocessing (SMP) design.
- the plurality of the aforementioned CPU cores 521 may comprise soft microprocessor cores on a single field programmable gate array (FPGA), such as semiconductor intellectual property cores (IP Core).
- the plurality of CPU cores 521 architecture may be based on at least one of, but not limited to, Complex instruction set computing (CISC), Zero instruction set computing (ZISC), and Reduced instruction set computing (RISC).
- At least one of the performance-enhancing methods may be employed by the plurality of the CPU cores 521 , for example, but not limited to Instruction-level parallelism (ILP) such as, but not limited to, superscalar pipelining, and Thread-level parallelism (TLP).
- the aforementioned computing device 500 may employ a communication system that transfers data between components inside the aforementioned computing device 500 , and/or the plurality of computing devices 500 .
- the aforementioned communication system will be known to a person having ordinary skill in the art as a bus 530 .
- the bus 530 may embody internal and/or external plurality of hardware and software components, for example, but not limited to a wire, optical fiber, communication protocols, and any physical arrangement that provides the same logical function as a parallel electrical bus.
- the bus 530 may comprise at least one of, but not limited to, a parallel bus, wherein the parallel bus carries data words in parallel on multiple wires, and a serial bus, wherein the serial bus carries data in bit-serial form.
- the bus 530 may embody a plurality of topologies, for example, but not limited to, a multidrop/electrical parallel topology, a daisy chain topology, and a topology connected by switched hubs, such as a USB bus.
- the bus 530 may comprise a plurality of embodiments, for example, but not limited to:
- the aforementioned computing device 500 may employ hardware integrated circuits that store information for immediate use in the computing device 500 , known to a person having ordinary skill in the art as primary storage or memory 550 .
- the memory 550 operates at high speed, distinguishing it from the non-volatile storage sub-module 561 , which may be referred to as secondary or tertiary storage, which provides slow-to-access information but offers higher capacities at lower cost.
- the contents contained in memory 550 may be transferred to secondary storage via techniques such as, but not limited to, virtual memory and swap.
- the memory 550 may be associated with addressable semiconductor memory, such as integrated circuits consisting of silicon-based transistors, used for example as primary storage but also other purposes in the computing device 500 .
- the memory 550 may comprise a plurality of embodiments, such as, but not limited to volatile memory, non-volatile memory, and semi-volatile memory. It should be understood by a person having ordinary skill in the art that the ensuing are non-limiting examples of the aforementioned memory:
- the aforementioned computing device 500 may employ the communication sub-module 562 as a subset of the I/O 560 , which may be referred to by a person having ordinary skill in the art as at least one of, but not limited to, computer network, data network, and network.
- the network allows computing devices 500 to exchange data using connections, which may be known to a person having ordinary skill in the art as data links, between network nodes.
- the nodes comprise network computer devices 500 that originate, route, and terminate data.
- the nodes are identified by network addresses and can include a plurality of hosts consistent with the embodiments of a computing device 500 .
- the aforementioned embodiments include, but not limited to personal computers, phones, servers, drones, and networking devices such as, but not limited to, hubs, switches, routers, modems, and firewalls.
- the communication sub-module 562 supports a plurality of applications and services, such as, but not limited to World Wide Web (WWW), digital video and audio, shared use of application and storage computing devices 500 , printers/scanners/fax machines, email/online chat/instant messaging, remote control, distributed computing, etc.
- the network may comprise a plurality of transmission mediums, such as, but not limited to conductive wire, fiber optics, and wireless.
- the network may comprise a plurality of communications protocols to organize network traffic, wherein application-specific communications protocols are layered, may be known to a person having ordinary skill in the art as carried as payload, over other more general communications protocols.
- the plurality of communications protocols may comprise, but not limited to, IEEE 802, Ethernet, Wireless LAN (WLAN/Wi-Fi), Internet Protocol (IP) suite (e.g., TCP/IP, UDP, Internet Protocol version 4 [IPv4], and Internet Protocol version 6 [IPv6]), Synchronous Optical Networking (SONET)/Synchronous Digital Hierarchy (SDH), Asynchronous Transfer Mode (ATM), and cellular standards (e.g., Global System for Mobile Communications [GSM], General Packet Radio Service [GPRS], Code-Division Multiple Access [CDMA], and Integrated Digital Enhanced Network [IDEN]).
- the communication sub-module 562 may comprise a plurality of sizes, topologies, traffic control mechanisms, and organizational intents.
- the communication sub-module 562 may comprise a plurality of embodiments, such as, but not limited to:
- the aforementioned network may comprise a plurality of layouts, such as, but not limited to, bus network such as ethernet, star network such as Wi-Fi, ring network, mesh network, fully connected network, and tree network.
- the network can be characterized by its physical capacity or its organizational purpose. Use of the network, including user authorization and access rights, differs accordingly.
- the characterization may include, but not limited to nanoscale network, Personal Area Network (PAN), Local Area Network (LAN), Home Area Network (HAN), Storage Area Network (SAN), Campus Area Network (CAN), backbone network, Metropolitan Area Network (MAN), Wide Area Network (WAN), enterprise private network, Virtual Private Network (VPN), and Global Area Network (GAN).
- the aforementioned computing device 500 may employ the sensors sub-module 563 as a subset of the I/O 560 .
- the sensors sub-module 563 comprises at least one of the devices, modules, and subsystems whose purpose is to detect events or changes in its environment and send the information to the computing device 500 . Sensors are sensitive to the measured property, are not sensitive to any property not measured but which may be encountered in the application, and do not significantly influence the measured property.
- the sensors sub-module 563 may comprise a plurality of digital devices and analog devices, wherein if an analog device is used, an Analog to Digital (A-to-D) converter must be employed to interface the said device with the computing device 500 .
- the sensors may be subject to a plurality of deviations that limit sensor accuracy.
- the sensors sub-module 563 may comprise a plurality of embodiments, such as, but not limited to, chemical sensors, automotive sensors, acoustic/sound/vibration sensors, electric current/electric potential/magnetic/radio sensors, environmental/weather/moisture/humidity sensors, flow/fluid velocity sensors, ionizing radiation/particle sensors, navigation sensors, position/angle/displacement/distance/speed/acceleration sensors, imaging/optical/light sensors, pressure sensors, force/density/level sensors, thermal/temperature sensors, and proximity/presence sensors. It should be understood by a person having ordinary skill in the art that the ensuing are non-limiting examples of the aforementioned sensors:
- Chemical sensors such as, but not limited to, breathalyzer, carbon dioxide sensor, carbon monoxide/smoke detector, catalytic bead sensor, chemical field-effect transistor, chemiresistor, electrochemical gas sensor, electronic nose, electrolyte-insulator-semiconductor sensor, energy-dispersive X-ray spectroscopy, fluorescent chloride sensors, holographic sensor, hydrocarbon dew point analyzer, hydrogen sensor, hydrogen sulfide sensor, infrared point sensor, ion-selective electrode, nondispersive infrared sensor, microwave chemistry sensor, nitrogen oxide sensor, olfactometer, optode, oxygen sensor, ozone monitor, pellistor, pH glass electrode, potentiometric sensor, redox electrode, zinc oxide nanorod sensor, and biosensors (such as nano-sensors).
- Automotive sensors such as, but not limited to, air flow meter/mass airflow sensor, air-fuel ratio meter, AFR sensor, blind spot monitor, engine coolant/exhaust gas/cylinder head/transmission fluid temperature sensor, hall effect sensor, wheel/automatic transmission/turbine/vehicle speed sensor, airbag sensors, brake fluid/engine crankcase/fuel/oil/tire pressure sensor, camshaft/crankshaft/throttle position sensor, fuel/oil level sensor, knock sensor, light sensor, MAP sensor, oxygen sensor (o2), parking sensor, radar sensor, torque sensor, variable reluctance sensor, and water-in-fuel sensor.
- the aforementioned computing device 500 may employ the peripherals sub-module 565 as a subset of the I/O 560.
- the peripheral sub-module 565 comprises ancillary devices used to put information into and get information out of the computing device 500.
- There are three categories of devices comprising the peripheral sub-module 565, defined by their relationship with the computing device 500: input devices, output devices, and input/output devices.
- Input devices send at least one of data and instructions to the computing device 500 .
- Input devices can be categorized based on, but not limited to:
- Output devices provide output from the computing device 500 .
- Output devices convert electronically generated information into a form that can be presented to humans. Input/output devices perform both input and output functions. It should be understood by a person having ordinary skill in the art that the ensuing are non-limiting embodiments of the aforementioned peripheral sub-module 565:
- Printers such as, but not limited to, inkjet printers, laser printers, 3D printers, solid ink printers and plotters.
- Input/Output Devices may further comprise, but not be limited to, touchscreens, networking device (e.g., devices disclosed in network 562 sub-module), data storage device (non-volatile storage 561 ), facsimile (FAX), and graphics/sound cards.
Landscapes
- Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Engineering & Computer Science (AREA)
- Entrepreneurship & Innovation (AREA)
- Strategic Management (AREA)
- Economics (AREA)
- Tourism & Hospitality (AREA)
- Marketing (AREA)
- Operations Research (AREA)
- Quality & Reliability (AREA)
- Educational Administration (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Game Theory and Decision Science (AREA)
- Development Economics (AREA)
- Data Mining & Analysis (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
A system for an automated matching of employment/job engagement candidates to an employer including a processor of a matching server node configured to host a machine learning (ML) module and a memory on which are stored machine-readable instructions that when executed by the processor, cause the processor to: receive employment request data from the employer entity node; parse the employment request data to derive a plurality of features; query a local candidates database to retrieve local historical candidate-related data collected at a location of previous employment based on the plurality of features; generate at least one feature vector based on the plurality of features and the historical candidate-related data; and provide the at least one feature vector to the ML module for generating a predictive model configured to produce at least one employment parameter for generation of an employment-related notification to the at least one candidate entity node.
Description
- The present disclosure generally relates to determination and placement of qualified employment candidates, and more particularly, to an AI-based automated system for real-time matching of candidates with employment requirements based on predictive analytics of job candidate-related historical heuristic data.
- The process of finding and hiring employees to fill an open position can be inefficient for the employer. This process becomes even more inefficient when emergency and medical personnel need to be employed to a site of an emergency situation such as a disaster recovery. Employers are often tasked with finding qualified personnel to fill a temporary, contract, and/or full-time position(s) using online job boards, or more traditional means. During the hiring process, various tasks exist including gathering resumes, vetting candidates for relevant skills and experience, and determining if the applicant has appropriate licenses, etc.
- The current one-size-fits-all model, which relies on employers bearing the expense of using traditional job boards, recruiters, or locums staffing agencies to source the right on-demand gig job contract worker candidate, makes current processes exceptionally onerous, inefficient, and expensive. Using traditional staffing or locums staffing agencies as middlemen to quickly find skilled labor, and entering into long-term contracts, drives overall labor costs higher for employers and may not meet the time constraints of an emergency situation, such as temporary staffing for on-site disaster recovery. Clearly, conventional manual sifting through various descriptive skill, knowledge, and experience indicators does not fare well when dealing with an emergency situation, such as a mass casualty event or a catastrophe, when response time is of the essence for sourcing, matching, and engaging a large number of diverse, on-demand skilled workers.
- Accordingly, a system and method for automated real-time matching of candidates with employment requirements based on predictive analytics of job candidate-related historical heuristic data are desired.
- This brief overview is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This brief overview is not intended to identify key features or essential features of the claimed subject matter. Nor is this brief overview intended to be used to limit the claimed subject matter's scope.
- One embodiment of the present disclosure provides a system for an automated matching of employment candidates to an employer including a processor of a matching server node configured to host a machine learning (ML) module and a memory on which are stored machine-readable instructions that when executed by the processor, cause the processor to: receive employment request data from the employer entity node; parse the employment request data to derive a plurality of features; query a local candidates database to retrieve local historical candidate-related data collected at a location of previous employment based on the plurality of features; generate at least one feature vector based on the plurality of features and the historical candidate-related data; and provide the at least one feature vector to the ML module for generating a predictive model configured to produce at least one employment parameter for generation of an employment-related notification to the at least one candidate entity node.
- Another embodiment of the present disclosure provides a method that includes one or more of: receiving employment request data from the employer entity node; parsing the employment request data to derive a plurality of features; querying a local candidates database to retrieve local historical candidate-related data collected at a location of previous employment based on the plurality of features; generating at least one feature vector based on the plurality of features and the historical candidate-related data; and providing the at least one feature vector to the ML module for generating a predictive model configured to produce at least one employment parameter for generation of an employment-related notification to the at least one candidate entity node.
- Another embodiment of the present disclosure provides a computer-readable medium including instructions for receiving employment request data from the employer entity node; parsing the employment request data to derive a plurality of features; querying matched candidates database to retrieve historical candidate-related data collected from previous job engagements based on the plurality of features; generating at least one feature vector based on the plurality of features and the historical candidate-related data; and providing the at least one feature vector to the ML module for generating a predictive model configured to produce at least one job engagement parameter for generation of an employment-related notification to the at least one candidate entity node.
- Both the foregoing brief overview and the following detailed description provide examples and are explanatory only. Accordingly, the foregoing brief overview and the following detailed description should not be considered to be restrictive. Further, features or variations may be provided in addition to those set forth herein. For example, embodiments may be directed to various feature combinations and sub-combinations described in the detailed description.
- The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various embodiments of the present disclosure. The drawings contain representations of various trademarks and copyrights owned by the Applicant. In addition, the drawings may contain other marks owned by third parties and are being used for illustrative purposes only. All rights to various trademarks and copyrights represented herein, except those belonging to their respective owners, are vested in and the property of the Applicant. The Applicant retains and reserves all rights in its trademarks and copyrights included herein, and grants permission to reproduce the material only in connection with reproduction of the granted patent and for no other purpose.
- Furthermore, the drawings may contain text or captions that may explain certain embodiments of the present disclosure. This text is included for illustrative, non-limiting, explanatory purposes of certain embodiments detailed in the present disclosure. In the drawings:
-
FIG. 1A illustrates a network diagram of a system for an AI-based automated matching of employment candidates to an employment request and providing notifications to the selected employment candidates, consistent with the present disclosure; -
FIG. 1B illustrates a network diagram of a system for an AI-based automated matching of employment candidates to an employment request and providing notifications to the selected employment candidates and receiving employment consensus over a blockchain consistent with the present disclosure; -
FIG. 2 illustrates a network diagram of a system including detailed features of a matching server (MS) node consistent with the present disclosure; -
FIG. 3A illustrates a flowchart of a method for an AI-based automated matching of employment candidates to an employment request consistent with the present disclosure; -
FIG. 3B illustrates a further flow chart of a method for the automated matching of the employment candidates to the employment request consistent with the present disclosure; -
FIG. 4 illustrates deployment of a machine learning model for prediction of employment parameters using blockchain assets consistent with the present disclosure; -
FIG. 5 illustrates a block diagram of a system including a computing device for performing the method of FIGS. 3A and 3B.
- As a preliminary matter, it will readily be understood by one having ordinary skill in the relevant art that the present disclosure has broad utility and application. As should be understood, any embodiment may incorporate only one or a plurality of the above-disclosed aspects of the disclosure and may further incorporate only one or a plurality of the above-disclosed features. Furthermore, any embodiment discussed and identified as being “preferred” is considered to be part of a best mode contemplated for carrying out the embodiments of the present disclosure. Other embodiments also may be discussed for additional illustrative purposes in providing a full and enabling disclosure. Moreover, many embodiments, such as adaptations, variations, modifications, and equivalent arrangements, will be implicitly disclosed by the embodiments described herein and fall within the scope of the present disclosure.
- Accordingly, while embodiments are described herein in detail in relation to one or more embodiments, it is to be understood that this disclosure is illustrative and exemplary of the present disclosure and is made merely for the purposes of providing a full and enabling disclosure. The detailed disclosure herein of one or more embodiments is not intended, nor is it to be construed, to limit the scope of patent protection afforded in any claim of a patent issuing herefrom, which scope is to be defined by the claims and the equivalents thereof. It is not intended that the scope of patent protection be defined by reading into any claim a limitation found herein that does not explicitly appear in the claim itself.
- Thus, for example, any sequence(s) and/or temporal order of steps of various processes or methods that are described herein are illustrative and not restrictive. Accordingly, it should be understood that, although steps of various processes or methods may be shown and described as being in a sequence or temporal order, the steps of any such processes or methods are not limited to being carried out in any particular sequence or order, absent an indication otherwise. Indeed, the steps in such processes or methods generally may be carried out in various different sequences and orders while still falling within the scope of the present invention. Accordingly, it is intended that the scope of patent protection is to be defined by the issued claim(s) rather than the description set forth herein.
- Additionally, it is important to note that each term used herein refers to that which an ordinary artisan would understand such term to mean based on the contextual use of such term herein. To the extent that the meaning of a term used herein—as understood by the ordinary artisan based on the contextual use of such term—differs in any way from any particular dictionary definition of such term, it is intended that the meaning of the term as understood by the ordinary artisan should prevail.
- Regarding applicability of 35 U.S.C. § 112, ¶6, no claim element is intended to be read in accordance with this statutory provision unless the explicit phrase “means for” or “step for” is actually used in such claim element, whereupon this statutory provision is intended to apply in the interpretation of such claim element.
- Furthermore, it is important to note that, as used herein, “a” and “an” each generally denotes “at least one,” but does not exclude a plurality unless the contextual use dictates otherwise. When used herein to join a list of items, “or” denotes “at least one of the items,” but does not exclude a plurality of items of the list. Finally, when used herein to join a list of items, “and” denotes “all of the items of the list.”
- The following detailed description refers to the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the following description to refer to the same or similar elements. While many embodiments of the disclosure may be described, modifications, adaptations, and other implementations are possible. For example, substitutions, additions, or modifications may be made to the elements illustrated in the drawings, and the methods described herein may be modified by substituting, reordering, or adding stages to the disclosed methods. Accordingly, the following detailed description does not limit the disclosure. Instead, the proper scope of the disclosure is defined by the appended claims. The present disclosure contains headers. It should be understood that these headers are used as references and are not to be construed as limiting upon the subject matter disclosed under the header.
- The present disclosure includes many aspects and features. Moreover, while many aspects and features relate to, and are described in, the context of processing job applicants, embodiments of the present disclosure are not limited to use only in this context.
- The present disclosure provides a system, method and computer-readable medium for an AI-based automated matching of employment candidates to an employment request and providing notifications to the selected employment candidates.
- In one embodiment of the present disclosure, the system provides for an AI and machine learning (ML)-generated list of employment parameters to be used for analysis and generation of employment-related notifications. In one embodiment, an automated decision model may be generated to provide for employment parameters associated with an employment candidate's current status and past employment-related behavior based on the candidate's current qualifications (licenses, diplomas, certificates, etc.) and/or special knowledge (e.g., Spanish, sign language, crisis management, risk assessment, DOT examination, etc.) or skills (e.g., joint injection, infusion therapy, translation, suturing, etc.), previous employment, employees' feedback, reports, reviews, social media accounts, etc. The automated notification decision model may use historical employment candidates' data collected at the current locations (i.e., a hospital or other emergency job site) or work setting, and at other medical/emergency facilities of the same type located within a certain range from the current location or even located globally. The relevant employment candidate's data may include data related to other employment candidates having the same parameters such as specialty, age, race, gender, preferred employment conditions or locations, etc.
- In one disclosed embodiment, the AI/ML technology may be combined with a blockchain technology for secure use of the employment (job engagement) candidate-related data. The disclosed embodiment may produce a detailed safety or success-rate score reflecting the successful-employment likelihood for the given employment candidate based on the candidate's collected behavioral data. This allows for direct reporting on a trust level of the given employment candidate to the hiring authorities/employing entities (i.e., physicians, hospitals, clinics, emergency services, other patient care organizations, etc.). In one embodiment, the employing entities may be connected to the matching server (MS) node over a blockchain network to achieve a consensus prior to executing a transaction to release the employment candidate or the employment conditions for the candidate based on the employment parameters. The employment parameters may be defined by hashtags. The system utilizes descriptive hashtags to aid in matching candidates with job opportunities in which they are interested and for which they qualify, based on being on-boarded to the system via a blockchain network.
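- For illustration only, a success-likelihood score of the kind described above might be computed from historical engagement records roughly as in the following Python sketch; the record fields, blend weights, and smoothing constants are assumptions made for the example and are not specified by this disclosure.

```python
from dataclasses import dataclass


@dataclass
class EngagementRecord:
    """One historical job engagement outcome for a candidate (hypothetical fields)."""
    completed: bool          # True if the engagement was completed successfully
    employer_rating: float   # employer feedback on a 0.0-5.0 scale


def success_likelihood(history: list[EngagementRecord],
                       prior: float = 0.5, prior_weight: int = 3) -> float:
    """Blend completion rate and normalized ratings into a 0-1 score.

    A small pseudo-count prior keeps candidates with little history near a
    neutral score instead of an extreme 0.0 or 1.0.
    """
    if not history:
        return prior
    completion_rate = sum(r.completed for r in history) / len(history)
    avg_rating = sum(r.employer_rating for r in history) / len(history) / 5.0
    raw = 0.6 * completion_rate + 0.4 * avg_rating  # illustrative blend weights
    return (raw * len(history) + prior * prior_weight) / (len(history) + prior_weight)


if __name__ == "__main__":
    history = [EngagementRecord(True, 4.5), EngagementRecord(True, 5.0), EngagementRecord(False, 3.0)]
    print(f"success likelihood: {success_likelihood(history):.2f}")
```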
- In one embodiment, the disclosed system relates to a SaaS platform that matches potential employees with employers. The platform may employ a bidding system that incorporates the use of hashtags by employers and potential employees. Employers and potential employees can use four distinct hashtag categories: language, special skill, special knowledge, and "other." Further, employers can post jobs or generate employment requests using the unique hashtags and set the amount of pay for each job. Potential employees may be able to set up profiles using the unique hashtags and the price they expect to be paid for completing jobs. The platform then matches potential employees to employers based on the hashtag and pay parameters that are processed through an AI machine-learning module that may automatically choose the candidate that best fits the employment requirements. The platform also allows for employers to leave the pay for each job blank, which allows potential employees to bid on jobs. Employers can receive AI-generated recommendations for candidates based on their hashtags and bid amounts. The matching server may automatically choose the candidate(s) that best fits the employment parameters derived from the initial employment requirements.
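- The hashtag categories and pay parameters described above can be pictured with the following simplified Python sketch; the data model, field names, and scoring rule are illustrative assumptions rather than the platform's actual implementation.

```python
from dataclasses import dataclass

# The four hashtag categories described above; the identifiers are illustrative only.
CATEGORIES = ("language", "special_skill", "special_knowledge", "other")


@dataclass
class JobPost:
    hashtags: set[str]
    offered_pay: float | None = None   # None means the employer left pay blank for bidding


@dataclass
class CandidateProfile:
    name: str
    hashtags: set[str]
    expected_pay: float = 0.0


def match_score(job: JobPost, candidate: CandidateProfile) -> float:
    """Fraction of the job's hashtags present in the candidate profile (0.0-1.0)."""
    if not job.hashtags:
        return 0.0
    return len(job.hashtags & candidate.hashtags) / len(job.hashtags)


def rank_candidates(job: JobPost, candidates: list[CandidateProfile]) -> list[CandidateProfile]:
    """Keep candidates who satisfy the pay constraint, best hashtag overlap first."""
    eligible = [
        c for c in candidates
        if job.offered_pay is None or c.expected_pay <= job.offered_pay
    ]
    return sorted(eligible, key=lambda c: match_score(job, c), reverse=True)


if __name__ == "__main__":
    job = JobPost(hashtags={"#Spanish", "#IVinfusion", "#homecare"}, offered_pay=650.0)
    pool = [
        CandidateProfile("nurse_a", {"#Spanish", "#IVinfusion", "#homecare"}, 600.0),
        CandidateProfile("nurse_b", {"#IVinfusion"}, 500.0),
    ]
    for c in rank_candidates(job, pool):
        print(c.name, round(match_score(job, c), 2))
```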
- The disclosed embodiments provide an online peer-to-peer job-to-employee matching platform that allows employers to be in direct contact with potential employees utilizing a system of hashtags that represent skills, language, special knowledge and "other" to allow the employers to quickly and accurately vet potential employees without the need to read resumes and negotiate pay. One embodiment is directed to a technology-driven bid/counterbid marketplace process where employers of any size may efficiently and quickly get directly matched with one or any number of vetted, on-demand independent contractors (skilled labor) for a gig job or assignment which requires unique or specific skills, knowledge or level of experience. As part of the AI-based matching, an employer and contractor may select from the pre-populated approved list of descriptive hashtags or create their own unique hashtag to enable customized matching based on specific criteria or demands. The employer may provide the hashtags as part of the employment request.
- As discussed above, the hashtags may be grouped into four distinct categories: Language, Special Skill, Special Knowledge, and "Other." When posting a gig on the marketplace and searching for on-demand skilled labor, employers may choose any type or number of hashtag combinations to be used in the employment request to get the best desired matches via AI-based recommendations. Contractors, in addition to required identity verification documents, demographics, profession type, license or certificate (if applicable), may add to their personal profile any number of optional hashtags which highlight their unique skills, knowledge, and experience. This can occur at on-boarding into the SaaS platform, which may be implemented based on the underlying blockchain network. The hashtags may be parsed out of, or generated from, the employment request and used as part of the matching algorithm implemented by the AI module. When more than one contractor is matched to a gig post, the employer has an option to review profiles and choose which matched contractor should be notified, or to send a notification to any number of matched contractors simultaneously. A contractor may be notified of the matched available gig over a blockchain network. Once consensus is received from the contractor(s), the employer is notified and may execute a blockchain transaction for signing the contractor to the assignment or the gig job. When posting a gig job and generating the employment request, the employer has an option to list the opening remuneration bid amount for the gig or leave the amount blank for matched contractors to bid on. Leaving the amount blank ensures that the employer does not pay more than the contractor is willing to be paid.
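- By way of example only, parsing descriptive hashtags out of an employment request and bucketing them into the four categories might look like the following Python sketch; the approved-vocabulary mapping shown here is hypothetical.

```python
import re

# Hypothetical pre-populated vocabulary mapping approved hashtags to the four categories.
HASHTAG_CATEGORIES = {
    "#spanish": "language",
    "#signlanguage": "language",
    "#ivinfusion": "special_skill",
    "#suturing": "special_skill",
    "#crisismanagement": "special_knowledge",
    "#dotexam": "special_knowledge",
}


def parse_hashtags(employment_request: str) -> dict[str, list[str]]:
    """Extract hashtags from free-form request text and bucket them by category.

    Tags not found in the approved vocabulary fall into the "other" category,
    mirroring the custom-hashtag option described above.
    """
    found = re.findall(r"#\w+", employment_request.lower())
    buckets: dict[str, list[str]] = {"language": [], "special_skill": [],
                                     "special_knowledge": [], "other": []}
    for tag in found:
        buckets[HASHTAG_CATEGORIES.get(tag, "other")].append(tag)
    return buckets


if __name__ == "__main__":
    request = "Need RN near Miami for 1 week: #homecare #Spanish #IVinfusion, start within 12 hours"
    print(parse_hashtags(request))
```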
- The disclosed process, advantageously, eliminates the need for agencies, recruiters, and traditional job boards by enabling an employer and a contractor to be automatically matched directly, on a granular level, based on the AI-based predictive analysis and recommendations. This process includes a transparent pricing mechanism coupled with a secure communications chat channel (implemented over a blockchain network) which allows both parties to set, negotiate, and agree on the price and terms of an engagement with each other using a dynamic bid/counterbid function.
- In a use case example, a major hurricane hits Southern Florida. A facility (especially a rural one) needs a large number of diverse, skilled workers to assist with the situation. Its current work pool is unavailable due to the hurricane aftermath. The facility may quickly post any number of gigs with any combination of desired descriptive hashtags, in addition to the location, time, and duration needed for the skilled labor, and/or post a price offered to be paid. The facility may simultaneously engage with any number of matched workers, bypassing recruiters, job boards, and staffing agencies, to quickly achieve the best desired outcome.
- In another use case example, a Spanish-speaking patient has been discharged from the hospital and requires a one-week course of home-based antibiotic therapy. A nursing agency needs to quickly find and dispatch a nurse to the patient within 12 hours to fix an IV that stopped working. Rather than sending "any available nurse with an active nursing license" who does not speak Spanish and may have no experience with home-bound patient care or IV therapy, an agency may post a gig on the marketplace requiring a match with an active nurse in the nearby geographical area, and add three descriptive hashtags to the employment request to maximize the best outcome (#homecare, #Spanish, #IVinfusion). The matching server receives a recommendation from the AI module and shows three matched nurses who meet the exact hashtag criteria and were already onboarded onto the platform. The employer receives price bids for the amount each nurse is willing to perform the gig for. The employer may then review each profile, decide on one, and accept the bid amount from the nurse because it is at or below the price the employer was willing to pay a contractor.
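- The bid-acceptance step in this use case can be sketched as a simple filter; the contractor names and amounts below are hypothetical.

```python
def acceptable_bids(bids: dict[str, float], max_price: float) -> list[tuple[str, float]]:
    """Return (contractor, bid) pairs at or below the employer's maximum price, cheapest first."""
    return sorted(((name, bid) for name, bid in bids.items() if bid <= max_price),
                  key=lambda item: item[1])


if __name__ == "__main__":
    bids = {"nurse_a": 600.0, "nurse_b": 720.0, "nurse_c": 580.0}
    # The employer was willing to pay up to 650, so nurse_c and nurse_a qualify.
    print(acceptable_bids(bids, max_price=650.0))
```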
- The disclosed embodiments streamline, for the employer, a more accurate on-demand gig worker match based on specific requirements, especially during an emergency or mass casualty event when a large pool of diverse workers is urgently needed. It saves valuable time sourcing candidates, minimizes administrative hassle, saves money on job post advertisements, and reduces overhead labor costs. For contractors, the disclosed system increases remuneration by eliminating middlemen (job boards, recruiters, agencies); improves flexibility and control over engagement terms; improves work/life balance; and avoids lengthy contracts. By using descriptive hashtags and historical heuristics as a part of the direct matching process, the system reduces the employer's risk of engaging with an unknown individual who may have the right credentials on the surface, but lacks the specific type of skill, knowledge, or experience required to successfully complete the job. It enables the contractor to choose to engage for the type of work they most desire, are experienced in, or enjoy doing.
- As discussed above, the proposed employment candidates' matching system may be, advantageously, used for the following non-limiting use cases: short term gig assignment, emergency or disaster recovery situation on site deployment, long term full-time employment, long term contractual employment, one time consulting assignment, etc.
-
FIG. 1A illustrates a network diagram of a system for an AI-based automated matching of employment candidates to an employment request and providing notifications to the selected employment candidates, consistent with the present disclosure. - Referring to
FIG. 1A, the example network 100 includes the matching server (MS) node 102 connected to a cloud server node(s) 105 over a network. The MS node 102 is configured to host an AI/ML module 107. The MS node 102 may receive employment request data from an employer entity 111. The employment request data may have hashtags representing the employment parameters. In one embodiment, the employment request data may be processed by the MS node 102 to parse out the hashtags or to generate hashtags from the job description.
- The MS node 102 may query a local candidates' database for the historical local candidates' data 103 associated with the current employment request data. The MS node 102 may acquire relevant remote candidates' data 106 from a remote database residing on a cloud server 105. The remote candidates' data 106 may be collected from other employer entities (e.g., medical facilities). The remote candidates' data 106 may be collected from employment candidates that had the same (or similar) qualifications, age, gender, race, etc. as the local candidates who are associated with the current employment request data.
- The MS node 102 may generate a feature vector or classifier data based on the employment request data and the collected candidates' data (i.e., pre-stored local data 103 and remote data 106). The MS node 102 may ingest the feature vector data into an AI/ML module 107. The AI/ML module 107 may generate a predictive model(s) 108 based on the feature vector data to predict employment parameters for automatically generating a notification(s) to be provided to employment candidates 113 (e.g., nurses, doctor(s), care providers, etc.). The employment parameters may be further analyzed by the MS node 102 prior to generation of the notification(s). In one embodiment, the employment parameters may be used for adjustment of the hiring schedule based on availability of the selected (i.e., matched) employment candidates.
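- Purely as an illustration of the pipeline just described, the following Python sketch shows one way a feature vector could be assembled from request-derived features plus simple aggregates of the local and remote historical candidates' data, and then passed to a stand-in predictive model. The field names, aggregates, and the toy model are assumptions for illustration only and are not part of the disclosure.

```python
def build_feature_vector(request_features: dict[str, float],
                         local_history: list[dict],
                         remote_history: list[dict]) -> list[float]:
    """Concatenate request-derived features with aggregates of historical candidate data."""
    def mean_rating(history: list[dict]) -> float:
        # Average employer rating over prior engagements; 0.0 when no history exists.
        return sum(h.get("rating", 0.0) for h in history) / len(history) if history else 0.0

    return [
        *request_features.values(),
        float(len(local_history)), mean_rating(local_history),
        float(len(remote_history)), mean_rating(remote_history),
    ]


class ToyPredictiveModel:
    """Stand-in for the predictive model(s) 108; a real system would train an ML model instead."""

    def predict_employment_parameters(self, features: list[float]) -> dict[str, float]:
        score = min(1.0, sum(features) / (10.0 * len(features)))  # deterministic placeholder score
        return {"match_score": score, "recommended_rate": 500.0 + 200.0 * score}


if __name__ == "__main__":
    request = {"urgency": 1.0, "duration_days": 7.0, "hashtag_overlap": 0.67}
    vector = build_feature_vector(request,
                                  local_history=[{"rating": 4.5}],
                                  remote_history=[{"rating": 4.0}, {"rating": 3.5}])
    print(ToyPredictiveModel().predict_employment_parameters(vector))
```
-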
FIG. 1B illustrates a network diagram of a system for an AI-based automated matching of employment candidates to an employment request and providing notifications to the selected employment candidates and receiving employment consensus over a blockchain consistent with the present disclosure. - Referring to
FIG. 1B, the example network 100′ includes the matching server (MS) node 102 connected to a cloud server node(s) 105 over a network. The MS node 102 is configured to host an AI/ML module 107. The MS node 102 may receive employment request data from an employer entity 111. The employment request data may have hashtags representing the employment parameters. In one embodiment, the employment request data may be processed by the MS node 102 to parse out the hashtags or to generate hashtags from the job description.
- The MS node 102 may query a local candidates' database for the historical local candidates' data 103 associated with the current employment request data. The MS node 102 may acquire relevant remote candidates' data 106 from a remote database residing on a cloud server 105. The remote candidates' data 106 may be collected from other employer entities (e.g., medical facilities). The remote candidates' data 106 may be collected from employment candidates that had the same (or similar) qualifications, age, gender, race, etc. as the local candidates who are associated with the current employment request data.
- The MS node 102 may generate a feature vector or classifier data based on the employment request data and the collected candidates' data (i.e., pre-stored local data 103 and remote data 106). The MS node 102 may ingest the feature vector data into an AI/ML module 107. The AI/ML module 107 may generate a predictive model(s) 108 based on the feature vector data to predict employment parameters for automatically generating a notification(s) to be provided to employment candidates 113 (e.g., nurses, doctor(s), care providers, etc.). The employment parameters may be further analyzed by the MS node 102 prior to generation of the notification(s). In one embodiment, the employment parameters may be used for adjustment of the hiring schedule based on availability of the selected (i.e., matched) employment candidates.
- In one embodiment, the MS node 102 may receive the predicted employment parameters from a permissioned blockchain 110 ledger 109 based on a consensus from the employment candidates' (devices) 113 confirming, for example, dates and compensation for employment. Additionally, confidential historical candidate-related information and previous candidate-related employment parameters may also be acquired from the permissioned blockchain 110. The newly acquired candidate-related data with corresponding predicted employment parameters data may also be recorded on the ledger 109 of the blockchain 110 so it can be used as training data for the predictive model(s) 108. In this implementation, the MS node 102, the cloud server 105, the employment candidate devices 113, and the employer entities 111 may serve as blockchain 110 peer nodes. In one embodiment, local candidates' data 103 and remote candidates' data 106 may be duplicated on the blockchain ledger 109 for higher security of storage.
- The AI/ML module 107 may generate a predictive model(s) 108 to predict the employment parameters for the employment candidates in response to the specific relevant pre-stored candidates'-related data acquired from the blockchain 110 ledger 109. This way, the current employment parameters may be predicted based not only on the current employment request-related data and current candidates'-related data, but also based on the previously collected heuristics and candidates'-related data associated with the given employment request data or current employment request parameters derived from the employment request data.
-
FIG. 2 illustrates a network diagram of a system including detailed features of a matching server (MS) node consistent with the present disclosure. - Referring to
FIG. 2, the example network 200 includes the MS node 102 connected to employer device(s) to receive employment request data 201. The MS node 102 is configured to host an AI/ML module 107. As discussed above with respect to FIGS. 1A-B, the MS node 102 may receive employment request data provided by the employer entities 101 (FIG. 1A) and pre-stored candidates' data retrieved from local and remote databases. As discussed above, the pre-stored candidates' data may be retrieved from the ledger 109 of the blockchain 110.
- The AI/ML module 107 may generate a predictive model(s) 108 based on the received employment request data 201 and the candidates'-related data provided by the MS node 102. As discussed above, the AI/ML module 107 may provide predictive output data in the form of employment parameters for automatic generation of notifications for the employment candidates or for adjusting the employment schedule for the candidates. The MS node 102 may process the predictive output data received from the AI/ML module 107 to generate the notification of a current risk assessment ranking pertaining to a particular matched employment candidate. - In one embodiment, the
MS node 102 may acquire employment request data from the employment entities periodically in order to check if new notifications need to be generated or the hiring schedule needs to be reset. In another embodiment, the MS node 102 may continually monitor candidates'-related data acquired from the databases/blockchain ledger and may detect a parameter that deviates from a previously recorded parameter (or from a median reading value) by a margin that exceeds a threshold value pre-set for this particular parameter. For example, if a candidate's declared employment duration changes, this may cause a drastic change in this candidate's employment parameters. As another non-limiting example, a significant increase in a candidate's desired compensation may also cause critical changes in the candidate's employment possibilities. Accordingly, once the threshold is met or exceeded by at least one employment parameter of the candidate, the MS node 102 may provide the currently acquired employment candidate parameter to the AI/ML module 107 to generate a list of updated employment parameters based on the current candidate's employment conditions and requirements.
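- As an illustration only, the following Python sketch shows one way such threshold-based monitoring could be expressed; the parameter names and margin values are hypothetical and not prescribed by this disclosure.

```python
def exceeds_threshold(current: float, previous: float, margin: float) -> bool:
    """True when a monitored parameter deviates from its previously recorded value by more than the pre-set margin."""
    return abs(current - previous) > margin


def parameters_requiring_update(previous: dict[str, float],
                                current: dict[str, float],
                                margins: dict[str, float]) -> list[str]:
    """Return the names of candidate parameters whose change warrants re-running the AI/ML module."""
    return [
        name for name, value in current.items()
        if name in previous and exceeds_threshold(value, previous[name], margins.get(name, float("inf")))
    ]


if __name__ == "__main__":
    previous = {"declared_duration_weeks": 4.0, "desired_compensation": 1500.0}
    current = {"declared_duration_weeks": 1.0, "desired_compensation": 2400.0}
    margins = {"declared_duration_weeks": 2.0, "desired_compensation": 500.0}
    # Both parameters deviate by more than their pre-set margins, so updated
    # employment parameters would be requested from the AI/ML module 107.
    print(parameters_requiring_update(previous, current, margins))
```
- While this example describes in detail only one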
MS node 102, multiple such nodes may be connected to the network and to the blockchain 110. It should be understood that the MS node 102 may include additional components and that some of the components described herein may be removed and/or modified without departing from a scope of the MS node 102 disclosed herein. The MS node 102 may be a computing device or a server computer, or the like, and may include a processor 204, which may be a semiconductor-based microprocessor, a central processing unit (CPU), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and/or another hardware device. Although a single processor 204 is depicted, it should be understood that the MS node 102 may include multiple processors, multiple cores, or the like, without departing from the scope of the MS node 102 system.
- The MS node 102 may also include a non-transitory computer-readable medium 212 that may have stored thereon machine-readable instructions executable by the processor 204. Examples of the machine-readable instructions are shown as 214-222 and are further discussed below. Examples of the non-transitory computer-readable medium 212 may include an electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. For example, the non-transitory computer-readable medium 212 may be a Random-Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a hard disk, an optical disc, or other type of storage device. - The
processor 204 may fetch, decode, and execute the machine-readable instructions 214 to receive employment request data from the employer entity node 111. The processor 204 may fetch, decode, and execute the machine-readable instructions 216 to parse the employment request data to derive a plurality of features. The processor 204 may fetch, decode, and execute the machine-readable instructions 218 to query a local candidates database 103 to retrieve local historical candidate-related data collected at a location of previous employment based on the plurality of features. The processor 204 may fetch, decode, and execute the machine-readable instructions 220 to generate at least one feature vector based on the plurality of features and the historical candidate-related data.
- The processor 204 may fetch, decode, and execute the machine-readable instructions 222 to provide the at least one feature vector to the ML module for generating a predictive model configured to produce at least one employment parameter for generation of an employment-related notification to the at least one candidate entity node. The permissioned blockchain 110 may be configured to use one or more smart contracts that manage transactions for multiple participating nodes and for recording the transactions on the ledger 109.
-
FIG. 3A illustrates a flowchart of a method for an AI-based automated matching of employment candidates to an employment request consistent with the present disclosure. - Referring to
FIG. 3A, the method 300 may include one or more of the steps described below. FIG. 3A illustrates a flow chart of an example method executed by the MS 102 (see FIG. 2). It should be understood that method 300 depicted in FIG. 3A may include additional operations and that some of the operations described therein may be removed and/or modified without departing from the scope of the method 300. The description of the method 300 is also made with reference to the features depicted in FIG. 2 for purposes of illustration. Particularly, the processor 204 of the MS node 102 may execute some or all of the operations included in the method 300.
- With reference to FIG. 3A, at block 302, the processor 204 may receive employment request data from the employer entity node. At block 304, the processor 204 may parse the employment request data to derive a plurality of features. At block 306, the processor 204 may query a local candidates database to retrieve local historical candidate-related data collected at a location of previous employment based on the plurality of features. At block 308, the processor 204 may generate at least one feature vector based on the plurality of features and the historical candidate-related data. At block 310, the processor 204 may provide the at least one feature vector to the ML module for generating a predictive model configured to produce at least one employment parameter for generation of an employment-related notification to the at least one candidate entity node.
-
FIG. 3B illustrates a further flow chart of a method for the automated matching of the employment candidates to the employment request consistent with the present disclosure. Referring to FIG. 3B, the method 300′ may include one or more of the steps described below. FIG. 3B illustrates a flow chart of an example method executed by the MS 102 (see FIG. 2). It should be understood that method 300′ depicted in FIG. 3B may include additional operations and that some of the operations described therein may be removed and/or modified without departing from the scope of the method 300′. The description of the method 300′ is also made with reference to the features depicted in FIG. 2 for purposes of illustration. Particularly, the processor 204 of the MS 102 may execute some or all of the operations included in the method 300′.
- With reference to FIG. 3B, at block 314, the processor 204 may generate at least one rescheduling parameter for resetting a hiring schedule associated with the employment request data based on the at least one employment parameter. At block 316, the processor 204 may retrieve remote historical candidate-related data from at least one remote candidates' database based on the local historical candidate-related data, wherein the remote historical candidate-related data is collected at locations associated with a plurality of employer entities affiliated with the employer entity node. At block 318, the processor 204 may generate the at least one feature vector based on the plurality of features and the local historical candidate-related data combined with the remote historical candidate-related data. At block 320, the processor 204 may parse the employment request data to generate a plurality of hashtags. - At
block 322, the processor 204 may generate the plurality of features based on the plurality of hashtags. At block 324, the processor 204 may continuously monitor incoming employment request data to determine if at least one value of employment request parameters deviates from a previous value of a previous employment request data by a margin exceeding a pre-set threshold value. At block 326, the processor 204 may, responsive to the at least one value deviating from the previous value by the margin exceeding the pre-set threshold value, generate an updated feature vector based on the incoming employment request data and generate the notification based on the at least one employment parameter produced by the predictive model in response to the updated feature vector. At block 328, the processor 204 may record the at least one employment parameter on a blockchain ledger along with the employment request data. At block 330, the processor 204 may retrieve the at least one employment parameter from the blockchain responsive to a consensus among the employer entity node and the at least one candidate entity node. At block 332, the processor 204 may execute a smart contract to record data reflecting rescheduling of hiring of a candidate associated with the at least one candidate entity node on the blockchain for future audits. - In one disclosed embodiment, the employment parameters' model may be generated by the AI/
ML module 107 that may use training data sets to improve accuracy of the prediction of the employment parameters for the candidate 113 (FIG. 1A). The employment parameters used in training data sets may be stored in a centralized local database (such as one used for storing local candidates' data 103 depicted in FIG. 1A). In one embodiment, a neural network may be used in the AI/ML module 107 for employment parameters modeling and hiring rescheduling predictions.
- In another embodiment, the AI/ML module 107 may use a decentralized storage such as a blockchain 110 (see FIG. 1B) that is a distributed storage system, which includes multiple nodes that communicate with each other. The decentralized storage includes an append-only immutable data structure resembling a distributed ledger capable of maintaining records between mutually untrusted parties. The untrusted parties are referred to herein as peers or peer nodes. Each peer maintains a copy of the parameter(s) records and no single peer can modify the records without a consensus being reached among the distributed peers. For example, the peers 101, 113 and 102 (FIG. 1B) may execute a consensus protocol to validate blockchain 110 storage transactions, group the storage transactions into blocks, and build a hash chain over the blocks. This process forms the ledger 109 by ordering the storage transactions, as is necessary, for consistency. In various embodiments, a permissioned and/or a permissionless blockchain can be used. In a public or permissionless blockchain, anyone can participate without a specific identity. Public blockchains can involve assets and use consensus based on various protocols such as Proof of Work (PoW). On the other hand, a permissioned blockchain provides secure interactions among a group of entities which share a common goal, such as storing employment parameters for efficient matching of employment candidates, but which do not fully trust one another.
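- The hash-chained block structure described above can be illustrated with the following minimal, framework-agnostic Python sketch; it omits endorsement, consensus, and networking entirely, and the transaction fields are hypothetical, so it should be read only as an illustration of the ordering and chaining idea.

```python
import hashlib
import json


def block_hash(transactions: list[dict], previous_hash: str) -> str:
    """Hash a block's ordered transactions together with the previous block's hash."""
    payload = json.dumps({"prev": previous_hash, "txs": transactions}, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


def build_ledger(transaction_batches: list[list[dict]]) -> list[dict]:
    """Group transactions into ordered blocks and chain them by hash, forming a simple ledger."""
    ledger: list[dict] = []
    previous_hash = "0" * 64  # genesis placeholder
    for transactions in transaction_batches:
        digest = block_hash(transactions, previous_hash)
        ledger.append({"prev": previous_hash, "hash": digest, "txs": transactions})
        previous_hash = digest
    return ledger


if __name__ == "__main__":
    batches = [
        [{"candidate": 113, "employment_parameter": "start_date=2023-07-01"}],
        [{"candidate": 113, "employment_parameter": "rate=650"}],
    ]
    for block in build_ledger(batches):
        print(block["hash"][:16], len(block["txs"]), "tx(s)")
```

- Tampering with any recorded transaction changes that block's hash and breaks the chain for every later block, which is the property the peers' consensus protocol relies on when copies of the ledger are compared.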
- In the example depicted in
FIG. 4 , a host platform 420 (such as the MS node 102) builds and deploys a machine learning model for predictive monitoring ofassets 430. Here, thehost platform 420 may be a cloud platform, an industrial server, a web server, a personal computer, a user device, and the like.Assets 430 can represent notifications or candidates' employment parameters. Theblockchain 110 can be used to significantly improve both atraining process 402 of the machine learning model and the employment parameters' predictive process 405 based on a trained machine learning model. For example, in 402, rather than requiring a data scientist/engineer or other user to collect the data, historical data (heuristics—i.e., candidate-related data) may be stored by theassets 430 themselves (or through an intermediary, not shown) on theblockchain 110. - This can significantly reduce the collection time needed by the
host platform 420 when performing predictive model training. For example, using smart contracts, data can be directly and reliably transferred straight from its place of origin (e.g., from the employment entities or from candidates' database) to theblockchain 110. By using theblockchain 110 to ensure the security and ownership of the collected data, smart contracts may directly send the data from the assets to the entities that use the data for building a machine learning model. This allows for sharing of data among theassets 430. The collected data may be stored in theblockchain 110 based on a consensus mechanism. The consensus mechanism pulls in (permissioned nodes) to ensure that the data being recorded is verified and accurate. The data recorded is time-stamped, cryptographically signed, and immutable. It is therefore auditable, transparent, and secure. - Furthermore, training of the machine learning model on the collected data may take rounds of refinement and testing by the
host platform 420. Each round may be based on additional data or data that was not previously considered to help expand the knowledge of the machine learning model. In 402, the different training and testing steps (and the data associated therewith) may be stored on theblockchain 110 by thehost platform 420. Each refinement of the machine learning model (e.g., changes in variables, weights, etc.) may be stored on theblockchain 110. This provides verifiable proof of how the model was trained and what data was used to train the model. Furthermore, when thehost platform 420 has achieved a finally trained model, the resulting model itself may be stored on theblockchain 110. - After the model has been trained, it may be deployed to a live environment where it can make employment-related predictions/decisions based on the execution of the final trained machine learning model using the employment parameters. In this example, data fed back from the
asset 430 may be input into the machine learning model and may be used to make event predictions such as most optimal candidate employment and scheduling parameters for re-setting the employment contracts for the given employment request. Determinations made by the execution of the machine learning model (e.g., notification or rescheduling parameters, etc.) at thehost platform 420 may be stored on theblockchain 110 to provide auditable/verifiable proof. As one non-limiting example, the machine learning model may predict a future change of a part of the asset 430 (the alert parameters—i.e., assessment of risk of employment). The data behind this decision may be stored by thehost platform 420 on theblockchain 110. - As discussed above, in one embodiment, the features and/or the actions described and/or depicted herein can occur on or with respect to the
blockchain 110. The above embodiments of the present disclosure may be implemented in hardware, in a computer-readable instructions executed by a processor, in firmware, or in a combination of the above. The computer computer-readable instructions may be embodied on a computer-readable medium, such as a storage medium. For example, the computer computer-readable instructions may reside in random access memory (“RAM”), flash memory, read-only memory (“ROM”), erasable programmable read-only memory (“EPROM”), electrically erasable programmable read-only memory (“EEPROM”), registers, hard disk, a removable disk, a compact disk read-only memory (“CD-ROM”), or any other form of storage medium known in the art. - An exemplary storage medium may be coupled to the processor such that the processor may read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application specific integrated circuit (“ASIC”). In the alternative embodiment, the processor and the storage medium may reside as discrete components. For example,
FIG. 5 illustrates an example computing device (e.g., a server node) 500, which may represent or be integrated in any of the above-described components, etc. -
FIG. 5 illustrates a block diagram of a system including computing device 500. The computing device 500 may comprise, but not be limited to, the following:
- A supercomputer, an exa-scale supercomputer, a mainframe, or a quantum computer;
- A minicomputer, wherein the minicomputer computing device comprises, but is not limited to, an IBM AS500/iSeries/System I, A DEC VAX/PDP, a HP3000, a Honeywell-Bull DPS, a Texas Instruments TI-990, or a Wang Laboratories VS Series;
- A microcomputer, wherein the microcomputer computing device comprises, but is not limited to, a server, wherein a server may be rack mounted, a workstation, an industrial device, a raspberry pi, a desktop, or an embedded device;
- The MS node 102 (see
FIG. 2 ) may be hosted on a centralized server or on a cloud computing service. Althoughmethod 300 has been described to be performed by theMS node 102 implemented on acomputing device 500, it should be understood that, in some embodiments, different operations may be performed by a plurality of thecomputing devices 500 in operative communication at least one network. - Embodiments of the present disclosure may comprise a computing device having a central processing unit (CPU) 520, a
bus 530, amemory unit 550, a power supply unit (PSU) 550, and one or more Input/Output (I/O) units. TheCPU 520 coupled to thememory unit 550 and the plurality of I/O units 560 via thebus 530, all of which are powered by thePSU 550. It should be understood that, in some embodiments, each disclosed unit may actually be a plurality of such units for the purposes of redundancy, high availability, and/or performance. The combination of the presently disclosed units is configured to perform the stages any method disclosed herein. - Consistent with an embodiment of the disclosure, the
aforementioned CPU 520, thebus 530, thememory unit 550, aPSU 550, and the plurality of I/O units 560 may be implemented in a computing device, such ascomputing device 500. Any suitable combination of hardware, software, or firmware may be used to implement the aforementioned units. For example, theCPU 520, thebus 530, and thememory unit 550 may be implemented withcomputing device 500 or any ofother computing devices 500, in combination withcomputing device 500. The aforementioned system, device, and components are examples and other systems, devices, and components may comprise theaforementioned CPU 520, thebus 530, thememory unit 550, consistent with embodiments of the disclosure. - At least one
computing device 500 may be embodied as any of the computing elements illustrated in all of the attached figures, including the design server node 102 (FIG. 2 ). Acomputing device 500 does not need to be electronic, nor even have aCPU 520, norbus 530, normemory unit 550. The definition of thecomputing device 500 to a person having ordinary skill in the art is “A device that computes, especially a programmable [usually] electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information.” Any device which processes information qualifies as acomputing device 500, especially if the processing is purposeful. - With reference to
FIG. 5 , a system consistent with an embodiment of the disclosure may include a computing device, such as computing device 500. In a basic configuration, computing device 500 may include at least one clock module 510, at least one CPU 520, at least one bus 530, at least one memory unit 550, at least one PSU 550, and at least one I/O module 560, wherein the I/O module may comprise, but is not limited to, a non-volatile storage sub-module 561, a communication sub-module 562, a sensors sub-module 563, and a peripherals sub-module 565. - Consistent with an embodiment of the disclosure, the
computing device 500 may include the clock module 510, which may be known to a person having ordinary skill in the art as a clock generator, which produces clock signals. A clock signal is a particular type of signal that oscillates between a high and a low state and is used like a metronome to coordinate actions of digital circuits. Most integrated circuits (ICs) of sufficient complexity use a clock signal in order to synchronize different parts of the circuit, cycling at a rate slower than the worst-case internal propagation delays. The preeminent example of the aforementioned integrated circuit is the CPU 520, the central component of modern computers, which relies on a clock. The only exceptions are asynchronous circuits such as asynchronous CPUs. The clock 510 can comprise a plurality of embodiments, such as, but not limited to, a single-phase clock which transmits all clock signals on effectively one wire, a two-phase clock which distributes clock signals on two wires, each with non-overlapping pulses, and a four-phase clock which distributes clock signals on four wires. -
Many computing devices 500 use a "clock multiplier" which multiplies a lower-frequency external clock to the appropriate clock rate of the CPU 520. This allows the CPU 520 to operate at a much higher frequency than the rest of the computer, which affords performance gains in situations where the CPU 520 does not need to wait on an external factor (like memory 550 or input/output 560). Some embodiments of the clock 510 may include dynamic frequency change, where the time between clock edges can vary widely from one edge to the next and back again. - Consistent with an embodiment of the disclosure, the
computing device 500 may include theCPU unit 520 comprising at least oneCPU Core 521. A plurality ofCPU cores 521 may compriseidentical CPU cores 521, such as, but not limited to, homogeneous multi-core systems. It is also possible for the plurality ofCPU cores 521 to comprisedifferent CPU cores 521, such as, but not limited to, heterogeneous multi-core systems, big.LITTLE systems and some AMD accelerated processing units (APU). TheCPU unit 520 reads and executes program instructions which may be used across many application domains, for example, but not limited to, general purpose computing, embedded computing, network computing, digital signal processing (DSP), and graphics processing (GPU). TheCPU unit 520 may run multiple instructions onseparate CPU cores 521 at the same time. TheCPU unit 520 may be integrated into at least one of a single integrated circuit die and multiple dies in a single chip package. The single integrated circuit die and multiple dies in a single chip package may contain a plurality of other aspects of thecomputing device 500, for example, but not limited to, theclock 510, theCPU 520, thebus 530, thememory 550, and I/O 560. - The
CPU unit 520 may contain cache 522 such as, but not limited to, a level 1 cache, a level 2 cache, a level 3 cache, or a combination thereof. The aforementioned cache 522 may or may not be shared amongst a plurality of CPU cores 521. Where the cache 522 is shared, at least one of message passing and inter-core communication methods may be used for the at least one CPU core 521 to communicate with the cache 522. The inter-core communication methods may comprise, but are not limited to, bus, ring, two-dimensional mesh, and crossbar. The aforementioned CPU unit 520 may employ a symmetric multiprocessing (SMP) design. - The plurality of the
aforementioned CPU cores 521 may comprise soft microprocessor cores on a single field programmable gate array (FPGA), such as semiconductor intellectual property cores (IP Core). The plurality ofCPU cores 521 architecture may be based on at least one of, but not limited to, Complex instruction set computing (CISC), Zero instruction set computing (ZISC), and Reduced instruction set computing (RISC). At least one of the performance-enhancing methods may be employed by the plurality of theCPU cores 521, for example, but not limited to Instruction-level parallelism (ILP) such as, but not limited to, superscalar pipelining, and Thread-level parallelism (TLP). - Consistent with the embodiments of the present disclosure, the
aforementioned computing device 500 may employ a communication system that transfers data between components inside the aforementioned computing device 500, and/or between the plurality of computing devices 500. The aforementioned communication system will be known to a person having ordinary skill in the art as a bus 530. The bus 530 may embody an internal and/or external plurality of hardware and software components, for example, but not limited to, a wire, optical fiber, communication protocols, and any physical arrangement that provides the same logical function as a parallel electrical bus. The bus 530 may comprise at least one of, but not limited to, a parallel bus, wherein the parallel bus carries data words in parallel on multiple wires, and a serial bus, wherein the serial bus carries data in bit-serial form (a simplified cycle-count comparison of the two appears in the sketch after the following list). The bus 530 may embody a plurality of topologies, for example, but not limited to, a multidrop/electrical parallel topology, a daisy chain topology, and a topology connected by switched hubs, such as a USB bus. The bus 530 may comprise a plurality of embodiments, for example, but not limited to: -
- Internal data bus (data bus) 531/Memory bus
- Control bus 532
- Address bus 533
- System Management Bus (SMBus)
- Front-Side-Bus (FSB)
- External Bus Interface (EBI)
- Local bus
- Expansion bus
- Lightning bus
- Controller Area Network (CAN bus)
- Camera Link
- ExpressCard
- Advanced Technology Attachment (ATA), including embodiments and derivatives such as, but not limited to, Integrated Drive Electronics (IDE)/Enhanced IDE (EIDE), ATA Packet Interface (ATAPI), Ultra-Direct Memory Access (UDMA), Ultra ATA (UATA)/Parallel ATA (PATA)/Serial ATA (SATA), CompactFlash (CF) interface, Consumer Electronics ATA (CE-ATA)/Fiber Attached Technology Adapted (FATA), Advanced Host Controller Interface (AHCI), SATA Express (SATAe)/External SATA (eSATA), including the powered embodiment eSATAp/Mini-SATA (mSATA), and Next Generation Form Factor (NGFF)/M.2.
- Small Computer System Interface (SCSI)/Serial Attached SCSI (SAS)
- HyperTransport
- InfiniBand
- RapidIO
- Mobile Industry Processor Interface (MIPI)
- Coherent Accelerator Processor Interface (CAPI)
- Plug-n-play
- 1-Wire
- Peripheral Component Interconnect (PCI), including embodiments such as, but not limited to, Accelerated Graphics Port (AGP), Peripheral Component Interconnect extended (PCI-X), Peripheral Component Interconnect Express (PCI-e) (e.g., PCI Express Mini Card, PCI Express M.2 [Mini PCIe v2], PCI Express External Cabling [ePCIe], and PCI Express OCuLink [Optical Copper {Cu} Link]), Express Card, AdvancedTCA, AMC, Universal IO, Thunderbolt/Mini DisplayPort, Mobile PCIe (M-PCIe), U.2, and Non-Volatile Memory Express (NVMe)/Non-Volatile Memory Host Controller Interface Specification (NVMHCIS).
- Industry Standard Architecture (ISA), including embodiments such as, but not limited to, Extended ISA (EISA), PC/XT-bus/PC/AT-bus/PC/104 bus (e.g., PC/104-Plus, PCI/104-Express, PCI/104, and PCI-104), and Low Pin Count (LPC).
- Music Instrument Digital Interface (MIDI)
- Universal Serial Bus (USB), including embodiments such as, but not limited to, Media Transfer Protocol (MTP)/Mobile High-Definition Link (MHL), Device Firmware Upgrade (DFU), wireless USB, InterChip USB, IEEE 1394 Interface/FireWire, Thunderbolt, and eXtensible Host Controller Interface (xHCI).
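By way of a non-limiting illustration of the parallel versus serial distinction noted above, the following sketch compares how many transfer cycles each style of bus would need to move the same payload; the bus width, payload, and function names are illustrative assumptions only and ignore clocking and protocol overhead.

```python
def parallel_bus_cycles(data: bytes, width_bits: int = 32) -> int:
    """Bus cycles needed to move `data` over a parallel bus `width_bits` wide."""
    total_bits = len(data) * 8
    return -(-total_bits // width_bits)  # ceiling division


def serial_bus_cycles(data: bytes) -> int:
    """Bit times needed to move `data` over a single-wire serial bus."""
    return len(data) * 8


payload = b"employment request"            # 18 bytes = 144 bits
print(parallel_bus_cycles(payload))        # 5 cycles on a 32-bit-wide parallel bus
print(serial_bus_cycles(payload))          # 144 bit times on a serial line
```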
- Consistent with the embodiments of the present disclosure, the
aforementioned computing device 500 may employ hardware integrated circuits that store information for immediate use in the computing device 500, known to a person having ordinary skill in the art as primary storage or memory 550. The memory 550 operates at high speed, distinguishing it from the non-volatile storage sub-module 561, which may be referred to as secondary or tertiary storage, and which provides slow-to-access information but offers higher capacities at lower cost. The contents contained in memory 550 may be transferred to secondary storage via techniques such as, but not limited to, virtual memory and swap. The memory 550 may be associated with addressable semiconductor memory, such as integrated circuits consisting of silicon-based transistors, used for example as primary storage but also for other purposes in the computing device 500. The memory 550 may comprise a plurality of embodiments, such as, but not limited to, volatile memory, non-volatile memory, and semi-volatile memory. It should be understood by a person having ordinary skill in the art that the ensuing are non-limiting examples of the aforementioned memory: -
- Volatile memory which requires power to maintain stored information, for example, but not limited to, Dynamic Random-Access Memory (DRAM) 551, Static Random-Access Memory (SRAM) 552,
CPU Cache memory 525, Advanced Random-Access Memory (A-RAM), and other types of primary storage such as Random-Access Memory (RAM). - Non-volatile memory, which can retain stored information even after power is removed, for example, but not limited to, Read-Only Memory (ROM) 553, Programmable ROM (PROM) 555, Erasable PROM (EPROM) 555, Electrically Erasable PROM (EEPROM) 556 (e.g., flash memory and Electrically Alterable PROM [EAPROM]), Mask ROM (MROM), One Time Programmable (OTP) ROM/Write Once Read Many (WORM), Ferroelectric RAM (FeRAM), Phase-change RAM (PRAM), Spin-Transfer Torque RAM (STT-RAM), Silicon-Oxide-Nitride-Oxide-Silicon (SONOS), Resistive RAM (RRAM), Nano RAM (NRAM), 3D XPoint, Domain-Wall Memory (DWM), and millipede memory.
- Semi-volatile memory which may have some limited non-volatile duration after power is removed but loses data after said duration has passed. Semi-volatile memory provides high performance, durability, and other valuable characteristics typically associated with volatile memory, while providing some benefits of true non-volatile memory. The semi-volatile memory may comprise volatile and non-volatile memory and/or volatile memory with battery to provide power after power is removed. The semi-volatile memory may comprise, but not limited to spin-transfer torque RAM (STT-RAM).
- Consistent with the embodiments of the present disclosure, the
aforementioned computing device 500 may employ the communication system between an information processing system, such as thecomputing device 500, and the outside world, for example, but not limited to, human, environment, and anothercomputing device 500. The aforementioned communication system will be known to a person having ordinary skill in the art as I/O 560. The I/O module 560 regulates a plurality of inputs and outputs with regard to thecomputing device 500, wherein the inputs are a plurality of signals and data received by thecomputing device 500, and the outputs are the plurality of signals and data sent from thecomputing device 500. The I/O module 560 interfaces a plurality of hardware, such as, but not limited to,non-volatile storage 561,communication devices 562,sensors 563, and peripherals 565. The plurality of hardware is used by the at least one of, but not limited to, human, environment, and anothercomputing device 500 to communicate with thepresent computing device 500. The I/O module 560 may comprise a plurality of forms, for example, but not limited to channel I/O, port mapped I/O, asynchronous I/O, and Direct Memory Access (DMA). - Consistent with the embodiments of the present disclosure, the
aforementioned computing device 500 may employ thenon-volatile storage sub-module 561, which may be referred to by a person having ordinary skill in the art as one of secondary storage, external memory, tertiary storage, off-line storage, and auxiliary storage. Thenon-volatile storage sub-module 561 may not be accessed directly by theCPU 520 without using intermediate area in thememory 550. Thenon-volatile storage sub-module 561 does not lose data when power is removed and may be two orders of magnitude less costly than storage used in memory module, at the expense of speed and latency. Thenon-volatile storage sub-module 561 may comprise a plurality of forms, such as, but not limited to, Direct Attached Storage (DAS), Network Attached Storage (NAS), Storage Area Network (SAN), nearline storage, Massive Array of Idle Disks (MAID), Redundant Array of Independent Disks (RAID), device mirroring, off-line storage, and robotic storage. The non-volatile storage sub-module (561) may comprise a plurality of embodiments, such as, but not limited to: - Optical storage, for example, but not limited to, Compact Disk (CD) (CD-ROM/CD-R/CD-RW), Digital Versatile Disk (DVD) (DVD-ROM/DVD-R/DVD+R/DVD-RW/DVD+RW/DVD+RW/DVD+R DL/DVD-RAM/HD-DVD), Blu-ray Disk (BD) (BD-ROM/BD-R/BD-RE/BD-R DL/BD-RE DL), and Ultra-Density Optical (UDO).
- Semiconductor storage, for example, but not limited to, flash memory, such as, but not limited to, USB flash drive, Memory card, Subscriber Identity Module (SIM) card, Secure Digital (SD) card, Smart Card, CompactFlash (CF) card, Solid-State Drive (SSD) and memristor.
- Magnetic storage such as, but not limited to, Hard Disk Drive (HDD), tape drive, carousel memory, and Card Random-Access Memory (CRAM).
- Phase-change memory
- Holographic data storage such as Holographic Versatile Disk (HVD).
- Molecular Memory
- Deoxyribonucleic Acid (DNA) digital data storage
- Consistent with the embodiments of the present disclosure, the
aforementioned computing device 500 may employ the communication sub-module 562 as a subset of the I/O 560, which may be referred to by a person having ordinary skill in the art as at least one of, but not limited to, a computer network, a data network, and a network. The network allows computing devices 500 to exchange data using connections, which may be known to a person having ordinary skill in the art as data links, between network nodes. The nodes comprise network computing devices 500 that originate, route, and terminate data. The nodes are identified by network addresses and can include a plurality of hosts consistent with the embodiments of a computing device 500. The aforementioned embodiments include, but are not limited to, personal computers, phones, servers, drones, and networking devices such as, but not limited to, hubs, switches, routers, modems, and firewalls. - Two nodes can be said to be networked together when one
computing device 500 is able to exchange information with the other computing device 500, whether or not they have a direct connection with each other. The communication sub-module 562 supports a plurality of applications and services, such as, but not limited to, the World Wide Web (WWW), digital video and audio, shared use of application and storage computing devices 500, printers/scanners/fax machines, email/online chat/instant messaging, remote control, distributed computing, etc. The network may comprise a plurality of transmission mediums, such as, but not limited to, conductive wire, fiber optics, and wireless. The network may comprise a plurality of communications protocols to organize network traffic, wherein application-specific communications protocols are layered (which may be known to a person having ordinary skill in the art as being carried as payload) over other more general communications protocols. The plurality of communications protocols may comprise, but is not limited to, IEEE 802, Ethernet, Wireless LAN (WLAN/Wi-Fi), the Internet Protocol (IP) suite (e.g., TCP/IP, UDP, Internet Protocol version 4 [IPv4], and Internet Protocol version 6 [IPv6]), Synchronous Optical Networking (SONET)/Synchronous Digital Hierarchy (SDH), Asynchronous Transfer Mode (ATM), and cellular standards (e.g., Global System for Mobile Communications [GSM], General Packet Radio Service [GPRS], Code-Division Multiple Access [CDMA], and Integrated Digital Enhanced Network [IDEN]). - The
communication sub-module 562 may comprise a plurality of sizes, topologies, traffic control mechanisms, and organizational intents. The communication sub-module 562 may comprise a plurality of embodiments, such as, but not limited to: -
- Wired communications, such as, but not limited to, coaxial cable, phone lines, twisted pair cables (ethernet), and InfiniBand.
- Wireless communications, such as, but not limited to, communications satellites, cellular systems, radio frequency/spread spectrum technologies, IEEE 802.11 Wi-Fi, Bluetooth, NFC, free-space optical communications, terrestrial microwave, and Infrared (IR) communications, wherein cellular systems embody technologies such as, but not limited to, 3G, 4G (such as WiMAX and LTE), and 5G (short and long wavelength).
- Parallel communications, such as, but not limited to, LPT ports.
- Serial communications, such as, but not limited to, RS-232 and USB.
- Fiber Optic communications, such as, but not limited to, Single-mode optical fiber (SMF) and Multi-mode optical fiber (MMF).
- Power Line and wireless communications
- The aforementioned network may comprise a plurality of layouts, such as, but not limited to, a bus network such as Ethernet, a star network such as Wi-Fi, a ring network, a mesh network, a fully connected network, and a tree network. The network can be characterized by its physical capacity or its organizational purpose. Use of the network, including user authorization and access rights, differs accordingly. The characterization may include, but is not limited to, a nanoscale network, Personal Area Network (PAN), Local Area Network (LAN), Home Area Network (HAN), Storage Area Network (SAN), Campus Area Network (CAN), backbone network, Metropolitan Area Network (MAN), Wide Area Network (WAN), enterprise private network, Virtual Private Network (VPN), and Global Area Network (GAN).
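By way of a non-limiting illustration of two networked nodes exchanging data as described above, the following loopback sketch stands in for the MS node sending an employment-related notification to a candidate entity node over the IP suite; the address, port, and payload shape are illustrative assumptions only and are not defined by this disclosure.

```python
import json
import socket
import threading

# Loopback stand-ins for two networked nodes; in a deployed embodiment the MS node
# and the candidate entity node would be separate hosts reachable over the network.
HOST, PORT = "127.0.0.1", 56001  # hypothetical address and port


def candidate_node_listener(server: socket.socket) -> None:
    """Accept one connection and print the received employment-related notification."""
    conn, _ = server.accept()
    with conn:
        print("candidate node received:", json.loads(conn.recv(4096).decode("utf-8")))
    server.close()


server = socket.create_server((HOST, PORT))
listener = threading.Thread(target=candidate_node_listener, args=(server,))
listener.start()

# The MS node side: open a TCP connection and send a JSON-encoded notification.
with socket.create_connection((HOST, PORT)) as sock:
    sock.sendall(json.dumps({"candidate_id": "C-001", "message": "New matching position"}).encode("utf-8"))

listener.join()
```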
- Consistent with the embodiments of the present disclosure, the
aforementioned computing device 500 may employ the sensors sub-module 563 as a subset of the I/O 560. The sensors sub-module 563 comprises at least one of the devices, modules, and subsystems whose purpose is to detect events or changes in its environment and send the information to the computing device 500. A sensor is sensitive to the measured property, is insensitive to any other property likely to be encountered in its application, and does not significantly influence the measured property. The sensors sub-module 563 may comprise a plurality of digital devices and analog devices, wherein, if an analog device is used, an Analog to Digital (A-to-D) converter must be employed to interface the said device with the computing device 500 (a minimal conversion sketch follows the list of sensor examples below). The sensors may be subject to a plurality of deviations that limit sensor accuracy. The sensors sub-module 563 may comprise a plurality of embodiments, such as, but not limited to, chemical sensors, automotive sensors, acoustic/sound/vibration sensors, electric current/electric potential/magnetic/radio sensors, environmental/weather/moisture/humidity sensors, flow/fluid velocity sensors, ionizing radiation/particle sensors, navigation sensors, position/angle/displacement/distance/speed/acceleration sensors, imaging/optical/light sensors, pressure sensors, force/density/level sensors, thermal/temperature sensors, and proximity/presence sensors. It should be understood by a person having ordinary skill in the art that the ensuing are non-limiting examples of the aforementioned sensors: - Chemical sensors, such as, but not limited to, breathalyzer, carbon dioxide sensor, carbon monoxide/smoke detector, catalytic bead sensor, chemical field-effect transistor, chemiresistor, electrochemical gas sensor, electronic nose, electrolyte-insulator-semiconductor sensor, energy-dispersive X-ray spectroscopy, fluorescent chloride sensors, holographic sensor, hydrocarbon dew point analyzer, hydrogen sensor, hydrogen sulfide sensor, infrared point sensor, ion-selective electrode, nondispersive infrared sensor, microwave chemistry sensor, nitrogen oxide sensor, olfactometer, optode, oxygen sensor, ozone monitor, pellistor, pH glass electrode, potentiometric sensor, redox electrode, zinc oxide nanorod sensor, and biosensors (such as nano-sensors).
- Automotive sensors, such as, but not limited to, air flow meter/mass airflow sensor, air-fuel ratio meter, AFR sensor, blind spot monitor, engine coolant/exhaust gas/cylinder head/transmission fluid temperature sensor, hall effect sensor, wheel/automatic transmission/turbine/vehicle speed sensor, airbag sensors, brake fluid/engine crankcase/fuel/oil/tire pressure sensor, camshaft/crankshaft/throttle position sensor, fuel/oil level sensor, knock sensor, light sensor, MAP sensor, oxygen sensor (o2), parking sensor, radar sensor, torque sensor, variable reluctance sensor, and water-in-fuel sensor.
- Acoustic, sound and vibration sensors, such as, but not limited to, microphone, lace sensor (guitar pickup), seismometer, sound locator, geophone, and hydrophone.
- Electric current, electric potential, magnetic, and radio sensors, such as, but not limited to, current sensor, Daly detector, electroscope, electron multiplier, faraday cup, galvanometer, hall effect sensor, hall probe, magnetic anomaly detector, magnetometer, magnetoresistance, MEMS magnetic field sensor, metal detector, planar hall sensor, radio direction finder, and voltage detector.
- Environmental, weather, moisture, and humidity sensors, such as, but not limited to, actinometer, air pollution sensor, bedwetting alarm, ceilometer, dew warning, electrochemical gas sensor, fish counter, frequency domain sensor, gas detector, hook gauge evaporimeter, humistor, hygrometer, leaf sensor, lysimeter, pyranometer, pyrgeometer, psychrometer, rain gauge, rain sensor, seismometers, SNOTEL, snow gauge, soil moisture sensor, stream gauge, and tide gauge.
- Flow and fluid velocity sensors, such as, but not limited to, air flow meter, anemometer, flow sensor, gas meter, mass flow sensor, and water meter.
- Ionizing radiation and particle sensors, such as, but not limited to, cloud chamber, Geiger counter, Geiger-Muller tube, ionization chamber, neutron detection, proportional counter, scintillation counter, semiconductor detector, and thermoluminescent dosimeter.
- Navigation sensors, such as, but not limited to, air speed indicator, altimeter, attitude indicator, depth gauge, fluxgate compass, gyroscope, inertial navigation system, inertial reference unit, magnetic compass, MHD sensor, ring laser gyroscope, turn coordinator, variometer, vibrating structure gyroscope, and yaw rate sensor.
- Position, angle, displacement, distance, speed, and acceleration sensors, such as, but not limited to, accelerometer, displacement sensor, flex sensor, free fall sensor, gravimeter, impact sensor, laser rangefinder, LIDAR, odometer, photoelectric sensor, position sensor such as, but not limited to, GPS or Glonass, angular rate sensor, shock detector, ultrasonic sensor, tilt sensor, tachometer, ultra-wideband radar, variable reluctance sensor, and velocity receiver.
- Imaging, optical and light sensors, such as, but not limited to, CMOS sensor, LIDAR, multi-spectral light sensor, colorimeter, contact image sensor, electro-optical sensor, infra-red sensor, kinetic inductance detector, LED as light sensor, light-addressable potentiometric sensor, Nichols radiometer, fiber-optic sensors, optical position sensor, thermopile laser sensor, photodetector, photodiode, photomultiplier tubes, phototransistor, photoelectric sensor, photoionization detector, photomultiplier, photoresistor, photoswitch, phototube, scintillometer, Shack-Hartmann, single-photon avalanche diode, superconducting nanowire single-photon detector, transition edge sensor, visible light photon counter, and wavefront sensor.
- Pressure sensors, such as, but not limited to, barograph, barometer, boost gauge, bourdon gauge, hot filament ionization gauge, ionization gauge, McLeod gauge, Oscillating U-tube, permanent downhole gauge, piezometer, Pirani gauge, pressure sensor, pressure gauge, tactile sensor, and time pressure gauge.
- Force, Density, and Level sensors, such as, but not limited to, bhangmeter, hydrometer, force gauge or force sensor, level sensor, load cell, magnetic level or nuclear density sensor or strain gauge, piezocapacitive pressure sensor, piezoelectric sensor, torque sensor, and viscometer.
- Thermal and temperature sensors, such as, but not limited to, bolometer, bimetallic strip, calorimeter, exhaust gas temperature gauge, flame detection/pyrometer, Gardon gauge, Golay cell, heat flux sensor, microbolometer, microwave radiometer, net radiometer, infrared/quartz/resistance thermometer, silicon bandgap temperature sensor, thermistor, and thermocouple.
- Proximity and presence sensors, such as, but not limited to, alarm sensor, doppler radar, motion detector, occupancy sensor, proximity sensor, passive infrared sensor, reed switch, stud finder, triangulation sensor, touch switch, and wired glove.
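As referenced above in connection with the sensors sub-module 563, an analog device must be interfaced through an Analog to Digital (A-to-D) converter; the following minimal sketch illustrates the underlying quantization step, with the reference voltage and bit depth chosen as illustrative assumptions only.

```python
def quantize(voltage: float, v_ref: float = 3.3, bits: int = 12) -> int:
    """Map an analog voltage onto an n-bit digital code, as an A-to-D converter would.

    Values outside [0, v_ref] are clamped; the resolution is v_ref / (2**bits - 1).
    """
    levels = (1 << bits) - 1
    clamped = min(max(voltage, 0.0), v_ref)
    return round(clamped * levels / v_ref)


# A hypothetical 1.65 V reading on a 3.3 V, 12-bit converter maps to mid-scale.
print(quantize(1.65))  # 2048 (approximately half of the 4095 full-scale code)
```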
- Consistent with the embodiments of the present disclosure, the
aforementioned computing device 500 may employ the peripherals sub-module 565 as a subset of the I/O 560. The peripherals sub-module 565 comprises ancillary devices used to put information into and get information out of the computing device 500. There are three categories of devices comprising the peripherals sub-module 565, which exist based on their relationship with the computing device 500: input devices, output devices, and input/output devices. Input devices send at least one of data and instructions to the computing device 500. Input devices can be categorized based on, but not limited to: -
- Modality of input, such as, but not limited to, mechanical motion, audio, visual, and tactile.
- Whether the input is discrete, such as but not limited to, pressing a key, or continuous such as, but not limited to position of a mouse.
- The number of degrees of freedom involved, such as, but not limited to, two-dimensional mice vs three-dimensional mice used for Computer-Aided Design (CAD) applications.
- Output devices provide output from the
computing device 500. Output devices convert electronically generated information into a form that can be presented to humans. Input/output devices are devices that perform both input and output functions. It should be understood by a person having ordinary skill in the art that the ensuing are non-limiting embodiments of the aforementioned peripherals sub-module 565: -
- Human Interface Devices (HID), such as, but not limited to, pointing device (e.g., mouse, touchpad, joystick, touchscreen, game controller/gamepad, remote, light pen, light gun, Wii remote, jog dial, shuttle, and knob), keyboard, graphics tablet, digital pen, gesture recognition devices, magnetic ink character recognition, Sip-and-Puff (SNP) device, and Language Acquisition Device (LAD).
- High degree of freedom devices, that require up to six degrees of freedom such as, but not limited to, camera gimbals, Cave Automatic Virtual Environment (CAVE), and virtual reality systems.
- Video Input devices are used to digitize images or video from the outside world into the
computing device 500. The information can be stored in a multitude of formats depending on the user's requirement. Examples of types of video input devices include, but not limited to, digital camera, digital camcorder, portable media player, webcam, Microsoft Kinect, image scanner, fingerprint scanner, barcode reader, 3D scanner, laser rangefinder, eye gaze tracker, computed tomography, magnetic resonance imaging, positron emission tomography, medical ultrasonography, TV tuner, and iris scanner. - Audio input devices are used to capture sound. In some cases, an audio output device can be used as an input device, in order to capture produced sound. Audio input devices allow a user to send audio signals to the
computing device 500 for at least one of processing, recording, and carrying out commands. Devices such as microphones allow users to speak to the computer in order to record a voice message or navigate software. Aside from recording, audio input devices are also used with speech recognition software. Examples of types of audio input devices include, but are not limited to, a microphone, Musical Instrument Digital Interface (MIDI) devices such as, but not limited to, a keyboard, and a headset. - Data Acquisition (DAQ) devices convert at least one of analog signals and physical parameters to digital values for processing by the
computing device 500. Examples of DAQ devices may include, but not limited to, Analog to Digital Converter (ADC), data logger, signal conditioning circuitry, multiplexer, and Time to Digital Converter (TDC). Output Devices may further comprise, but not be limited to: - Display devices, which convert electrical information into visual form, such as, but not limited to, monitor, TV, projector, and Computer Output Microfilm (COM). Display devices can use a plurality of underlying technologies, such as, but not limited to, Cathode-Ray Tube (CRT), Thin-Film Transistor (TFT), Liquid Crystal Display (LCD), Organic Light-Emitting Diode (OLED), MicroLED, E Ink Display (ePaper) and Refreshable Braille Display (Braille Terminal).
- Printers, such as, but not limited to, inkjet printers, laser printers, 3D printers, solid ink printers and plotters.
- Audio and Video (AV) devices, such as, but not limited to, speakers, headphones, amplifiers and lights, which include lamps, strobes, DJ lighting, stage lighting, architectural lighting, special effect lighting, and lasers.
- Other devices such as Digital to Analog Converter (DAC)
- Input/Output Devices may further comprise, but not be limited to, touchscreens, networking device (e.g., devices disclosed in
network 562 sub-module), data storage device (non-volatile storage 561), facsimile (FAX), and graphics/sound cards. - All rights including copyrights in the code included herein are vested in and the property of the Applicant. The Applicant retains and reserves all rights in the code included herein, and grants permission to reproduce the material only in connection with reproduction of the granted patent and for no other purpose.
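By way of a non-limiting illustration only, the following sketch outlines one possible software flow consistent with the matching operations described in this disclosure: parsing employment request data into features (including hashtags), combining them with local historical candidate-related data into feature vectors, applying a simple scorer that stands in for the ML module's predictive model, and monitoring incoming employment request data against a pre-set deviation threshold. The data, weights, threshold value, and function names are hypothetical assumptions and do not limit the claims.

```python
from typing import Dict, List

# Hypothetical, in-memory stand-in for the local candidates database; in the
# disclosure this data is collected at a location of previous employment.
LOCAL_CANDIDATES_DB: List[Dict] = [
    {"candidate_id": "C-001", "skills": {"nurse", "triage"}, "past_rating": 4.6},
    {"candidate_id": "C-002", "skills": {"paramedic", "triage"}, "past_rating": 3.9},
]

DEVIATION_THRESHOLD = 0.25  # pre-set threshold; illustrative value only


def parse_request(request: Dict) -> Dict:
    """Derive a plurality of features (including hashtags) from employment request data."""
    hashtags = {w.lower().lstrip("#") for w in request.get("description", "").split() if w.startswith("#")}
    return {
        "required_skills": set(request.get("skills", [])) | hashtags,
        "urgency": float(request.get("urgency", 0.5)),
    }


def build_feature_vector(features: Dict, candidate: Dict) -> List[float]:
    """Combine request features with historical candidate-related data."""
    overlap = len(features["required_skills"] & candidate["skills"])
    return [overlap, candidate["past_rating"], features["urgency"]]


def predictive_model(vector: List[float]) -> float:
    """Toy linear scorer standing in for the ML module's predictive model."""
    weights = [0.5, 0.3, 0.2]  # illustrative weights, not learned here
    return sum(w * x for w, x in zip(weights, vector))


def match_and_notify(request: Dict) -> List[Dict]:
    """Produce employment-related notifications for candidates scoring above a cutoff."""
    features = parse_request(request)
    notifications = []
    for candidate in LOCAL_CANDIDATES_DB:
        score = predictive_model(build_feature_vector(features, candidate))
        if score > 1.0:  # employment parameter gating the notification; illustrative
            notifications.append({"to": candidate["candidate_id"], "score": round(score, 2)})
    return notifications


def deviates(previous: Dict, incoming: Dict) -> bool:
    """Return True when a monitored value deviates beyond the pre-set threshold."""
    return abs(incoming.get("urgency", 0.0) - previous.get("urgency", 0.0)) > DEVIATION_THRESHOLD


previous_request = {"skills": ["triage"], "description": "#nurse needed", "urgency": 0.4}
incoming_request = {"skills": ["triage"], "description": "#nurse #paramedic needed", "urgency": 0.9}

print(match_and_notify(previous_request))
if deviates(previous_request, incoming_request):
    # Deviation exceeds the threshold: regenerate feature vectors and notifications.
    print(match_and_notify(incoming_request))
```

In a deployed embodiment, the in-memory structures above would be replaced by the local candidates database, the trained predictive model hosted by the ML module, and notifications delivered to the at least one candidate entity node over the network.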
- While the specification includes examples, the disclosure's scope is indicated by the following claims. Furthermore, while the specification has been described in language specific to structural features and/or methodological acts, the claims are not limited to the features or acts described above. Rather, the specific features and acts described above are disclosed as examples for embodiments of the disclosure.
- Insofar as the description above and the accompanying drawing disclose any additional subject matter that is not within the scope of the claims below, the disclosures are not dedicated to the public and the right to file one or more applications to claim such additional disclosures is reserved.
Claims (20)
1. A system for an automated matching of employment candidates to an employer, comprising:
a processor of a matching server (MS) node configured to host a machine learning (ML) module and connected to an employer entity node and to at least one candidate entity node over a network; and
a memory on which are stored machine-readable instructions that when executed by the processor, cause the processor to:
receive employment request data from the employer entity node;
parse the employment request data to derive a plurality of features;
query a local candidates database to retrieve local historical candidate-related data collected at a location of previous employment based on the plurality of features;
generate at least one feature vector based on the plurality of features and the historical candidate-related data;
provide the at least one feature vector to the ML module configured to generate a predictive model to produce at least one employment parameter for generation of an employment-related notification to the at least one candidate entity node;
continuously monitor incoming employment request data to determine if at least one value of employment request parameters deviates from a previous value of a previous employment request data by a margin exceeding a pre-set threshold value; and
responsive to the at least one value deviating from the previous value by the margin exceeding the pre-set threshold value, generate an updated feature vector based on the incoming employment request data and generate the notification based on the at least one employment parameter produced by the predictive model in response to the updated feature vector.
2. The system of claim 1 , wherein the instructions further cause the processor to generate at least one rescheduling parameter for resetting a hiring schedule associated with the employment request data based on the at least one employment parameter.
3. The system of claim 1 , wherein the instructions further cause the processor to retrieve remote historical candidate-related data from at least one remote candidates' database based on the local historical candidate-related data, wherein the remote historical candidate-related data is collected at locations associated with a plurality of employer entities affiliated with the employer entity node.
4. The system of claim 3 , wherein the instructions further cause the processor to generate the at least one feature vector based on the plurality of features and the local historical candidate-related data combined with the remote historical candidate-related data.
5. The system of claim 1 , wherein the instructions further cause the processor to parse the employment request data to generate a plurality of hashtags.
6. The system of claim 5 , wherein the instructions further cause the processor to generate the plurality of features based on the plurality of hashtags.
7. (canceled)
8. (canceled)
9. The system of claim 1 , wherein the instructions further cause the processor to record the at least one employment parameter on a blockchain ledger along with the employment request data.
10. The system of claim 9 , wherein the instructions further cause the processor to retrieve the at least one employment parameter from the blockchain responsive to a consensus among the employer entity node and the at least one candidate entity node.
11. The system of claim 2 , wherein the instructions further cause the processor to execute a smart contract to record data reflecting the resetting of the hiring schedule of a candidate associated with the at least one candidate entity node on a blockchain for future audits.
12. A method for an automated matching of employment candidates to an employer, comprising:
receiving, by a matching server (MS) node configured to host a machine-learning (ML) module, employment request data from an employer entity node;
parsing, by the matching server (MS) node, the employment request data to derive a plurality of features;
querying, by the matching server (MS) node, a local candidates database to retrieve local historical candidate-related data collected at a location of previous employment based on the plurality of features;
generating, by the matching server (MS) node, at least one feature vector based on the plurality of features and the historical candidate-related data;
providing, by the matching server (MS) node, the at least one feature vector to the ML module configured to generate a predictive model to produce at least one employment parameter for generation of an employment-related notification to the at least one candidate entity node;
continuously monitoring, by the MS node, incoming employment request data to determine if at least one value of employment request parameters deviates from a previous value of a previous employment request data by a margin exceeding a pre-set threshold value; and
responsive to the at least one value deviating from the previous value by the margin exceeding the pre-set threshold value, generating, by the MS node, an updated feature vector based on the incoming employment request data and generating the notification based on the at least one employment parameter produced by the predictive model in response to the updated feature vector.
13. The method of claim 12 , further comprising retrieving remote historical candidate-related data from at least one remote candidates' database based on the local historical candidate-related data, wherein the remote historical candidate-related data is collected at locations associated with a plurality of employer entities affiliated with the employer entity node.
14. The method of claim 13 , further comprising generating the at least one feature vector based on the plurality of features and the local historical candidate-related data combined with the remote historical candidate-related data.
15. (canceled)
16. (canceled)
17. The method of claim 12 , further comprising, recording the at least one employment parameter on a blockchain ledger along with the employment request data.
18. A non-transitory computer readable medium comprising instructions, that when read by a processor, cause the processor to perform:
receiving employment request data from an employer entity node;
parsing the employment request data to derive a plurality of features;
querying a local candidates database to retrieve local historical candidate-related data collected at a location of previous employment based on the plurality of features;
generating at least one feature vector based on the plurality of features and the historical candidate-related data;
providing the at least one feature vector to a machine-learning module configured to generate a predictive model to produce at least one employment parameter for generation of an employment-related notification to the at least one candidate entity node;
continuously monitoring incoming employment request data to determine if at least one value of employment request parameters deviates from a previous value of a previous employment request data by a margin exceeding a pre-set threshold value; and
responsive to the at least one value deviating from the previous value by the margin exceeding the pre-set threshold value, generating an updated feature vector based on the incoming employment request data and generating the notification based on the at least one employment parameter produced by the predictive model in response to the updated feature vector.
19. (canceled)
20. (canceled)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/210,273 US20240420090A1 (en) | 2023-06-15 | 2023-06-15 | System and method for ai-based matching of employment candidates |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/210,273 US20240420090A1 (en) | 2023-06-15 | 2023-06-15 | System and method for ai-based matching of employment candidates |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20240420090A1 (en) | 2024-12-19 |
Family
ID=93844299
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/210,273 Pending US20240420090A1 (en) | 2023-06-15 | 2023-06-15 | System and method for ai-based matching of employment candidates |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US20240420090A1 (en) |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240086468A1 (en) * | 2021-03-30 | 2024-03-14 | Sureprep, Llc | Document Matching Using Artificial Intelligence |
- 2023-06-15: US application US 18/210,273 filed; published as US20240420090A1; status: Pending
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240086468A1 (en) * | 2021-03-30 | 2024-03-14 | Sureprep, Llc | Document Matching Using Artificial Intelligence |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20240358331A1 (en) | Method and system for ai-based analysis of respiratory conditions | |
| US20250131382A1 (en) | Machine learning-based recruiting system | |
| US20230230685A1 (en) | Intelligent Matching Of Patients With Care Workers | |
| US20230334163A1 (en) | Protection of documents by qr code-based stamp | |
| US20210272222A1 (en) | System and methods for tracking authorship attribution and creating music publishing agreements from metadata | |
| US20210374741A1 (en) | Compliance controller for the integration of legacy systems in smart contract asset control | |
| US20250293947A1 (en) | Ai-based system and method for establishing channelized communications | |
| US20220058582A1 (en) | Technical specification deployment solution | |
| US20220215492A1 (en) | Systems and methods for the coordination of value-optimizating actions in property management and valuation platforms | |
| US20250094468A1 (en) | Method and system for ai-based wedding planning platform | |
| US12170131B2 (en) | System for determining clinical trial participation | |
| US20240127142A1 (en) | Method and platform for providing curated work opportunities | |
| US12443952B2 (en) | Management platform for community association MGCOne online platform and marketplace | |
| US20250103853A1 (en) | System and method for ai-based object recognition | |
| US20230071263A1 (en) | System and methods for tracking authorship attribution and creating music publishing agreements from metadata | |
| US20240420090A1 (en) | System and method for ai-based matching of employment candidates | |
| US20250156942A1 (en) | System and method for ai-based loan processing | |
| US20250117817A1 (en) | System and method for ai-based recommendations based on leads | |
| US20250061493A1 (en) | Method and system for ai-based property evaluation | |
| US20250372240A1 (en) | System and method for ai-based universal healthcare platform | |
| US20250291853A1 (en) | System and method for ai-based social groups management | |
| US20250260702A1 (en) | System and method for ai-based intrusion behaviour analysis | |
| US20240403966A1 (en) | System and method for predictive market place | |
| US20240387040A1 (en) | System and method for ai-based monitoring of patients | |
| US20250157649A1 (en) | System and method for ai-based diagnosis |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION COUNTED, NOT YET MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |