CN119415554B - Method and device for constructing a learning query optimizer based on two-stage search - Google Patents
Method and device for constructing a learning query optimizer based on two-stage search
- Publication number
- CN119415554B (application CN202411574118.5A)
- Authority
- CN
- China
- Prior art keywords
- plan
- query
- execution
- training
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
- G06F16/2453—Query optimisation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
- G06F16/2455—Query execution
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Databases & Information Systems (AREA)
- Computational Linguistics (AREA)
- Software Systems (AREA)
- Evolutionary Computation (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Medical Informatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The invention discloses a method and a device for constructing a learning-type query optimizer based on two-stage search. The method comprises: verifying, by an applicability-domain detector based on a Mahalanobis distance metric over the training data, whether a user query lies within the applicable domain of the optimizer; if the query is not in the applicable domain, it is processed and executed by the underlying database management system; for queries within the applicable domain, a plan generator based on beam search produces the k best plans as a candidate set; a plan selector, trained with execution plans after data enhancement, predicts the latency of each candidate plan with a regression model so as to select the optimal query plan, which is then executed by the database management system. By combining two-stage search with applicability-domain detection, the invention provides an accurate, efficient and robust query optimizer with high plan quality, high training efficiency and strong robustness, and has wide application scenarios.
Description
Technical Field
The application relates to the technical field of information retrieval, in particular to a method and a device for constructing a learning type query optimizer based on two-stage search.
Background
The query optimizer is one of the core components of a database management system; it determines a more efficient execution plan for an input user query and deeply influences the query-processing capacity of the database. Its input is a user Structured Query Language (SQL) query, and its goal is to select the physical execution plan with the lowest system resource consumption (i.e., execution cost, generally the execution time).
Because the same SQL query statement can be computed through numerous execution modes, and different execution modes may differ in execution time by several orders of magnitude, the selection of the execution plan plays a vital role in database query-processing efficiency. However, owing to complex business logic, complicated SQL, and the difficulty of acquiring information such as data distribution, database application developers find it hard to write SQL statements and execution plans that execute efficiently, which places high demands on the performance of a query optimizer.
Mainstream database management systems today all use traditional query optimizers based on statistical information such as histograms and sampling. However, the design of a traditional query optimizer rests mainly on idealized assumptions about data distribution, which the data of real scenarios rarely satisfies, so traditional optimizers do not perform well in practical applications. Stimulated by the excellent representation and learning capabilities of machine learning, a large number of learning-type query optimizers have emerged in recent years that can significantly improve the quality of the produced execution plans, but their robustness remains to be improved, mainly in three respects: (1) errors in the cost model that learning-type query optimizers rely on are unavoidable and propagate, which affects the effectiveness of the query optimizer; (2) existing learning-based query optimizers lack a defined scope of applicability, which affects the stability of the system; (3) high training costs limit the scalability of such optimizers. Therefore, designing a robust query-optimization algorithm that meets the requirements of system stability, effectiveness and scalability is the primary challenge.
Disclosure of Invention
The embodiment of the application aims to provide a method and a device for constructing a learning type query optimizer based on two-stage search, which can provide stable and effective query optimization service with lower training cost and improve the robustness of the whole system.
According to a first aspect of an embodiment of the present invention, there is provided a method for constructing a learning type query optimizer based on two-stage search, including:
Collecting queries located on a target data set in a database management system, together with their execution information;
Constructing a training set according to the queries and their execution information;
Constructing an applicable domain detector according to the training set;
Constructing and training a plan generator based on beam search according to the training set, and storing different training plans generated in a training stage and corresponding execution conditions thereof into an experience pool by using a data reuse and data enhancement technology;
constructing and training a plan selector according to the data in the experience pool;
and constructing and obtaining a learning type query optimizer according to the applicable domain detector, the plan generator and the plan selector.
According to a second aspect of an embodiment of the present invention, there is provided a learning query optimizer building device based on two-stage search, including:
The data collection module is used for collecting queries located on a target data set in a database management system, together with their execution information;
the training set construction module is used for constructing a training set according to the queries and their execution information;
The detector construction module is used for constructing an applicable domain detector according to the training set;
The generator construction module is used for constructing and training a plan generator based on beam search according to the training set, and storing different training plans generated in a training stage and corresponding execution conditions thereof into an experience pool by using a data reuse and data enhancement technology;
the selector construction module is used for constructing and training a plan selector according to the data in the experience pool;
and the optimizer construction module is used for constructing and obtaining a learning type query optimizer according to the applicable domain detector, the plan generator and the plan selector.
According to a third aspect of the embodiment of the present invention, there is provided a database management system optimal plan generating method, including:
inputting the user query into a learning type query optimizer to obtain a corresponding optimal plan, wherein the learning type query optimizer is constructed by the method of the first aspect.
According to a fourth aspect of an embodiment of the present invention, there is provided a database management system optimal plan generating apparatus including:
And the plan generation module is used for inputting the user query into the learning type query optimizer to obtain a corresponding optimal plan, and the learning type query optimizer is constructed by the method of the first aspect.
The technical scheme provided by the embodiment of the application can comprise the following beneficial effects:
The method of the application solves the problems that a traditional learning-type query optimizer is strongly affected by precision errors of its cost model and has a high training cost, achieving improvements in efficiency, accuracy, robustness and scalability across the board. By reusing and enhancing the query data generated by the first-stage plan generator, it shortens the long training-data collection time required by conventional learning-type query optimizers, improves the training efficiency of the model, and allows deployment at lower cost. It also solves the problem that traditional learning-type query optimizers lack a defined application scope, improving the stability and applicability of the system.
The model designed by the application provides a more robust, stable, efficient and practical solution for solving the problem of query optimization in a database management system.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a flow chart of a method for constructing a learning type query optimizer based on two-stage search according to an embodiment of the present invention.
FIG. 2 is an overall framework diagram of a learning query optimizer of an embodiment of the present invention.
FIG. 3 is a diagram of a planning tree coding and model structure framework of an embodiment of the present invention.
FIG. 4 is a block diagram of a two-stage search based learning query optimizer device in accordance with an embodiment of the present invention.
Fig. 5 is a flowchart of a database management system optimal plan generation method according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, these information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the application. The term "if" as used herein may be interpreted as "at..once" or "when..once" or "in response to a determination", depending on the context.
Example 1
Fig. 1 is a flowchart of a method for constructing a learning-type query optimizer based on two-stage search according to an embodiment of the present invention, fig. 2 is an overall frame diagram of the learning-type query optimizer according to an embodiment of the present invention, and fig. 3 is a frame diagram of a planning tree coding and model structure according to an embodiment of the present invention, the method includes the following steps:
Step S100, collecting queries located on a target data set in a database management system and their execution information, which specifically comprises the following steps:
Step S101, the historical query log information of the database management system is screened to obtain query information in the target data set, the query information comprising selection-list information, connection information and predicate information.
Specifically, take the PostgreSQL database management system as an example. The target data set D is first determined as the data of a particular database. Second, the query statements executed on this database and their query information are collected from the SQL statement log in PostgreSQL. Taking a query statement Q as an example, where the dimension of the target data set D is d: the database history query log information is parsed to extract the n selected tables (relations) T1 to Tn, the m columns (attributes) A1 to Am are extracted from the attribute-selection portion of the Q statement, and the predicate information p is extracted from the predicate-screening portion of the Q statement.
Step S102, query execution condition information obtained from historical query log information of a database management system comprises an execution plan corresponding to the query and actual execution cost of the plan, namely execution time of the plan;
Specifically, the PostgreSQL database management system is taken as an example. The execution condition of the current query is obtained from the SQL statement log in the PostgreSQL, wherein the execution condition comprises an execution plan P corresponding to the query and the actual execution cost C of the plan, namely the execution time of the plan.
Step S200, constructing a training set according to the queries and their execution information.
Step S201, the query-statement information extracted in step S101 (including selection-list information, connection information and screening-predicate information) and the corresponding execution information extracted in step S102 (the execution plan P corresponding to the query and the actual execution cost C of the plan) are combined into a <Q, P, C> triplet, stored in the experience pool, and marked as part of the training set D_train for later use.
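The collection loop of steps S201 and S202 can be sketched as a minimal experience pool of <Q, P, C> triples; the class and field names below are illustrative, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Sample:
    query: str    # SQL text Q
    plan: str     # serialized execution plan P
    cost: float   # actual execution cost C (execution time)

@dataclass
class ExperiencePool:
    train_set: list = field(default_factory=list)  # D_train of <Q, P, C> triples

    def add(self, query: str, plan: str, cost: float) -> None:
        # step S201: store one <Q, P, C> triplet
        self.train_set.append(Sample(query, plan, cost))

    def is_sufficient(self, min_samples: int) -> bool:
        # step S202: loop until enough triples are stored for model training
        return len(self.train_set) >= min_samples

pool = ExperiencePool()
pool.add("SELECT * FROM t WHERE a < 10", "SeqScan(t)", 12.5)
```

In practice the triples would be parsed out of the PostgreSQL statement log rather than added by hand.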
Step S202, step S201 is repeated continuously until the number of <Q, P, C> triples stored in the training-set portion D_train of the experience pool is sufficient for model training, after which the loop ends.
Step S300, constructing an applicable domain detector according to the training set, wherein the method specifically comprises the following steps:
Step S301, the user query data Q in the training set D_train is input into the query optimizer of the database, and the corresponding generated execution plan P is obtained.
Step S302, each sample in the training set is concatenated with its corresponding plan, and the samples are encoded into one-dimensional vector data z of equal length.
Step S303, an applicability-domain detector is constructed from the obtained one-dimensional vector data z and a Mahalanobis-distance calculation component, and is used to screen the queries input into the optimizer. This comprises selecting a threshold value γ and a number of sampled training samples N: when the Mahalanobis distance between the query and the N sampled samples is less than the threshold, the query is considered an in-domain query; otherwise the query is classified as out-of-domain. Here we recommend setting N to 500, at which value the model performs best. Introducing the applicability-domain detector helps the model handle different types of queries separately, so that the model is only responsible for processing and executing in-domain queries; this guarantees that the worst-case performance of the model is no worse than that of an ordinary database management system, thereby improving the efficiency, robustness and practicability of the model.
Step S400, constructing and training a plan generator based on beam search according to the training set, and storing different training plans and corresponding execution conditions thereof generated in a training stage into an experience pool by using a data reuse and data enhancement technology, wherein the method specifically comprises the following steps:
Step S401, a learning value network model V is constructed, whose goal is to estimate the overall cost of a sub-plan of the input query. The overall cost of a sub-plan is the execution cost of the cheapest complete plan, among all complete plans of the current query, that contains the current sub-plan; this index represents the potential of the sub-plan, i.e., the smaller the overall cost, the greater the potential of the sub-plan to develop into an optimal plan.
Step S402, the beam width b of the plan generator and the target number k of candidate plans are set. Here we recommend setting b to 20 and k to 10; under this setting the model performs excellently. The initial plan, the plan set and the complete plan set are then initialized: the initial plan is set to an empty plan containing no operators, the plan set is initialized to contain only the initial plan, and the complete plan set is initialized to the empty set.
Step S403, embedding the user inquiry in the training set into a vector with uniform size, and inputting the vector into a plan generator to train the vector;
Step S404, while the number of complete plans is smaller than the target number k, the non-complete plans in the plan set are continuously expanded in a bottom-up manner, and any complete plans produced are stored in the complete plan set.
Step S405, all elements in the plan set are sorted according to overall cost and only the b most promising elements (those with the smallest overall cost) are retained; the other elements are discarded;
Step S406, steps S404 and S405 are repeated until the number of complete plans equals k. The plan generator is continuously trained with this process, and the various training data generated along the way, including the user query Q, the execution plan, the total execution time C and the sub-plan execution times, are expanded by the data-enhancement technique and stored in the experience pool as training samples for the plan selector. This data reuse and enhancement reduces the time needed to collect training samples, enlarges the number of training samples, and improves the training efficiency and effect of the model.
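The generation loop of steps S402 through S406 can be illustrated with a toy beam search over join orders. The `value` function below is a stand-in for the learned value network V, and all names are hypothetical:

```python
def beam_search_plans(tables, value, b=20, k=10):
    """Toy sketch of steps S402-S406: expand join orders bottom-up,
    keep only the b most promising sub-plans at each step, and stop
    once k complete plans (covering every table) have been collected."""
    beam = [(t,) for t in tables]   # initial single-table sub-plans
    complete = []
    while beam and len(complete) < k:
        expanded = []
        for plan in beam:
            remaining = [t for t in tables if t not in plan]
            if not remaining:       # all tables joined: a complete plan
                complete.append(plan)
                continue
            for t in remaining:     # bottom-up expansion (step S404)
                expanded.append(plan + (t,))
        # step S405: retain only the b sub-plans with the lowest estimated overall cost
        beam = sorted(expanded, key=value)[:b]
    return complete[:k]

# stub value network: here just the plan length, for demonstration
plans = beam_search_plans(["a", "b", "c"], value=len, b=2, k=2)
```

The real generator would score each sub-plan with the value network V rather than a stub, and record the plans and their execution times into the experience pool.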
Step S500, constructing and training a plan selector according to the data in the experience pool, and specifically comprising the following steps:
Step S501, a plan selector M is constructed (see FIG. 3 for details), comprising three tree convolutional neural network layers, one pooling layer and two fully connected layers; given an input query and plan, it predicts the execution time of the query and the noise of the prediction. The three tree-convolution layers have sizes 1×256, 1×128 and 1×64 respectively, the pooling layer has size 1×64, and the two fully connected layers have sizes 1×32 and 1×1. In addition, an auxiliary model F is introduced to learn the disturbance that the data noise imposes on the prediction result, where model F shares all parameters with model M except the last fully connected layer. This noise learning significantly improves the anti-interference capability of the model, making it better suited to real application scenarios and improving its robustness and practicability.
Step S502, training a plan selector by using an execution plan, an execution sub-plan and corresponding cost information stored in an experience pool;
Specifically, the execution plans P and execution sub-plans in the training set and the experience pool, together with their corresponding costs C, form (plan, cost) pairs that are input into the plan selector for training.
Step S503, the query plan is represented by a feature-vector tree: the specific information of the sub-plan at each node is represented by the one-hot (One Hot) encoded operators and related relation tables, together with normalized cardinality-estimation and cost-estimation information;
That is, a plan is characterized by converting its plan-tree form into an encoding, where each operator and the related relation tables are encoded in one-hot form, and the cardinality-estimation results and cost-estimation information corresponding to the plan are normalized before being included as features.
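A minimal sketch of this node encoding, assuming a hypothetical operator list and relation list (a real system would derive these from the database catalog and the plan tree):

```python
import numpy as np

# illustrative vocabularies, not taken from the patent
OPERATORS = ["SeqScan", "IndexScan", "HashJoin", "MergeJoin", "NestLoop"]
TABLES = ["t1", "t2", "t3"]

def encode_node(op, table, card, cost, max_card, max_cost):
    """Encode one plan-tree node: one-hot operator, one-hot relation,
    then normalized cardinality and cost estimates (step S503 sketch)."""
    op_vec = np.eye(len(OPERATORS))[OPERATORS.index(op)]
    tbl_vec = (np.eye(len(TABLES))[TABLES.index(table)]
               if table else np.zeros(len(TABLES)))
    stats = np.array([card / max_card, cost / max_cost])  # normalization
    return np.concatenate([op_vec, tbl_vec, stats])

v = encode_node("HashJoin", "t1", card=500, cost=42.0,
                max_card=1000, max_cost=100.0)
```

The full feature-vector tree would apply this encoding to every node and preserve the tree structure for the tree-convolution layers.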
Step S504, the predicted execution cost Ĉ_i of each execution plan and the real execution cost C_i are input to the heteroscedastic regression loss function L, and the plan selector is trained by minimizing the heteroscedastic loss value over the training set, yielding the optimal parameters. The specific loss function is as follows:

L = (1/N) · Σ_{i=1}^{N} [ (Ĉ_i − C_i)² / (2σ_i²) + (1/2) · log σ_i² ]

where N is the number of input (Ĉ_i, C_i) pairs and σ_i represents the disturbance of the data of sample i. A sample with a large disturbance is of relatively low quality, so the factor 1/(2σ_i²) reduces that sample's weighting coefficient. The loss function thus guides the model to focus on samples with small disturbance and reduces the impact of samples with large disturbance. As for the second term, the penalty term (1/2) · log σ_i² prevents the model from continually increasing σ_i merely to reduce the loss. It should be noted that σ_i has no label; it is learned automatically through the loss function.
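The heteroscedastic loss above can be written directly in NumPy; this is a sketch of the loss term only, not of the full training procedure:

```python
import numpy as np

def heteroscedastic_loss(pred_cost, true_cost, sigma):
    """Heteroscedastic regression loss (step S504 sketch).
    Samples with large predicted noise sigma_i are down-weighted by
    1/(2*sigma_i**2); the 0.5*log(sigma_i**2) penalty keeps the model
    from inflating sigma indefinitely just to shrink the first term."""
    sq_err = (pred_cost - true_cost) ** 2
    return np.mean(sq_err / (2.0 * sigma**2) + 0.5 * np.log(sigma**2))

loss = heteroscedastic_loss(
    np.array([1.0, 2.0]),   # predicted costs
    np.array([1.5, 2.0]),   # true costs
    np.array([0.5, 1.0]),   # per-sample noise sigma_i
)
```

In training, `sigma` would be the output of the auxiliary model F and the whole expression would be minimized by gradient descent.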
Step S600, constructing a whole learning type query optimizer according to the applicable domain detector, the plan generator and the plan selector, wherein the method specifically comprises the following steps:
Step S601, the applicability-domain detector is used as the top-level module of the query optimizer; this module processes the user queries input into the query optimizer and passes its output to the database management system and the plan generator respectively.
Step S602, a plan generator is taken as a middle layer module of the query optimizer, the module generates a plan for the input query in the applicable domain, and the output result is input into the plan selector.
Step S603, the plan selector is taken as a bottom layer module of the query optimizer, the module selects the input plan and inputs the output result into the database management system.
Correspondingly, as shown in fig. 4, the embodiment of the invention further provides a learning type query optimizer device based on two-stage search, which comprises:
A data collection module 101, configured to collect queries and execution information thereof located on a target data set in a database management system;
The training set construction module 102 is configured to construct a training set according to the query and the execution condition information thereof;
a detector construction module 103, configured to construct an applicable domain detector according to the training set;
the generator construction module 104 is configured to construct and train the plan generator based on the beam search according to the training set, and store different training plans generated in the training stage and corresponding execution conditions thereof into an experience pool by using a data reuse and data enhancement technology;
a selector building module 105 for building and training a plan selector based on the data in the experience pool;
And the optimizer construction module 106 is configured to construct a learning type query optimizer according to the applicable domain detector, the plan generator and the plan selector.
Example two
Referring to fig. 5, the present embodiment provides a database management system optimal plan generating method, including:
S1, inputting a user query into a learning type query optimizer to obtain a corresponding optimal plan, wherein the learning type query optimizer is constructed by the method of the first aspect, and the method comprises the following substeps:
step S11, inputting a query, screening the query by using an applicable domain detector ADV, directly inputting the query outside the applicable domain into a traditional database management system for execution, and inputting the query inside the applicable domain into the TPG. Specifically, the ADV encodes each piece of data in the input samples (Q, P) as a variable of the same length as the training set. The ADV will then calculate this sample into a set of N training samples Is a mahalanobis distance. The dimension of the code vector is expressed as d, the code vector of the test sample is expressed as Z, the code matrix of the training set of N training samples with the shape of 1×d is expressed as Z, the shape of N×d is calculated, the mean mu of Z and the covariance matrix sigma are calculated, and then the mahalanobis distance is obtainedThe calculation formula of (2) is as follows:
。
Then, if At γ, it is considered to be within the applicable domain. Otherwise, it is classified as being outside the applicable.
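The ADV screening step can be sketched with NumPy as follows; the threshold and the random data here are purely illustrative:

```python
import numpy as np

def mahalanobis_check(z, Z_train, gamma):
    """Sketch of the ADV test in step S11: Mahalanobis distance of an
    encoded query z to the training encodings Z_train, accepted as
    in-domain when the distance is at most gamma."""
    mu = Z_train.mean(axis=0)                 # mean of the training encodings
    sigma = np.cov(Z_train, rowvar=False)     # covariance matrix of Z_train
    diff = z - mu
    d = np.sqrt(diff @ np.linalg.inv(sigma) @ diff)
    return d, d <= gamma

rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 4))                 # N = 500 encoded training samples
d_in, ok = mahalanobis_check(Z.mean(axis=0), Z, gamma=3.0)
```

A production detector would precompute μ and Σ⁻¹ once over the sampled training set instead of per query.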
Step S12, a query Q within the applicable domain is input, and the TPG first generates the k candidate plans P with the highest potential using the trained first-stage, beam-search-based plan generator.
Step S13, the plan selector selects the optimal plan from the candidate plans P, and the optimal plan is executed by the underlying database management system.
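The final selection in step S13 reduces to an argmin over the regression model's predicted latencies; a minimal sketch with a stub predictor (the plan names and latencies are hypothetical):

```python
def select_best_plan(candidates, predict_latency):
    """Step S13 sketch: score each candidate plan with the regression
    model's predicted latency and return the cheapest one."""
    return min(candidates, key=predict_latency)

# stub predictor standing in for the trained plan selector M
best = select_best_plan(
    ["p1", "p2", "p3"],
    predict_latency={"p1": 9.0, "p2": 3.5, "p3": 7.2}.get,
)
```

In the real system `predict_latency` would run the tree-convolution plan selector over each encoded candidate plan.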
Correspondingly, the embodiment of the invention also provides a database management system optimal plan generating device, which comprises:
And the plan generation module is used for inputting the user query into the learning type query optimizer to obtain a corresponding optimal plan, and the learning type query optimizer is constructed by the method of the first aspect.
The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method and will not be repeated here.
For the device embodiments, reference is made to the description of the method embodiments for the relevant points, since they essentially correspond to the method embodiments. The apparatus embodiments described above are merely illustrative, wherein the elements illustrated as separate elements may or may not be physically separate, and the elements shown as elements may or may not be physical elements, may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purposes of the present application. Those of ordinary skill in the art will understand and implement the present application without undue burden.
Correspondingly, the application further provides an electronic device comprising one or more processors and a memory for storing one or more programs; when the one or more programs are executed by the one or more processors, the one or more processors implement the above two-stage-search-based learning-type query optimizer construction method or the database management system optimal plan generation method.
Correspondingly, the application also provides a computer readable storage medium, wherein computer instructions are stored on the computer readable storage medium, and the instructions are executed by a processor to realize the learning type query optimizer building method based on the two-stage search or the database management system optimal plan generating method.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.
Claims (5)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202411574118.5A CN119415554B (en) | 2024-11-06 | 2024-11-06 | Method and device for constructing a learning query optimizer based on two-stage search |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN119415554A CN119415554A (en) | 2025-02-11 |
| CN119415554B true CN119415554B (en) | 2025-10-21 |
Family
ID=94469156
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202411574118.5A Active CN119415554B (en) | 2024-11-06 | 2024-11-06 | Method and device for constructing a learning query optimizer based on two-stage search |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN119415554B (en) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111611274A (en) * | 2020-05-28 | 2020-09-01 | Huazhong University of Science and Technology | A database query optimization method and system |
| CN114637775A (en) * | 2022-03-29 | 2022-06-17 | Harbin Institute of Technology | Query optimization system, method and equipment based on Monte Carlo tree search and reinforcement learning |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115392477B (en) * | 2022-07-26 | 2025-11-14 | Zhejiang University | A Deep Learning-Based Method and Apparatus for Estimating Cardinality of Skyline Queries |
| US12001432B1 (en) * | 2022-09-06 | 2024-06-04 | Teradata Us, Inc. | Reducing query optimizer plan regressions with machine learning classification |
| CN117390063A (en) * | 2023-11-28 | 2024-01-12 | University of Electronic Science and Technology of China | Listwise ordering learning-based database querier optimization method |
| CN118520008B (en) * | 2024-07-25 | 2024-10-11 | Zhejiang University | An intelligent query optimization method and system for Spark SQL |
- 2024-11-06: Application CN202411574118.5A filed in China; published and granted as CN119415554B (status: Active)
Also Published As
| Publication number | Publication date |
|---|---|
| CN119415554A (en) | 2025-02-11 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN110597735B (en) | | A software defect prediction method for deep learning of open source software defect features |
| CN103838857B (en) | | Automatic service combination system and method based on semantics |
| KR20200010624A (en) | | Big Data Integrated Diagnosis Prediction System Using Machine Learning |
| CN119576977B (en) | | Natural language SQL conversion method based on data platform and large language model |
| CN117593044A (en) | | A dual-angle marketing activity effect prediction method, medium and system |
| CN112380243B (en) | | SQL query selectivity estimation method based on machine learning |
| CN117390063A (en) | | Listwise ordering learning-based database querier optimization method |
| CN117932122A (en) | | Graph database query method based on graph embedding distance prediction subgraph matching algorithm |
| WO2021143686A1 (en) | | Neural network fixed point methods and apparatuses, electronic device, and readable storage medium |
| CN112270058A (en) | | Optical network multi-channel transmission quality prediction method based on echo state network |
| CN120163113B (en) | | Wafer-level chip system design space construction and rapid parameter searching method |
| CN119415554B (en) | | Method and device for constructing a learning query optimizer based on two-stage search |
| CN119691159B (en) | | Technological topic evolution stage prediction method and system based on multiple graph representation |
| CN119646196B (en) | | A search enhancement generation optimization method, system, device, product and medium |
| CN120670565A (en) | | Knowledge retrieval candidate library generation method and system based on incremental pre-training optimization |
| CN119202772B (en) | | Traffic state complement prediction method based on multi-source data pre-training |
| CN119337151A (en) | | A method and device for constructing an evaluation scenario set for an intelligent decision-making system based on a case-constrained large model |
| CN119003617A (en) | | Distributed query service oriented optimization method |
| CN112801264B (en) | | A Dynamically Differentiable Spatial Architecture Search Method and System |
| CN117689865A (en) | | Target detection method and system based on feature and fusion mode search |
| CN116310636B (en) | | A Lightweight Neural Network Structure Search Method Based on Neural Network Topology |
| CN115080921B (en) | | Improved Top-k dosing method based on audit sensitivity |
| CN120874833A (en) | | A Method and System for Constructing Dynamic Job Requirement Profiles Based on Knowledge Graphs |
| CN120353828A (en) | | Query plan acquisition method, electronic device, storage medium, and program product |
| CN119884192A (en) | | Database query task execution method, device, equipment and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |