
CN116977299A - Automated control system for casting and method thereof - Google Patents


Info

Publication number
CN116977299A
Authority
CN
China
Prior art keywords
casting
image
feature
feature map
casting surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310880918.9A
Other languages
Chinese (zh)
Inventor
王建强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
YIYANG HONGXING MACHINERY EQUIPMENT CO Ltd
Original Assignee
YIYANG HONGXING MACHINERY EQUIPMENT CO Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by YIYANG HONGXING MACHINERY EQUIPMENT CO Ltd filed Critical YIYANG HONGXING MACHINERY EQUIPMENT CO Ltd
Priority to CN202310880918.9A priority Critical patent/CN116977299A/en
Publication of CN116977299A publication Critical patent/CN116977299A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0464Convolutional networks [CNN, ConvNet]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30116Casting
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30136Metal

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Quality & Reliability (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an automated control system for casting and a method thereof, in which a camera collects a surface shot image of a casting; image feature extraction is performed on the surface shot image to obtain a casting surface feature map; and whether the appearance of the casting meets predetermined requirements is determined based on the casting surface feature map. In this way, the automation level of the casting process can be improved, interference from human factors can be reduced, and the quality and production efficiency of castings can be improved.

Description

Automated control system for casting and method thereof
Technical Field
The application relates to the technical field of intelligent control, and in particular to an automated control system for casting and a method thereof.
Background
Casting is a common manufacturing process used to produce a variety of metal and alloy articles. In conventional casting processes, it is often necessary to rely on an operator to perform an appearance inspection of each casting to determine whether it meets the product's benchmark requirements. Such inspection is typically performed prior to shipment of the castings to ensure the quality and appearance of the product.
However, casting equipment often manufactures many kinds of products, which places a heavy burden on the operators assigned to the pre-shipment inspection area. The operator needs to determine which product each casting belongs to and whether its appearance meets the corresponding benchmark requirements. This process requires a great deal of time and effort and is susceptible to subjective factors.
Accordingly, an optimized automated control system for casting is desired that automatically performs quality inspection of the appearance of the cast product, so as to reduce interference from human factors and improve the quality and production efficiency of castings.
Disclosure of Invention
The embodiments of the application provide an automated control system for casting and a method thereof, in which a camera collects a surface shot image of a casting; image feature extraction is performed on the surface shot image to obtain a casting surface feature map; and whether the appearance of the casting meets predetermined requirements is determined based on the casting surface feature map. In this way, the automation level of the casting process can be improved, interference from human factors can be reduced, and the quality and production efficiency of castings can be improved.
The embodiments of the application also provide an automated control system for casting, comprising:
an image acquisition module for acquiring a surface shot image of the casting through a camera;
an image feature analysis module for performing image feature extraction on the surface shot image to obtain a casting surface feature map; and
a casting appearance detection module for determining whether the appearance of the casting meets predetermined requirements based on the casting surface feature map.
The embodiments of the application also provide an automated control method for casting, comprising the following steps:
collecting a surface shot image of the casting through a camera;
performing image feature extraction on the surface shot image to obtain a casting surface feature map; and
determining whether the appearance of the casting meets predetermined requirements based on the casting surface feature map.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings required for the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the application, and that other drawings may be obtained from them by a person skilled in the art without inventive effort. In the drawings:
FIG. 1 is a block diagram of an automated control system for casting provided in an embodiment of the present application.
FIG. 2 is a block diagram of the image feature analysis module in an automated control system for casting provided in an embodiment of the present application.
Fig. 3 is a flow chart of an automated control method for casting provided in an embodiment of the present application.
Fig. 4 is a schematic diagram of a system architecture of an automated control method for casting provided in an embodiment of the present application.
Fig. 5 is an application scenario diagram of an automated control system for casting according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings. The exemplary embodiments of the present application and their descriptions herein are for the purpose of explaining the present application, but are not to be construed as limiting the application.
Unless defined otherwise, all technical and scientific terms used in the embodiments of the application have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the present application is for the purpose of describing particular embodiments only and is not intended to limit the scope of the present application.
In describing embodiments of the present application, unless otherwise indicated, the term "connected" should be construed broadly: it may denote an electrical connection, communication between two elements, a direct connection, or an indirect connection via an intermediate medium. Those skilled in the art will understand that the specific meaning of the term is to be interpreted according to the circumstances.
It should be noted that the terms "first", "second", and "third" in the embodiments of the present application are merely used to distinguish similar objects and do not imply a specific order among them. It is to be understood that "first", "second", and "third" may be interchanged in a specific order or sequence where permitted, so that the embodiments of the application described herein may be practiced in sequences other than those illustrated or described herein.
It will be appreciated that casting is a common manufacturing process in which molten metal or alloy is poured into a prefabricated mold and allowed to cool and solidify to produce the desired metal article. Casting can be used to manufacture parts and products of various shapes and sizes, from small to large and from simple to complex.
Casting is an important manufacturing process and is widely applied to various industries such as automobile manufacturing, aerospace, construction and the like. With the continuous development of technology, the casting process is also continuously improved and innovated so as to meet the requirements of different products.
In one embodiment of the present application, FIG. 1 is a block diagram of an automated control system for casting provided in an embodiment of the present application. As shown in fig. 1, an automated control system 100 for casting according to an embodiment of the present application includes: an image acquisition module 110 for acquiring a surface shot image of the casting through a camera; the image feature analysis module 120 is configured to perform image feature extraction on the surface shot image to obtain a casting surface feature map; and a casting appearance detection module 130 for determining whether the casting appearance meets a predetermined requirement based on the casting surface feature map.
The present application provides a casting method including a melting step, a molten metal transporting step, a pouring step, a sampling step, an analyzing step, a cooling step, a disassembling step, a finishing step, and an inspection step.
Wherein, the melting procedure:
first, a melting process is performed. In the melting step, a melting furnace melts a molten material with heat to obtain a raw liquid. Molten metal produced by the melting furnace is tapped to a processing ladle. The treatment ladle is conveyed by a liquid receiving trolley. The treatment ladle is transported to a switching position and switched to a ladle suitable for pouring (pouring ladle). The switching means transferring the molten metal to another ladle. The pouring ladle is conveyed by the conveying trolley. The melting furnace, the tapping trolley and the transfer trolley are controlled by the molten metal transfer PLC (Programmable Logic Controller).
Before tapping, for example when a material for adjusting the composition of the molten metal is charged, a ladle number is assigned to the treatment ladle in advance. At the timing of tapping into the treatment ladle, the molten metal delivery PLC associates information about the molten metal tapped from the melting furnace (molten metal information) with the ladle number. Address information specifying a location is assigned in advance to each location to which a ladle can move, and the molten metal delivery PLC maintains the relationship between the ladle number and the address information. For example, when the ladle to which ladle number "1" is assigned is located at the place indicated by address information "2", the molten metal delivery PLC associates ladle number "1" with address information "2". The molten metal delivery PLC updates this relationship every time the ladle moves; for example, when the ladle with ladle number "1" moves from the place indicated by address information "2" to the place indicated by address information "3", the PLC associates ladle number "1" with address information "3". In this way, the molten metal delivery PLC can identify the ladle located at any place by updating the ladle number in its data in accordance with the movement of the ladle, and can refer to the molten metal information associated with that ladle number. In addition, when the molten metal in the treatment ladle is switched to the pouring ladle, the molten metal delivery PLC hands over the ladle number of the treatment ladle to the pouring ladle and updates the ladle number in accordance with the movement of the pouring ladle.
The molten metal delivery PLC transmits the ladle number and the molten metal information corresponding to the ladle number to the automated control system for casting.
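The bookkeeping described above can be sketched as a small lookup structure. The class and method names below are hypothetical: the patent specifies only the data relationships (ladle number to molten metal information, ladle number to address information, updated on each movement or switch), not any concrete implementation.

```python
class LadleTracker:
    """Minimal sketch of the ladle-number bookkeeping described above.

    All names are hypothetical; only the two relations the PLC maintains
    (ladle number <-> molten metal info, ladle number <-> address) are modeled.
    """

    def __init__(self):
        self.melt_info = {}   # ladle number -> molten metal information
        self.location = {}    # ladle number -> address of its current place

    def tap(self, ladle_no, info, address):
        # At tapping time, associate the molten metal info with the ladle number.
        self.melt_info[ladle_no] = info
        self.location[ladle_no] = address

    def move(self, ladle_no, new_address):
        # Update the ladle-number/address relation each time the ladle moves.
        self.location[ladle_no] = new_address

    def switch(self, from_no, to_no):
        # Metal is switched to a pouring ladle: the associated molten metal
        # information follows the metal to the new ladle number.
        self.melt_info[to_no] = self.melt_info.pop(from_no)


tracker = LadleTracker()
tracker.tap(1, {"grade": "FC250", "temp_c": 1450}, address=2)
tracker.move(1, 3)                     # ladle 1 moves from address 2 to 3
print(tracker.location[1])             # 3
tracker.switch(from_no=1, to_no=7)     # metal switched into pouring ladle 7
print(tracker.melt_info[7]["grade"])   # FC250
```

The same pattern applies unchanged to the mold numbers managed by the pouring PLC and the box numbers managed by the post-processing PLC.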
Wherein, the molten metal conveying process:
when the melting process is completed, a molten metal transporting process is performed. In the molten metal transporting process, a transporting carriage transports a casting ladle. The pouring ladle is conveyed by the conveying trolley. The transfer carriage can be stopped at a ladle replacement position for transferring the casting ladle to the casting machine, in addition to the above-described reversing position. The conveying trolley is controlled to operate by the molten metal conveying PLC.
A pouring ladle (solid ladle) filled with molten metal is transferred from a transfer trolley to a ladle changing device at a ladle changing position. In the ladle replacing apparatus, a solid ladle and a casting ladle (empty ladle) which is empty by casting are replaced. The ladle replacing device is controlled by a pouring PLC.
Wherein, the pouring process comprises the following steps:
if the molten metal transfer process is completed, a pouring process is performed. In the pouring step, the pouring machine injects molten metal in a pouring ladle into the mold. In the pouring step, molten metal is poured from one pouring ladle into a plurality of casting molds. The plurality of molds molded by the molding machine are arranged in a row and are transferred to the casting machine mold by mold. The casting machine casts molten metal in a ladle against a mold being transported. The casting machine is controlled by a casting PLC to act.
The pouring PLC can acquire the ladle number and the corresponding molten metal information via the automated control system for casting. Therefore, the pouring PLC can acquire the molten metal information of the poured metal based on the ladle number assigned to the pouring ladle used at the time of pouring. A mold number is assigned to each mold before pouring, for example at the time of manufacture in the molding machine. When pouring is performed, the pouring PLC associates the molten metal information with the mold number. Address information specifying a location is assigned in advance to each location to which a mold can move, and the pouring PLC maintains the relationship between the mold number and the address information. For example, when the mold to which mold number "1" is assigned is located at the place indicated by address information "2", the pouring PLC associates mold number "1" with address information "2". The pouring PLC updates this relationship each time the mold moves; for example, when the mold with mold number "1" moves from the place indicated by address information "2" to the place indicated by address information "3", the PLC associates mold number "1" with address information "3". In this way, the pouring PLC can identify the mold located at any place by updating the mold number in accordance with the movement of the mold, and can refer to the molten metal information associated with that mold number. The pouring PLC transmits the mold number and the corresponding molten metal information to the automated control system for casting.
Wherein, sampling procedure:
the sampling process is performed before and after the pouring process or in the pouring process. In the sampling step, a casting machine takes molten metal in a casting ladle used in the casting step under the control of a casting PLC to generate a Test Piece (TP). The pouring PLC establishes the relation between the test piece number and the ladle number.
Wherein, the analysis procedure:
after the sampling process, an analysis process is performed at the texture inspection area. In the analysis step, an operator analyzes at least one of the composition and physical properties of the test piece. The analysis result is transmitted to an automated control system for casting together with the specimen number and the ladle number. An automated control system for casting correlates molten metal information corresponding to ladle numbers with analysis results for storage.
Wherein, the cooling procedure:
the cooling process is performed after the casting process. The cooling process may be performed in parallel with the sampling process and the analysis process. In the cooling step, the cooling line takes time to convey the mold, and cools the molten metal in the mold. Thereby, a casting is formed in the mold. The cooling line is controlled by the modeling PLC to act.
Wherein, the disassembly procedure:
the disassembly process is performed after the cooling process. In the disassembly step, the disassembly device separates the molds one by one to take out the castings. The dismounting device is controlled to act by the modeling PLC. The cast thus taken out is stored in a plurality of boxes, and is transported by a box transporting device in units of boxes, and is transported to a finishing device. The case transfer device is controlled by a post-processing PLC.
The post-processing PLC can acquire the mold number and the corresponding molten metal information via the automated control system for casting. Therefore, the post-processing PLC can acquire the molten metal information corresponding to the mold disassembled in the disassembly step based on the mold number. When the castings are stored in a box, the post-processing PLC associates the molten metal information with the box number. Address information specifying a location is assigned in advance to each location to which a box can move, and the post-processing PLC maintains the relationship between the box number and the address information. For example, when the box to which box number "1" is assigned is located at the place indicated by address information "2", the post-processing PLC associates box number "1" with address information "2". The post-processing PLC updates this relationship each time the box moves; for example, when the box with box number "1" moves from the place indicated by address information "2" to the place indicated by address information "3", the post-processing PLC associates box number "1" with address information "3". In this way, the post-processing PLC can identify the box located at any place by updating the box number in its data in accordance with the movement of the box, and can refer to the molten metal information associated with that box number. The post-processing PLC transmits the box number and the corresponding molten metal information to the automated control system for casting.
Wherein, the finish machining process:
the finishing process is performed after the disassembly process. In the finishing step, sand and stone adhering to the casting are removed by a shot blasting machine or the like, or the surface of the casting is polished by a polishing machine or the like. These devices are controlled by the post-processing PLC.
Wherein, the checking procedure comprises the following steps:
after the finishing step and the analysis step, an inspection step is performed in the inspection area. In the inspection step, the operator performs an appearance inspection of the cast. The appearance inspection means shape inspection, shakeout inspection, color inspection, size inspection, and the like performed by visual inspection. A display connected to an automated control system for casting is arranged at the inspection zone. In the inspection process, the display of the inspection area displays the analysis result of the test piece based on the display control of the automated control system for casting. An automated control system for casting determines molten metal information based on a bin serial number transferred to an inspection area, determines an analysis result associated with the molten metal information, and causes a display to display the analysis result. Thus, the operator in the inspection area can recognize at least one of the composition and physical properties of the test piece obtained in the analysis step. The product number issued by the molding machine is transferred to an automated control system for casting together with the specimen number by mold number management and box number management.
In the inspection process, the automated control system for casting can synchronize the timing of the appearance inspection of the casting with the timing of displaying the analysis result on the display. The timing of appearance inspection of the castings may be the timing at which the operator operates the operation start button, the timing at which the sensor detects that the box is carried into the inspection area, or the timing at which the sensor detects that the operator is located in the working space of the inspection area. An automated control system for casting causes a display to display the analysis results based on the timing of the appearance inspection of the casting. Thus, the timing of the appearance inspection of the castings is synchronized with the timing of displaying the analysis results on the display.
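The synchronization described above amounts to triggering the same display action from any of several timing events. A minimal sketch follows; all names are hypothetical, since the patent names the trigger timings but not any interface.

```python
class InspectionDisplay:
    """Hypothetical sketch: show the stored analysis result for a box when
    any of the inspection-start timing events named above fires."""

    # The three trigger timings described in the text.
    TRIGGERS = {"start_button", "box_carried_in", "operator_detected"}

    def __init__(self, analysis_by_box):
        self.analysis_by_box = analysis_by_box   # box number -> analysis result
        self.shown = None                        # what the display currently shows

    def on_event(self, event, box_no):
        # Any recognized trigger synchronizes the display with the inspection.
        if event in self.TRIGGERS:
            self.shown = self.analysis_by_box[box_no]
        return self.shown


display = InspectionDisplay({5: "C: 3.4%, Si: 2.1%"})
print(display.on_event("box_carried_in", 5))   # C: 3.4%, Si: 2.1%
```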
It is considered that, in the process of collecting the surface shot image, a large amount of background noise interference exists, which affects the accuracy of detecting the appearance quality of the casting. Moreover, in the appearance evaluation and classification of castings, gray-scale features such as texture, flaws, and defects on the casting surface generally deserve more attention than color feature information. Therefore, in the technical scheme of the application, the surface shot image is further subjected to gray-scale processing to obtain a gray-scale surface shot image, so that the background information in the image is simplified and the gray-scale features related to the surface quality of the casting are highlighted. That is, converting the color surface shot image into the gray-scale surface shot image reduces the complexity of image processing and allows the analysis and evaluation to focus on the gray-scale characteristics of the image that relate to the surface quality of the casting.
According to the technical scheme, after casting of the casting is carried out, the camera is used for collecting the surface shooting image of the casting, and an image analysis algorithm is used for processing and analyzing the surface shooting image, so that the appearance quality of the casting is automatically detected.
Specifically, the image acquisition module 110 is used for acquiring a surface shooting image of the casting through a camera. In the technical scheme of the application, firstly, a surface shooting image of a casting acquired by a camera is acquired. Surface imaging can capture defects, such as cracks, pinholes, sand holes, etc., that can affect the performance and quality of the casting. The size and geometry of the casting surface can be measured by image measurement techniques, which is important to ensure that the casting size meets design requirements. The quality of the surface of the casting can be evaluated by taking an image of the surface, and indexes such as surface finish, surface roughness and the like are included, so that the appearance quality of the casting meets the requirements. By comparing the surface image of the casting with the design file, whether the appearance of the casting is consistent with the design can be verified, potential manufacturing problems or errors can be found, and corrective measures can be taken in time. Through surface shooting image analysis, whether the coating or the painting on the surface of the casting is uniform and complete or whether the problems of defects or flaking and the like exist can be detected. Texture and morphological features of the casting surface can be analyzed by surface shot image processing techniques to assess the appearance quality and process conditions of the casting.
The surface shot image can provide important information in the casting process and help to determine whether the appearance of the casting meets the preset requirement, so that the consistency of the product quality is ensured.
By taking an image of the surface of the casting, defects such as cracks, pinholes, sand holes, etc., and uneven or poor coating of the surface can be detected for the casting. By analyzing the surface image of the casting, accurate dimension measurement can be performed, which is very important to ensure that the geometric dimension of the casting meets the design requirements. By shooting the surface image of the casting, the surface quality of the casting, including the indexes of surface finish, surface roughness and the like, can be evaluated so as to ensure that the appearance quality of the casting meets the requirements. By comparing the surface image of the casting with the design file, whether the appearance of the casting is consistent with the design can be verified, potential manufacturing problems or errors can be found, and corrective measures can be taken in time.
By acquiring the information, the problems in the casting manufacturing process can be found and solved in time, and the appearance quality of the casting is ensured to meet the preset requirements.
Specifically, the image feature analysis module 120 is configured to perform image feature extraction on the surface captured image to obtain a casting surface feature map. Fig. 2 is a block diagram of the image feature analysis module in the automatic control system for casting according to the embodiment of the present application, and as shown in fig. 2, the image feature analysis module 120 includes: an image graying unit 121 for performing a gray-scale process on the surface-captured image to obtain a gray-scale surface-captured image; a casting surface local feature extraction unit 122, configured to perform local feature analysis on the gray surface captured image to obtain a casting surface image local feature map; a casting surface global feature extraction unit 123, configured to perform global analysis on the gray surface captured image to obtain a casting surface image global feature map; and the multi-scale feature fusion unit 124 is used for fusing the local feature map of the casting surface image and the global feature map of the casting surface image to obtain the casting surface feature map.
It should be appreciated that converting the surface captured image to a grayscale image helps to reduce the amount of data, simplify subsequent processing, and extract luminance information from the image.
And the local characteristic analysis is carried out on the shot image of the gray surface, so that the detail characteristics of the casting surface, such as textures, defects and the like, can be captured. By extracting the local features, the local problems on the surface of the casting can be detected and analyzed more accurately.
Global analysis of the gray surface captured image may be performed to obtain overall surface characteristics such as geometry, finish, etc. The extraction of global features helps to assess the overall quality and appearance of the casting.
The local feature map and the global feature map of the casting surface image are fused, so that the local and global features of the casting surface can be comprehensively considered, and a more comprehensive casting surface feature map is obtained. The multi-scale feature fusion is beneficial to improving the accuracy and reliability of the surface features of the casting.
The design of the image characteristic analysis module can effectively extract local and global characteristics of the surface of the casting, provides beneficial information and basis for evaluation and problem detection of the appearance quality of the casting, and is beneficial to improving the efficiency and quality control level of the casting production process.
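As a concrete illustration of the module structure above, the following sketch mirrors the local branch, the global branch, and the multi-scale fusion in plain NumPy. It is only a schematic stand-in: the patent does not disclose the actual feature extractors, so per-patch statistics substitute here for the local feature extraction, image-wide statistics for the global analysis, and channel concatenation for the fusion.

```python
import numpy as np

def local_features(gray, patch=8):
    """Local branch: per-patch mean/std stand in for texture and flaw detail."""
    h, w = gray.shape
    h, w = h - h % patch, w - w % patch          # crop to a multiple of the patch size
    g = gray[:h, :w].reshape(h // patch, patch, w // patch, patch)
    mean = g.mean(axis=(1, 3))
    std = g.std(axis=(1, 3))
    return np.stack([mean, std], axis=0)         # shape (2, h/patch, w/patch)

def global_features(gray):
    """Global branch: image-wide statistics stand in for overall shape/finish."""
    return np.array([gray.mean(), gray.std(), gray.min(), gray.max()])

def fuse(local_map, global_vec):
    """Fusion: tile the global descriptor over the local grid and concatenate
    along the channel axis, yielding a combined 'surface feature map'."""
    _, gh, gw = local_map.shape
    tiled = np.broadcast_to(global_vec[:, None, None], (global_vec.size, gh, gw))
    return np.concatenate([local_map, tiled], axis=0)


gray = np.random.default_rng(0).random((64, 64))   # stand-in gray surface image
fmap = fuse(local_features(gray), global_features(gray))
print(fmap.shape)   # (6, 8, 8)
```

In a learned system the two branches would typically be a convolutional network and a global context model, but the data flow (local map plus global descriptor, fused per location) is the same.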
As for the image graying unit 121, it should be understood that the gray processing is a process of converting a color image into a gray image, by which a color surface photographed image can be converted into a gray surface photographed image during casting, so as to better analyze and process the image.
A gray image is an image containing only luminance information and no color information, and represents the luminance value of a pixel using gray levels, which are typically in the range of 0 to 255, where 0 represents black and 255 represents white. By gray scale processing, the information of the three red, green and blue channels in the color image can be combined into the information of a single gray scale channel.
The surface captured image may be gray-scaled using the average method (Average Method): the values of the red, green, and blue channels of each pixel in the color image are averaged, and the average is taken as the gray value of that pixel in the grayscale image.
The surface captured image may be gray-scale processed using a weighted average method (Weighted Average Method), wherein the values of the red, green, and blue channels of each pixel in the color image are weighted average according to a certain weight, and then the weighted average is used as the gray value of the pixel in the gray-scale image. Different weight assignments may produce different gray scale image effects.
Gray-scale processing can also be performed on the surface captured image by the component method (Component Method): the values of the red, green, and blue channels of the color image are taken respectively as three component values, and a weighted summation of the three component values yields the final gray value.
The gray scale processing can be performed on the surface photographed image by adopting a human eye sensing method (Human Perception Method), and the values of the three channels of red, green and blue in the color image can be weighted by using some empirical formulas in consideration of the sensitivity of human eyes to different colors, so that a gray scale image which is more in line with the perception of human eyes is obtained.
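The graying options described above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the system's actual implementation: the weighted variant uses the common ITU-R BT.601 luma weights as a stand-in for the "human perception" weighting the text mentions, and the component variant simply picks the green channel; the exact weights and channel choice are assumptions.

```python
import numpy as np

def to_gray(rgb, method="weighted"):
    """Convert an H x W x 3 RGB image (uint8) to a grayscale image.

    Methods mirror the options described above; weight values are
    common conventions, not values fixed by the text.
    """
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    if method == "average":        # simple mean of the three channels
        gray = (r + g + b) / 3.0
    elif method == "weighted":     # BT.601 luma weights (perception-style)
        gray = 0.299 * r + 0.587 * g + 0.114 * b
    elif method == "component":    # use a single channel, here green
        gray = g
    else:
        raise ValueError(f"unknown method: {method}")
    return gray.round().astype(np.uint8)

# A uniform 2x2 test image with RGB value (200, 100, 50)
img = np.full((2, 2, 3), (200, 100, 50), dtype=np.uint8)
print(to_gray(img, "average")[0, 0])   # (200+100+50)/3 rounds to 117
```

Either variant reduces the three-channel image to a single luminance channel, which is the data-volume reduction discussed below.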
First, graying the surface captured image simplifies the image information: a grayscale image contains only luminance information, with the color information of the color image removed, which reduces the complexity of the image and makes subsequent processing easier and more efficient. Second, it reduces the data volume: a color image is usually composed of pixel values in three channels (red, green, and blue), whereas a grayscale image has only one channel, so graying reduces the data volume to one third of the original, saving storage space and transmission bandwidth. Third, it emphasizes brightness differences: the casting surface may show various subtle brightness variations related to defects, surface quality, and so on, and a grayscale image highlights these differences, making analysis and detection of the casting surface more accurate and reliable. Finally, it facilitates image processing: many image processing algorithms are designed and implemented on grayscale images, so converting the surface captured image to grayscale allows classical algorithms such as edge detection, filtering, and feature extraction to be applied directly to the image of the casting surface.
The gray processing is carried out on the surface shot image to obtain the gray surface shot image, so that the image information can be simplified, the data volume can be reduced, the brightness difference can be emphasized, the subsequent image processing is convenient, and the method has an important effect on analysis and detection of the casting surface image.
For the casting surface local feature extraction unit 122, it is configured to: and passing the gray surface shooting image through an image local feature extractor based on a convolutional neural network model to obtain the casting surface image local feature map.
Then, an image local feature extractor based on a convolutional neural network model, which has excellent performance in terms of implicit feature extraction of images, is used for carrying out feature mining on the gray surface shot images so as to extract implicit quality feature distribution information related to the appearance of the casting surface in the gray surface shot images, thereby obtaining a casting surface image local feature map.
It should be appreciated that convolutional neural networks (Convolutional Neural Network, CNN) are a deep learning model that is specifically used to process data having a grid structure, such as images and video, and that CNN performs well in the field of image processing and is capable of automatically learning and extracting features in images.
The core component of a CNN is the convolutional layer (Convolutional Layer), through which feature extraction is performed on the input image. The convolution operation uses a set of learnable filters (also called convolution kernels) that slide over the input image as a window, computing a weighted sum at each local position, so that local features of the image, such as edges and textures, can be effectively captured. In addition to convolutional layers, a CNN includes pooling layers (Pooling Layer) to reduce the size of the feature map while preserving the main features, and fully connected layers (Fully Connected Layer) to map the extracted features to the final output classes.
The image local feature extractor based on the convolutional neural network model can be used for the subsequent tasks of detecting, classifying, dividing and the like of the surface defects of the casting by inputting the gray surface shooting image into the CNN model and extracting the output feature map of the convolutional layer as the local feature map of the surface image of the casting.
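The convolution-plus-pooling mechanics described above can be sketched in plain NumPy. This is a toy illustration, not the system's trained extractor: the horizontal-edge kernel and the synthetic input are placeholders, whereas a real CNN learns its kernels from data and stacks many such layers.

```python
import numpy as np

def conv2d(img, kernel):
    """Naive 'valid' 2-D convolution (cross-correlation): the sliding-window
    weighted sum that a CNN convolutional layer performs on a gray image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fm, size=2):
    """Max pooling: shrink the feature map while keeping strong responses."""
    h = fm.shape[0] // size * size
    w = fm.shape[1] // size * size
    return fm[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

gray = np.arange(36, dtype=float).reshape(6, 6)   # toy grayscale patch
edge_kernel = np.array([[-1.0, 1.0]])             # horizontal-edge filter
fmap = np.maximum(conv2d(gray, edge_kernel), 0.0) # convolution + ReLU
pooled = max_pool(fmap)
print(pooled.shape)   # (3, 2)
```

Because each row of the toy input increases by 1 per column, the edge filter responds with a constant value everywhere, and pooling halves each spatial dimension, which is the feature-map shrinkage the pooling layer provides.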
For the casting surface global feature extraction unit 123, it is configured to: and carrying out feature extraction on the local feature map of the casting surface image by a feature perception enhancer based on a deep neural network model so as to obtain the global feature map of the casting surface image. The deep neural network model is a non-local neural network model.
Further, it is considered that convolution is a typical local operation: it can only extract local image features and cannot attend to the global context, which affects the accuracy of appearance quality detection on the casting surface. For the gray surface captured image, the quality feature distribution information related to the casting surface appearance in each local region is not isolated; rather, the correlations between the implicit features of the local image regions together form the foreground object. Therefore, in the technical scheme of the present application, in order to detect the appearance quality of the casting more accurately, a non-local neural network model is used to further extract features from the image. The casting surface image local feature map is passed through a feature perception enhancer based on a non-local neural network model, which expands the feature receptive field, so as to obtain the casting surface image global feature map. In particular, the non-local neural network model captures hidden dependency information by computing the similarity between the local appearance quality features of the casting surface in each local region of the gray surface captured image, thereby modeling context features, so that the network attends to the global content across the local region features of the gray surface captured image, which in turn improves the feature extraction capability of the backbone network in classification and detection tasks.
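The similarity-weighted global aggregation described above can be sketched as a minimal non-local block in NumPy. This is a hedged sketch under simplifying assumptions: it uses the embedded-Gaussian form with identity embeddings and a random feature map, whereas a real feature perception enhancer would use learned 1x1 transformations and trained weights.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def non_local_block(fmap):
    """Minimal non-local operation: every spatial position is re-expressed
    as a similarity-weighted sum over ALL positions, expanding the
    receptive field from a local window to the whole feature map."""
    c, h, w = fmap.shape
    x = fmap.reshape(c, h * w)          # C x N, N = H*W spatial positions
    attn = softmax(x.T @ x, axis=-1)    # N x N pairwise similarity weights
    y = x @ attn.T                      # aggregate global context per position
    return (x + y).reshape(c, h, w)     # residual connection keeps local info

rng = np.random.default_rng(0)
feat = rng.random((4, 3, 3))            # hypothetical 4-channel local feature map
out = non_local_block(feat)
print(out.shape)   # (4, 3, 3) — shape preserved, receptive field now global
```

The residual connection means the block enhances, rather than replaces, the local features, matching the "feature perception enhancer" role in the text.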
For the multi-scale feature fusion unit 124: the casting surface image local feature map and the casting surface image global feature map are then fused, so as to combine the local associated feature information and the global associated feature distribution information related to the casting appearance quality in the gray surface captured image, thereby obtaining a casting surface feature map carrying multi-scale casting appearance quality feature information.
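The fusion step can be sketched as the channel-wise cascade that the application later describes (rather than weighted point-wise addition). The channel counts and spatial size below are hypothetical placeholders, not values fixed by the application.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical shapes: a 32-channel local map and a 32-channel global map,
# both over an 8 x 8 spatial grid.
local_map = rng.random((32, 8, 8))
global_map = rng.random((32, 8, 8))

# Cascade along the channel dimension: both scales are kept side by side,
# leaving it to later layers to weigh local vs. global evidence.
fused = np.concatenate([local_map, global_map], axis=0)
print(fused.shape)   # (64, 8, 8)
```

Concatenation preserves both feature sets intact, which is why it suits feature maps whose spatial correlation scales differ.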
Specifically, the casting appearance detection module 130 is configured to determine whether the casting appearance meets a predetermined requirement based on the casting surface feature map. Comprising the following steps: the feature distribution optimizing unit is used for carrying out feature distribution optimization on the casting surface feature map so as to obtain an optimized casting surface feature map; and the casting appearance classification unit is used for enabling the optimized casting surface feature map to pass through a classifier to obtain a classification result, and the classification result is used for indicating whether the casting appearance meets the preset requirement or not.
In particular, in the technical scheme of the present application, when fusing the casting surface image local feature map and the casting surface image global feature map to obtain the casting surface feature map, it is considered that the two maps respectively express the local and global image semantic features of the gray surface captured image, so their feature expressions differ in spatial feature correlation scale. Feature fusion in the spatial dimension of the feature matrices, in a manner similar to weighted point-wise addition, is therefore unsuitable. The applicant of the present application instead cascades the casting surface image local feature map and the casting surface image global feature map along the channel dimension, performing feature fusion in the channel correlation dimension of the feature extraction model.
However, since the feature matrices of the casting surface feature map in the channel dimension express spatially correlated features of the image semantics at different scales, there are manifold geometric differences between the feature manifold expressions of the feature matrices. This leads to poor manifold geometric continuity among the features of the casting surface feature map, which affects the accuracy of the classification result obtained by the classifier.
The applicant of the present application therefore performs a channel-dimension traversing manifold convex optimization of the feature map on each feature matrix, denoted M_i, of the casting surface feature map along the channel dimension, using the following optimization formula, to obtain the optimized casting surface feature map; wherein, in the optimization formula:
M_i is the feature matrix of the casting surface feature map along the channel dimension; V_t1[GAP(F)] and V_t2[GAP(F)] are, respectively, the column vector and the row vector obtained by linear transformation of the global average pooling vector formed by the global averages of all the feature matrices of the casting surface feature map; ||·||_2 represents the spectral norm of a matrix; ⊗ represents vector multiplication; and M'_i is each feature matrix of the optimized casting surface feature map along the channel dimension.
Here, the channel-dimension traversing manifold convex optimization of the casting surface feature map determines the base dimension of the feature matrix manifold by structuring the direction of maximum distribution density of the modulated feature matrices, and, by stacking the base dimensions of the traversing manifold along the channel direction of the casting surface feature map, constrains the continuity of the traversing manifold represented by each feature matrix M_i. In this way, the optimized feature matrices M'_i improve the geometric continuity of the high-dimensional feature manifold of the casting surface feature map they form, thereby improving the accuracy of the classification result obtained by the classifier. As a result, after the casting is cast, its appearance quality can be detected automatically, improving the automation level of the casting process, reducing interference from human factors, and optimizing casting quality and production efficiency.
Further, the casting appearance classification unit includes: a matrix expansion subunit, configured to expand the optimized casting surface feature map into a classification feature vector according to a row vector or a column vector; a full-connection coding subunit, configured to perform full-connection coding on the classification feature vector by using multiple full-connection layers of the classifier to obtain a coded classification feature vector; and the classification subunit is used for passing the coding classification feature vector through a Softmax classification function of the classifier to obtain the classification result.
And further, the optimized casting surface characteristic diagram is passed through a classifier to obtain a classification result, wherein the classification result is used for indicating whether the appearance of the casting meets the preset requirement.
That is, classification processing is performed on the fused multi-scale associated feature information concerning the casting appearance quality, so as to classify and detect the quality of the casting appearance. Specifically, in the technical scheme of the present application, the labels of the classifier comprise "the casting appearance meets the predetermined requirement" (first label) and "the casting appearance does not meet the predetermined requirement" (second label), and the classifier determines, through a Softmax function, which classification label the casting surface feature map belongs to. It should be noted that the first label p1 and the second label p2 do not involve a manually set concept; in fact, during training, the computer model has no concept of whether the casting appearance meets the predetermined requirement. There are simply two classification labels, and the output feature has a probability under each, i.e., the sum of p1 and p2 is one. The classification result of whether the casting appearance meets the predetermined requirement is thus actually converted, through the classification labels, into a classification probability distribution conforming to natural law; in essence, the physical meaning of the natural probability distribution of the labels is used rather than the linguistic meaning of "whether the casting appearance meets the predetermined requirement". It should be understood that, in the technical scheme of the present application, the classification labels of the classifier are detection evaluation labels of whether the casting appearance meets the predetermined requirement, so that after the classification result is obtained, the appearance quality of the casting can be detected automatically based on the classification result.
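The matrix-expansion, fully connected encoding, and Softmax steps of the classification unit can be sketched as below. The layer sizes and random weights are placeholders for illustration, not the trained classifier; the only property demonstrated is that the two label probabilities p1 and p2 sum to one.

```python
import numpy as np

rng = np.random.default_rng(0)

def classify(feature_map, w1, b1, w2, b2):
    """Flatten -> fully connected + ReLU -> fully connected -> Softmax,
    matching the matrix expansion, full-connection coding, and
    classification subunits described above."""
    v = feature_map.reshape(-1)           # expand the map into a vector
    h = np.maximum(w1 @ v + b1, 0.0)      # fully connected encoding
    logits = w2 @ h + b2                  # two logits: meets / fails spec
    e = np.exp(logits - logits.max())     # numerically stable Softmax
    return e / e.sum()                    # p1 + p2 = 1

fmap = rng.standard_normal((64, 8, 8))    # hypothetical optimized feature map
w1, b1 = rng.standard_normal((16, 64 * 8 * 8)), np.zeros(16)
w2, b2 = rng.standard_normal((2, 16)), np.zeros(2)
probs = classify(fmap, w1, b1, w2, b2)
print(probs.sum())   # 1.0 — a probability distribution over the two labels
```

The decision rule is then simply to compare p1 against p2 (or against a threshold) to report whether the casting appearance meets the predetermined requirement.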
In summary, the automated control system 100 for casting according to the embodiment of the present application is illustrated, which collects a surface shot image of a casting by using a camera after casting the casting, and processes and analyzes the surface shot image by using an image analysis algorithm, so as to automatically detect the appearance quality of the casting.
As described above, the automated control system 100 for casting according to the embodiment of the present application may be implemented in various terminal devices, such as a server or the like for automated control of casting. In one example, the automated control system 100 for casting according to an embodiment of the present application may be integrated into the terminal device as a software module and/or hardware module. For example, the automated control system 100 for casting may be a software module in the operating system of the terminal device, or may be an application developed for the terminal device; of course, the automated control system 100 for casting may likewise be one of a number of hardware modules of the terminal device.
Alternatively, in another example, the automated control system for casting 100 and the terminal device may be separate devices, and the automated control system for casting 100 may be connected to the terminal device through a wired and/or wireless network and transmit interactive information in an agreed data format.
Fig. 3 is a flow chart of an automated control method for casting provided in an embodiment of the present application. Fig. 4 is a schematic diagram of a system architecture of an automated control method for casting provided in an embodiment of the present application. As shown in fig. 3 and 4, an automated control method for casting includes: 210, acquiring a surface shooting image of the casting through a camera; 220, extracting image features of the surface shot image to obtain a casting surface feature map; and, 230, determining whether the appearance of the casting meets the preset requirement based on the casting surface characteristic diagram.
Specifically, in the automatic control method for casting, the image feature extraction is performed on the surface photographed image to obtain a casting surface feature map, which includes: carrying out gray scale processing on the surface shooting image to obtain a gray scale surface shooting image; carrying out local feature analysis on the gray surface shooting image to obtain a casting surface image local feature map; performing global analysis on the gray surface shooting image to obtain a casting surface image global feature map; and fusing the local feature map of the casting surface image and the global feature map of the casting surface image to obtain the casting surface feature map.
It will be appreciated by those skilled in the art that the specific operation of the various steps in the above-described automated control method for casting has been described in detail in the above description of the automated control system for casting with reference to fig. 1 to 2, and thus, repetitive descriptions thereof will be omitted.
Fig. 5 is an application scenario diagram of an automated control system for casting according to an embodiment of the present application. As shown in fig. 5, in this application scenario, first, a surface-captured image (e.g., C as illustrated in fig. 5) of a casting (e.g., M as illustrated in fig. 5) is captured by a camera; the acquired surface captured images are then input into a server (e.g., S as illustrated in fig. 5) deployed with an automated control algorithm for casting, wherein the server is capable of processing the surface captured images based on the automated control algorithm for casting to determine whether the casting appearance meets predetermined requirements.
The foregoing description of the embodiments has been provided to illustrate the general principles of the application and is not intended to limit the application to the particular embodiments disclosed; any modifications, equivalents, improvements, and the like that fall within the spirit and principles of the application are intended to be included within the scope of the application.

Claims (10)

1. An automated control system for casting, comprising:
the image acquisition module is used for acquiring a surface shooting image of the casting through the camera;
the image feature analysis module is used for extracting image features of the surface shot image to obtain a casting surface feature map; and
and the casting appearance detection module is used for determining whether the casting appearance meets the preset requirement or not based on the casting surface feature map.
2. The automated control system for casting of claim 1, wherein the image feature analysis module comprises:
the image graying unit is used for carrying out gray processing on the surface shooting image to obtain a gray surface shooting image;
the casting surface local feature extraction unit is used for carrying out local feature analysis on the gray surface shooting image so as to obtain a casting surface image local feature map;
the casting surface global feature extraction unit is used for carrying out global analysis on the gray surface shooting image to obtain a casting surface image global feature map;
and the multi-scale feature fusion unit is used for fusing the local feature map of the casting surface image and the global feature map of the casting surface image to obtain the casting surface feature map.
3. The automated control system for casting of claim 2, wherein the casting surface local feature extraction unit is configured to: and passing the gray surface shooting image through an image local feature extractor based on a convolutional neural network model to obtain the casting surface image local feature map.
4. An automated control system for casting according to claim 3, wherein the casting surface global feature extraction unit is configured to: and carrying out feature extraction on the local feature map of the casting surface image by a feature perception enhancer based on a deep neural network model so as to obtain the global feature map of the casting surface image.
5. The automated control system for casting of claim 4, wherein the deep neural network model is a non-local neural network model.
6. The automated control system for casting of claim 5, wherein the casting appearance detection module comprises:
the feature distribution optimizing unit is used for carrying out feature distribution optimization on the casting surface feature map so as to obtain an optimized casting surface feature map;
and the casting appearance classification unit is used for enabling the optimized casting surface feature map to pass through a classifier to obtain a classification result, and the classification result is used for indicating whether the casting appearance meets the preset requirement or not.
7. The automated control system for casting of claim 6, wherein the feature distribution optimization unit is configured to: performing channel dimension traversing flow form convex optimization of the feature map on each feature matrix of the casting surface feature map along the channel dimension by using the following optimization formula to obtain the optimized casting surface feature map;
wherein, the optimization formula is:
wherein M_i is the feature matrix of the casting surface feature map along the channel dimension; V_t1[GAP(F)] and V_t2[GAP(F)] are, respectively, the column vector and the row vector obtained by linear transformation of the global average pooling vector formed by the global averages of all the feature matrices of the casting surface feature map; ||·||_2 represents the spectral norm of a matrix; ⊗ represents vector multiplication; and M'_i is each feature matrix of the optimized casting surface feature map along the channel dimension.
8. The automated control system for casting of claim 7, wherein the casting appearance classification unit comprises:
a matrix expansion subunit, configured to expand the optimized casting surface feature map into a classification feature vector according to a row vector or a column vector;
a full-connection coding subunit, configured to perform full-connection coding on the classification feature vector by using multiple full-connection layers of the classifier to obtain a coded classification feature vector; and
and the classification subunit is used for passing the coding classification feature vector through a Softmax classification function of the classifier to obtain the classification result.
9. An automated control method for casting, comprising:
collecting a surface shooting image of the casting through a camera;
extracting image features of the surface shooting image to obtain a casting surface feature map; and
and determining whether the appearance of the casting meets the preset requirement or not based on the casting surface characteristic diagram.
10. The automated casting control method of claim 9, wherein performing image feature extraction on the surface captured image to obtain a casting surface feature map comprises:
carrying out gray scale processing on the surface shooting image to obtain a gray scale surface shooting image;
carrying out local feature analysis on the gray surface shooting image to obtain a casting surface image local feature map;
performing global analysis on the gray surface shooting image to obtain a casting surface image global feature map;
and fusing the local feature map of the casting surface image and the global feature map of the casting surface image to obtain the casting surface feature map.
CN202310880918.9A 2023-07-18 2023-07-18 Automated control system for casting and method thereof Pending CN116977299A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310880918.9A CN116977299A (en) 2023-07-18 2023-07-18 Automated control system for casting and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310880918.9A CN116977299A (en) 2023-07-18 2023-07-18 Automated control system for casting and method thereof

Publications (1)

Publication Number Publication Date
CN116977299A true CN116977299A (en) 2023-10-31

Family

ID=88480794

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310880918.9A Pending CN116977299A (en) 2023-07-18 2023-07-18 Automated control system for casting and method thereof

Country Status (1)

Country Link
CN (1) CN116977299A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117035669A (en) * 2023-08-14 2023-11-10 河南鑫安利职业健康科技有限公司 Enterprise safety production management method and system based on artificial intelligence
CN117348574A (en) * 2023-11-24 2024-01-05 佛山市时力涂料科技有限公司 Intelligent control system and method for paint production line
CN118321203A (en) * 2024-05-14 2024-07-12 交通运输部公路科学研究所 Robot remote control system and control method
CN119477039A (en) * 2024-06-20 2025-02-18 南通中旺包装材料有限公司 A luggage production data identification and processing system


Similar Documents

Publication Publication Date Title
CN116977299A (en) Automated control system for casting and method thereof
CN117974665B (en) Metal mold on-line detection method and equipment based on computer vision
US12254383B2 (en) Intelligent real-time defect prediction, detection, and AI driven automated correction solution
CN114387233B (en) A sand mold defect detection method and system based on machine vision
CN117392097A (en) Additive manufacturing process defect detection method and system based on improved YOLOv8 algorithm
CN104914111A (en) Strip steel surface defect on-line intelligent identification and detection system and detection method
CN108109137A (en) The Machine Vision Inspecting System and method of vehicle part
CN114898249A (en) Method, system and storage medium for confirming number of articles in shopping cart
CN117252864B (en) Steel production device smoothness detection system based on identification analysis
Zhao et al. Toward intelligent manufacturing: label characters marking and recognition method for steel products with machine vision
CN109409289A (en) A kind of electric operating safety supervision robot security job identifying method and system
CN113487538A (en) Multi-target segmentation defect detection method and device and computer storage medium thereof
CN118202383A (en) Defect detection method, defect detection system and defect detection program
CN113781432A (en) Laser scanning automatic laying on-line detection method and device based on deep learning
CN116050678A (en) Die-casting product processing test system and method based on cloud computing
CN117409005A (en) Defective product detection system and method for plate receiving machine based on image
CN115880209A (en) Surface defect detection system and method, equipment, and medium applicable to steel plates
Singh et al. A novel real-time quality control system for 3D printing: A deep learning approach using data efficient image transformers
CN115410044B (en) Machine vision-based zinc spangle rating methods, devices, terminals, and media
JP7782531B2 (en) Method for estimating chemical composition of scrap materials
CN116597364B (en) Image processing method and device
CN118195367A (en) Slab production process monitoring system based on digital twin
Chigateri et al. Recognition and classification of casting defects using the CNN algorithm
CN117058089A (en) Cigarette Appearance Detection Method
Cavaliere Comparative use of systems to detect surface defects in die-cast components using advanced vision systems applying artificial intelligence

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination