
WO2005036620A1 - Exposure method, exposure apparatus, and device manufacturing method - Google Patents

Exposure method, exposure apparatus, and device manufacturing method

Info

Publication number
WO2005036620A1
WO2005036620A1 (PCT/JP2004/014727; JP2004014727W)
Authority
WO
WIPO (PCT)
Prior art keywords
stage
exposure
correction value
scanning direction
mask
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2004/014727
Other languages
English (en)
Japanese (ja)
Inventor
Hideyuki Hashimoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp filed Critical Nikon Corp
Priority to JP2005514576A priority Critical patent/JPWO2005036620A1/ja
Publication of WO2005036620A1 publication Critical patent/WO2005036620A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03FPHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
    • G03F9/00Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically
    • G03F9/70Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically for microlithography
    • G03F9/7092Signal processing
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03FPHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
    • G03F7/00Photomechanical, e.g. photolithographic, production of textured or patterned surfaces, e.g. printing surfaces; Materials therefor, e.g. comprising photoresists; Apparatus specially adapted therefor
    • G03F7/70Microphotolithographic exposure; Apparatus therefor
    • G03F7/70691Handling of masks or workpieces
    • G03F7/70716Stages
    • G03F7/70725Stages control

Definitions

  • Exposure method, exposure apparatus, and device manufacturing method
  • The present invention relates to an exposure method, an exposure apparatus, and a device manufacturing method. More particularly, it relates to an exposure method and an exposure apparatus used in a lithographic process for manufacturing electronic devices such as semiconductor elements and liquid crystal display elements, and to a device manufacturing method that uses the exposure method in such a lithographic process.
  • In such a lithographic process, a step-and-scan type scanning exposure apparatus (also called a scanner or a scanning stepper) is becoming mainstream. In this apparatus, an ultraviolet pulse laser beam having a wavelength of 248 nm from a KrF excimer laser or an ultraviolet pulse laser beam having a wavelength of 193 nm from an ArF excimer laser is used as illumination light, and a mask or reticle on which a circuit pattern has been drawn (hereinafter collectively referred to as a “reticle”) and a wafer as a photosensitive object are one-dimensionally scanned relative to the projection field of view of a reduction projection optical system, so that the entire circuit pattern of the reticle is transferred within one shot area on the wafer; such a scanning exposure operation and a stepping operation between shot areas are repeated.
  • With this type of scanning stepper, circuit devices such as 256M (mega) bit D-RAMs with a minimum line width of about 0.25 μm can be mass-produced. Further, exposure apparatuses for mass-producing next-generation circuit devices of 1G (giga) bits or more are being developed.
  • In a scanning exposure apparatus, exposure is performed while a reticle (reticle stage) and a wafer (wafer stage) are moved synchronously in a scanning direction while maintaining a speed ratio corresponding to the projection magnification of the projection optical system, and the pattern formed on the reticle is transferred to a plurality of shot areas on the wafer.
  • In such an apparatus, a dynamic factor such as a synchronization error between the reticle stage and the wafer stage during scanning exposure causes a positional shift (or distortion) of the pattern image transferred onto the wafer and a deterioration of resolution. Therefore, in a scanning exposure apparatus, it is desirable to keep the synchronization error between the two stages during scanning exposure as small as possible, and various proposals have been made to reduce this synchronization error (for example, see Patent Document 1).
  • Iterative learning control (ILC) has recently attracted attention as a technique that can reduce the synchronization error between the reticle stage and the wafer stage of a scanning exposure apparatus.
  • In a scanning exposure apparatus, when sequentially transferring a reticle pattern to a plurality of shot areas on a wafer, the reticle is usually scanned alternately (reciprocally) in order to improve throughput, so that the next shot area is exposed with the scanning direction reversed. For this reason, after the transfer of the reticle pattern to one shot area is completed, the reticle must be moved beyond the exposure end position by the same distance as it moved during the prescan before the start of exposure (the acceleration time up to the target speed, i.e., the scanning speed at the time of exposure, plus the settling time during which the speed converges to within a predetermined error range of the target speed), in order to return the reticle to the scan start position for the exposure of the next shot area. On the wafer side as well, a movement operation between shot areas that includes a movement in the scanning direction is required.
  • Patent Document 1 JP-A-10-270343
  • A first object of the present invention is to provide an exposure method and an exposure apparatus capable of reducing the synchronization error between a mask and a photosensitive object that is unique to each exposure apparatus, and of accurately transferring the pattern formed on the mask onto the photosensitive object.
  • A second object of the present invention is to provide an exposure method that makes it possible to transfer a pattern to each partitioned area on a photosensitive object with high precision while suppressing the displacement of the object stage in the non-scanning direction caused by the movement between the partitioned areas.
  • a third object of the present invention is to provide an exposure method and an exposure apparatus that can control an object stage according to operating conditions while correcting the position of the object stage.
  • a fourth object of the present invention is to provide an exposure apparatus that can improve the effect of the repetitive learning control when the repetitive learning control is applied to an object stage, a mask stage, or the like.
  • The first exposure method of the present invention is an exposure method in which a mask stage holding a mask and an object stage holding a photosensitive object are synchronously moved in a predetermined scanning direction and a pattern formed on the mask is transferred onto the photosensitive object, the method comprising: a first step of acquiring, by iterative learning control, a second correction value group for making a position deviation, which is the difference between a target position of the mask stage corresponding to the current position of the object stage and the current position of the mask stage, asymptotically approach zero, while synchronously moving the mask stage and the object stage in consideration of a first correction value group that makes a position deviation, which is the difference between a target position of the object stage and its current position, gradually approach zero; and a second step of synchronously moving the mask stage and the object stage in the scanning direction while correcting the positions of the object stage and the mask stage using the first correction value group and the second correction value group, thereby transferring the pattern formed on the mask onto the photosensitive object.
  • According to this method, while the mask stage and the object stage are synchronously moved in consideration of a first correction value group for making the position deviation, which is the difference between the target position of the object stage and its current position, asymptotically approach zero (this group may be an experimentally obtained value), a second correction value group for making the position deviation, which is the difference between the target position of the mask stage corresponding to the current position of the object stage and the current position of the mask stage, asymptotically approach zero is acquired by iterative learning control (first step). Then, the positions of the object stage and the mask stage are corrected using the first correction value group and the second correction value group, respectively, while the pattern formed on the mask is transferred onto the photosensitive object (second step). That is, at the time of scanning exposure, the positions of the object stage and the mask stage are corrected using the first correction value group, which makes the position deviation of the object stage asymptotically approach zero, and the second correction value group, which makes the position deviation of the mask stage corresponding to the current position of the object stage asymptotically approach zero, so the synchronization error between the two stages is effectively reduced.
  • The scanning exposure is thus performed in a state where this error is reduced.
  • Moreover, since the second correction value group for making the tracking error of the mask stage with respect to the object stage asymptotically approach zero is obtained by iterative learning control performed in advance, the synchronization error between the mask and the photosensitive object that is specific to the exposure apparatus to which this exposure method is applied can be reliably reduced, and the pattern formed on the mask can be accurately transferred onto the photosensitive object.
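  • For illustration only, the following minimal Python sketch shows how pre-learned correction value groups of this kind might be applied as feedforward terms during scanning exposure; all names (apply_scan_corrections, u_ff_wafer, u_ff_reticle) are hypothetical and not taken from the patent.

        def apply_scan_corrections(wafer_targets, wafer_positions,
                                   reticle_targets, reticle_positions,
                                   u_ff_wafer, u_ff_reticle):
            """Return per-sample operation signals for both stage controllers.

            u_ff_wafer: first correction value group (object stage), learned in advance.
            u_ff_reticle: second correction value group (mask stage), learned in advance.
            """
            wafer_cmds, reticle_cmds = [], []
            for k in range(len(wafer_targets)):
                e_w = wafer_targets[k] - wafer_positions[k]      # object stage position deviation
                e_r = reticle_targets[k] - reticle_positions[k]  # mask stage position deviation
                wafer_cmds.append(e_w + u_ff_wafer[k])           # deviation plus stored correction
                reticle_cmds.append(e_r + u_ff_reticle[k])
            return wafer_cmds, reticle_cmds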
  • In this case, a third step of acquiring the first correction value group by iterative learning control while moving the object stage in the same manner as in the first step may further be included.
  • In the first step, iterative learning control for obtaining the first correction value group may be performed in parallel during the synchronous movement. That is, prior to exposure, not only the second correction value group but also the first correction value group can be acquired by iterative learning control while the mask stage and the object stage are synchronously moved.
  • The second exposure method of the present invention is an exposure method in which an object stage holding a photosensitive object is moved in a predetermined scanning direction and a plurality of divided areas on the photosensitive object are respectively exposed to form a predetermined pattern in each divided area. In this method, prior to the actual exposure operation, the object stage is moved in the scanning direction and the non-scanning direction in the same manner as during the movement operation between divided areas that is performed between the exposure of one divided area and the exposure of the next divided area, and a first correction value group for making the position deviation, which is the difference between the target position of the object stage in the non-scanning direction and its current position, asymptotically approach zero is obtained by iterative learning control (first step).
  • The first correction value group obtained in this way differs from a correction value group obtained by iterative learning control in which the object stage is moved only in the non-scanning direction, in that the influence of the movement of the object stage in the scanning direction on its movement in the non-scanning direction is taken into account; as a result, a correction value group closer to the actual movement operation of the object stage between the partitioned areas is obtained.
  • Moreover, since the first correction value group is obtained by iterative learning control, the positional error of the object stage in the non-scanning direction caused by the movement between the divided areas, which is unique to the exposure apparatus to which this exposure method is applied, can be reliably reduced.
  • Then, the object stage is moved in the scanning direction before and after the movement operation between the divided areas, and exposure is performed to form a pattern in each divided area on the photosensitive object (second step). Therefore, when the movement operation of the object stage between the divided areas ends and the exposure of the next divided area is started, the displacement of the object stage in the non-scanning direction caused by the movement operation between the divided areas has been almost completely and reliably corrected, and exposure is performed in this state.
  • In this case, in the second step, the position of the object stage in the scanning direction can be corrected in consideration of a second correction value group that makes the position deviation, which is the difference between the target position of the object stage in the scanning direction and its current position, asymptotically approach zero.
  • an experiment or the like for obtaining the second correction value group may be performed prior to the first step.
  • Alternatively, a third step of acquiring the second correction value group by iterative learning control while moving the object stage in the scanning direction in the same manner as during the movement in the scanning direction may further be included.
  • The third exposure method of the present invention is an exposure method in which an object stage holding a photosensitive object is moved in a predetermined scanning direction and a plurality of divided areas on the photosensitive object are respectively exposed to form a predetermined pattern in each divided area, the method including an exposure step of controlling the object stage according to an operating condition while correcting the position of the object stage based on a correction value group corresponding to the operating condition, selected from a plurality of correction value groups for making a position deviation, which is the difference between a target position of the object stage and its current position, asymptotically approach zero.
  • According to this method, the object stage is controlled in accordance with the operating condition while the position of the object stage is corrected based on the correction value group corresponding to the operating condition, selected from a plurality of correction value groups for making the position deviation, which is the difference between the target position of the object stage and its current position, asymptotically approach zero. That is, a plurality of operating conditions are usually set for the object stage in the exposure process; for each of these operating conditions, a correction value group for making the position deviation asymptotically approach zero is obtained in advance by experiment or by iterative learning control, and in the actual exposure process the correction value group corresponding to the operating condition at that time is selected, and the object stage is controlled according to the operating condition while its position is corrected based on the selected correction value group. Therefore, it is possible to control the object stage according to the operating condition while correcting its position, regardless of the operating condition.
  • In this case, the plurality of correction value groups may include correction value groups individually corresponding to the operation pattern of the object stage when continuously exposing a plurality of partitioned areas in the same row and to the operation pattern of the object stage when continuously exposing partitioned areas in different rows, or may include correction value groups individually corresponding to a plurality of operation sequences of the object stage. The plurality of correction value groups may also include correction value groups individually corresponding to a scanning operation and a step operation of the object stage in the scanning direction and to a step operation in a non-scanning direction orthogonal to the scanning direction.
  • Each of the correction value groups may be determined in advance by performing an experiment or the like for each operating condition, or may be obtained by iterative learning control performed in advance for each operating condition.
  • In this case, the predetermined pattern is formed on a mask, and at the time of exposure the mask stage holding the mask and the object stage are synchronously moved in the predetermined scanning direction, so that the predetermined pattern is transferred to each of the plurality of divided areas on the photosensitive object. Further, in the exposure step, the position of the mask stage may be corrected based on a correction value group that makes the position deviation, which is the difference between a target position of the mask stage and its current position, asymptotically approach zero.
  • In this case, the correction value group for correcting the position of the mask stage may be obtained in advance by an experiment or the like, or may be acquired by iterative learning control performed in advance.
  • In this case, a movement sequence of the two stages in which the two stages stop in the scanning direction between the exposure of one partitioned area and the exposure of the next partitioned area may be set as the operating condition, and the acceleration of the two stages can be started after a lapse of a predetermined time from the end of the exposure of the one partitioned area.
  • The first exposure apparatus of the present invention is an exposure apparatus in which a mask and a photosensitive object are synchronously moved in a predetermined scanning direction and a pattern formed on the mask is transferred onto the photosensitive object, the apparatus comprising: a mask stage on which the mask can be mounted and which is movable at least in the scanning direction; an object stage on which the photosensitive object can be mounted and which is movable at least in the scanning direction; an object stage control system that controls the object stage according to a position deviation, which is the difference between its target position and its current position, and that includes a first learning controller for acquiring, by iterative learning, a first correction value group for making that position deviation asymptotically approach zero; a mask stage control system that controls the mask stage according to a position deviation, which is the difference between its target position and its current position, where during the synchronous movement a command value corresponding to the current position of the object stage is given as the target position, and that includes a second learning controller for acquiring, by iterative learning, a second correction value group for making that position deviation asymptotically approach zero; and a control device that, according to setting conditions, sets the first and second learning controllers to a connected state or a disconnected state with respect to the corresponding control systems, sequentially stores the correction value group acquired by a learning controller set in the connected state, and, for a control system whose corresponding learning controller is set in the disconnected state, sequentially inputs the corresponding correction value group previously acquired by that learning controller as correction values of the position deviation.
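  • As a minimal sketch only (an assumption, not the patent's implementation), the control-device behavior just described could be organized as follows in Python; the class and method names are hypothetical.

        class ControlDevice:
            def __init__(self):
                self.stored = {"wafer": [], "reticle": []}          # correction value groups
                self.connect = {"wafer": False, "reticle": False}   # learning controllers disconnected by default

            def configure(self, connect_wafer_ilc, connect_reticle_ilc):
                # True  -> learning controller connected: record corrections during this run
                # False -> disconnected: replay the previously stored correction value group
                self.connect = {"wafer": connect_wafer_ilc, "reticle": connect_reticle_ilc}

            def correction(self, stage, ilc_output, k):
                """Return the correction value for sample k of the given stage ('wafer' or 'reticle')."""
                if self.connect[stage]:
                    # connected: store the value produced by the ILC controller and pass it through
                    if k == len(self.stored[stage]):
                        self.stored[stage].append(ilc_output)
                    else:
                        self.stored[stage][k] = ilc_output
                    return ilc_output
                # disconnected: sequentially feed back the stored correction value group
                return self.stored[stage][k] if k < len(self.stored[stage]) else 0.0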
  • According to this apparatus, when a setting condition is set such that the first learning controller is disconnected from the object stage control system and the second learning controller is connected to the mask stage control system, the control device performs the setting according to that condition, and the object stage control system and the mask stage control system carry out the synchronous movement of the mask stage and the object stage, that is, the tracking control of the mask stage with respect to the object stage. At this time, the corresponding correction value group previously acquired by the first learning controller (the first correction value group) is sequentially input to the object stage control system by the control device as correction values of the position deviation, and using this first correction value group the object stage control system corrects the position of the object stage so that its position deviation gradually approaches zero. Meanwhile, the second correction value group acquired by the second learning controller through iterative learning control is stored.
  • Also, when a setting condition is set such that the first learning controller is connected to the object stage control system and the second learning controller is connected to the mask stage control system, the control device performs the setting according to that condition, and the object stage control system and the mask stage control system carry out the synchronous movement of the mask stage and the object stage, that is, the follow-up control of the mask stage with respect to the object stage. At this time, the first correction value group acquired by the first learning controller through iterative learning control and the second correction value group acquired by the second learning controller through iterative learning control are stored.
  • Further, when a setting condition is set such that the first learning controller is disconnected from the object stage control system and the second learning controller is disconnected from the mask stage control system, the control device makes the settings in accordance with that condition, and the object stage control system and the mask stage control system carry out the synchronous movement of the mask stage and the object stage, during which the pattern is transferred onto the photosensitive object. At this time, the corresponding correction value group previously acquired by the first learning controller (the stored first correction value group) is sequentially input to the object stage control system by the control device as correction values of the position deviation, and the corresponding correction value group previously acquired by the second learning controller (the stored second correction value group) is sequentially input to the mask stage control system as correction values of the position deviation. This allows the object stage control system to correct the position of the object stage so that its position deviation gradually approaches zero using the first correction value group, and the mask stage control system to correct the position of the mask stage so that its position deviation gradually approaches zero using the second correction value group. As a result, scanning exposure is performed with the synchronization error between both stages effectively reduced, using the first and second correction value groups determined in advance.
  • In this case, the setting conditions can be such that a first condition for connecting the first learning controller to the object stage control system and a second condition for connecting the second learning controller to the mask stage control system are settable. Alternatively, the setting conditions can be such that a first condition for connecting the first and second learning controllers to the object stage control system and the mask stage control system, respectively, and a second condition for disconnecting the first and second learning controllers from the object stage control system and the mask stage control system, respectively, are settable.
  • The second exposure apparatus of the present invention is an exposure apparatus that moves a photosensitive object in a predetermined scanning direction and exposes a plurality of divided areas on the photosensitive object to form a predetermined pattern in each divided area, the apparatus comprising: an object stage on which the photosensitive object can be placed and which can be moved in a two-dimensional direction including the scanning direction and a non-scanning direction orthogonal to the scanning direction; and a stage control system that controls the object stage and that corrects the position of the object stage based on a correction value group corresponding to an operating condition, selected from a plurality of correction value groups for making a position deviation, which is the difference between the target position of the object stage and its current position, asymptotically approach zero.
  • According to this apparatus, the position of the object stage is corrected based on the correction value group corresponding to the operating condition. That is, a plurality of operating conditions are usually set for the object stage; for each of these operating conditions, a correction value group for making the position deviation, which is the difference between the target position of the object stage and its current position, asymptotically approach zero is prepared, the correction value group corresponding to the current operating condition is selected from the plurality of correction value groups, and the object stage is controlled according to the operating condition while its position is corrected based on the selected correction value group. Therefore, regardless of the operating condition, it is possible to control the object stage according to the operating condition while correcting the position of the object stage.
  • In this case, the plurality of correction value groups may include correction value groups individually corresponding to the operation pattern of the object stage when continuously exposing a plurality of partitioned areas in the same row and to the operation pattern of the object stage when continuously exposing partitioned areas in different rows, or may include correction value groups individually corresponding to a plurality of operation sequences of the object stage. The correction value groups may also include correction value groups individually corresponding to a scanning operation and a step operation of the object stage in the scanning direction and to a step operation in the non-scanning direction.
  • Each of the correction value groups may be obtained by repetitive learning control performed in advance for each operating condition.
  • In this case, when executing a movement sequence of the two stages in which the stages stop in the scanning direction between the exposure of one partitioned area and the exposure of the next partitioned area, the stage control system may start the acceleration of the two stages after a lapse of a fixed time from the end of the exposure of the one partitioned area.
  • In this case, the stage control system can further correct the position of the mask stage during synchronous movement with the object stage, based on a correction value group for making a position deviation, which is the difference between a target position of the mask stage and its current position, asymptotically approach zero.
  • the group of correction values for correcting the position of the mask stage may have been obtained by repetitive learning control performed in advance.
  • In this case, when executing a movement sequence of the two stages in which the two stages stop in the scanning direction between the exposure of one partitioned area and the exposure of the next partitioned area, the stage control system may start the acceleration of the two stages after a lapse of a fixed time from the end of the exposure of the one partitioned area.
  • The third exposure apparatus of the present invention is an exposure apparatus in which a mask and a photosensitive object are synchronously moved in a predetermined scanning direction and the pattern formed on the mask is transferred to a plurality of partitioned areas on the photosensitive object, the apparatus comprising: a mask stage on which the mask can be mounted and which is movable at least in the scanning direction; an object stage on which the photosensitive object can be mounted and which is movable in a two-dimensional direction including the scanning direction and a non-scanning direction orthogonal to the scanning direction; and a stage control system that controls the two stages and that, when executing a movement sequence of the two stages in which the two stages stop in the scanning direction between the exposure of one partitioned area and the exposure of the next partitioned area, starts the acceleration of the two stages after a lapse of a predetermined time from the end of the exposure of the one partitioned area.
  • According to this apparatus, the stage control system that controls the mask stage and the object stage stops both stages in the scanning direction between the exposure of one partitioned area and the exposure of the next partitioned area, and starts their acceleration after a lapse of a predetermined time from the end of the exposure of the one partitioned area.
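  • A minimal sketch, under stated assumptions, of such a stop-dwell-accelerate movement sequence is given below; stage_pair and its methods are hypothetical names, and the dwell time is an arbitrary placeholder.

        import time

        def move_to_next_shot(stage_pair, dwell_time_s=0.01):
            # bring both stages to rest in the scanning direction after the previous exposure
            stage_pair.stop_scanning_axes()          # hypothetical API
            # wait the predetermined time measured from the end of the previous exposure
            time.sleep(dwell_time_s)
            # only then start the prescan acceleration toward the next partitioned area
            stage_pair.start_acceleration()          # hypothetical API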
  • In view of still another aspect, the present invention is a device manufacturing method that uses any of the first to third exposure methods of the present invention in a lithographic process.
  • FIG. 1 is a view schematically showing a configuration of an exposure apparatus according to an embodiment of the present invention.
  • FIG. 2 (A) is a plan view showing a reticle stage.
  • FIG. 2 (B) is a plan view showing a wafer stage.
  • FIG. 3 is a block diagram showing a stage control system of the exposure apparatus of the embodiment.
  • FIG. 4 (A) is a plan view showing a relationship between a slit-shaped illumination area and a shot area S on a wafer inscribed in an effective field of the projection optical system.
  • FIG. 4 (B) is a diagram showing the relationship between stage movement time and stage speed.
  • FIG. 5 is a flowchart showing a processing algorithm of main controller 50 in FIG. 1.
  • FIG. 6 is a view schematically showing a movement locus of an illumination slit center when exposing a plurality of shot areas on a wafer W by the exposure apparatus of the embodiment.
  • FIG. 7 is a diagram showing the trajectory along which the center P of the illumination slit ST passes over each shot area on the wafer when shot areas S1, S2, and S3 are sequentially exposed.
  • FIG. 8 is a diagram showing a speed curve of a wafer stage in a movement operation in a first mode.
  • FIG. 9 is a diagram showing a speed curve of a wafer stage in a movement operation in a second mode.
  • FIG. 10 is a diagram showing a speed curve of a wafer stage in a movement operation in a third mode.
  • FIG. 11 is a flowchart illustrating an embodiment of a device manufacturing method.
  • FIG. 12 is a flowchart showing a specific example of step 204 in FIG. 11.
  • FIG. 1 schematically shows an entire configuration of an exposure apparatus 10 according to an embodiment of the present invention.
  • the exposure apparatus 10 is a projection exposure apparatus that performs an exposure operation by a step-and-scan method, which is currently becoming the mainstream as a lithography apparatus for manufacturing semiconductor elements.
  • The exposure apparatus 10 projects an image of a part of a circuit pattern formed on a reticle R serving as a mask onto a wafer W serving as a photosensitive object via a projection optical system PL, and scans the reticle R and the wafer W relative to the field of view of the projection optical system PL in a one-dimensional direction (here, the Y-axis direction, which is the horizontal direction in FIG. 1), thereby transferring the entire circuit pattern of the reticle R onto each of a plurality of shot areas on the wafer W by the step-and-scan method.
  • The exposure apparatus 10 includes an illumination unit ILU, a reticle stage RST as a mask stage, a projection optical system PL, a wafer stage WST as an object stage, and a control system for these components.
  • The illumination unit ILU is connected to an exposure light source (not shown), which is composed of a pulse laser light source such as a KrF excimer laser having an output wavelength of 248 nm or an ArF excimer laser having an output wavelength of 193 nm, via a light-sending optical system called a BMU (beam matching unit) that includes an optical system for adjusting the optical axis.
  • Pulsed laser light in the ultraviolet region (hereinafter also referred to as “excimer laser light”, “pulse illumination light”, or “pulse ultraviolet light”) from the light source is used as the illumination light for exposure (hereinafter referred to as “illumination light IL” as appropriate). The illumination light IL is used to obtain a pattern resolution of about 0.25 to 0.10 μm, the minimum line width required for the mass production of microcircuit devices having an integration degree and fineness equivalent to or higher than that of semiconductor memory elements (D-RAMs) of the 256M (mega) to 4G (giga) bit class. A laser light source that outputs pulsed laser light in the vacuum ultraviolet range, such as an F2 laser, may therefore also be used as the light source.
  • In addition, the illumination unit ILU includes an illumination system housing and an illumination optical system housed inside the housing in a predetermined positional relationship, the illumination optical system including an illuminance-uniformizing optical system containing an optical integrator, a relay lens, a variable ND filter, a variable field stop (also called a reticle blind or masking blade), a dichroic mirror, and the like (none of which are shown).
  • An illumination optical system similar to that of the present embodiment is disclosed in, for example, Japanese Patent Application Laid-Open No. 2001-313250 and US Patent Application Publication No. 2003/0025890 corresponding thereto.
  • the illumination optical system may be configured similarly to the illumination optical system disclosed in, for example, 534,970.
  • A fly-eye lens, an internal reflection type integrator (such as a rod integrator), or a diffractive optical element is used as the optical integrator.
  • On the reticle R, a slit-shaped illumination area (a rectangular illumination area elongated in the X-axis direction) RA (see FIG. 2(A)) is illuminated with illumination light IL of substantially uniform illuminance.
  • the reticle stage RST is arranged below the illumination unit ILU, as shown in FIG.
  • The reticle stage RST is driven by a reticle drive system 29 including an actuator such as a linear motor; it is linearly driven with a large stroke in the Y-axis direction on a reticle base surface plate (not shown), and is also finely driven in the X-axis direction, the Y-axis direction, and the θz direction (the rotation direction around the Z-axis).
  • the reticle R is sucked and held on the reticle stage RST.
  • A movable mirror 31 that reflects a laser beam from a reticle laser interferometer (hereinafter abbreviated as “reticle interferometer”) 30 is fixed to reticle stage RST, and the position of reticle stage RST is always detected by the reticle interferometer 30 with a resolution of, for example, about 0.5 to 1 nm.
  • More specifically, an X-axis movable mirror 31x extending in the Y-axis direction is fixed to the +X side end of reticle stage RST, and two Y-axis movable mirrors 31y, each composed of a retro-reflector, are fixed to the end on the +Y side. The movable mirror 31x is irradiated with a laser beam LR parallel to the X-axis, and the two movable mirrors 31y are each irradiated with a laser beam LR parallel to the Y-axis. These laser beams are supplied from the reticle interferometer 30 in FIG. 1, and the beams reflected by the Y-axis movable mirrors are returned by reflection mirrors 39A and 39B, respectively. That is, the Y-axis interferometer for the reticle is a double-pass interferometer, so that the laser beams are not displaced even if the reticle stage RST rotates.
  • reference numeral RA denotes a slit-shaped illumination area on the reticle R.
  • As described above, reticle stage RST has the X-axis movable mirror 31x and the two Y-axis movable mirrors 31y fixed thereto, and correspondingly the reticle interferometer 30 is also provided with three axes of laser interferometers; in FIG. 1, these are representatively shown as the movable mirror 31 and the reticle interferometer 30.
  • Note that the end surface of reticle stage RST may be mirror-finished to form a reflection surface (for example, a surface equivalent to the reflection surface of movable mirror 31x).
  • The output of the reticle interferometer 30 is supplied to the reticle stage control unit 33, to the synchronous control unit 80 serving as a control device, and to the main control device 50 via the synchronous control unit 80.
  • the reticle stage control unit 33 basically controls the reticle drive system 29 that drives the reticle stage RST such that the position information output from the reticle interferometer 30 matches the command value (target position).
  • As the projection optical system PL, a 1/4 (or 1/5) reduction refraction optical system is used that is telecentric on both the object plane (reticle R) side and the image plane (wafer W) side, has a circular projection field, and consists only of refractive optical elements (lens elements) using quartz or fluorite as the optical glass material.
  • the direction of the optical axis AX of the projection optical system PL is the Z-axis direction.
  • Therefore, the imaging light flux from the portion of the circuit pattern area on the reticle R illuminated by the pulsed ultraviolet light passes through the projection optical system PL and is projected, reduced to 1/4 or 1/5, onto the resist layer on the wafer W held by suction on the wafer stage WST described later.
  • As the projection optical system PL, a catadioptric system combining refractive optical elements with reflective optical elements (a concave mirror, a beam splitter, and the like) may also be used.
  • The wafer stage WST is levitated and supported above a base (not shown) with a predetermined clearance by gas static pressure bearings (not shown) provided on its bottom surface, is driven freely in the X-axis and Y-axis directions by a wafer drive system 48 including an actuator such as a linear motor, and is also minutely driven in the Z-axis direction, the θz direction, the θx direction (the rotation direction around the X-axis), and the θy direction (the rotation direction around the Y-axis).
  • the wafer driving system 48 is controlled by a wafer stage control unit 78.
  • a wafer holder (not shown) having a substantially circular shape is provided on wafer stage WST, and wafer W is electrostatically attracted to this wafer holder, and is flattened and held.
  • the temperature of the wafer holder is controlled in order to suppress expansion and deformation due to heat accumulation during exposure of the wafer w.
  • Although not shown in FIG. 1, a focus/leveling detection system for detecting a deviation (focus error) and an inclination (leveling error) in the Z-axis direction between the imaging plane of the projection optical system PL and the surface of the wafer W is provided near the projection optical system PL, and the wafer stage control unit 78 outputs drive commands to the wafer drive system 48 in response to the focus error signal and the leveling error signal from this focus/leveling detection system. An example of such a focus/leveling detection system is disclosed in detail in Japanese Patent Application Laid-Open No. 7-201699.
  • The output of the focus/leveling detection system is also supplied to the synchronous control unit 80 via the wafer stage control unit 78, and to the main controller 50 via the synchronous control unit 80.
  • the position of the wafer stage WST is sequentially measured by the laser interferometer system 76.
  • each of the Y-side and X-side end faces of the wafer stage WST is mirror-finished to form a reflection surface.
  • The X-side reflection surface of the wafer stage WST is irradiated with two laser beams LWX, LWX along optical paths parallel to the X-axis and separated by an interval D; these two laser beams are set at equal distances from the X-axis passing through the optical axis AX of the projection optical system PL. Similarly, the Y-side reflection surface of wafer stage WST is irradiated with two laser beams LWY, LWY along optical paths parallel to the Y-axis and separated by an interval D. These laser beams are supplied from the interferometers that constitute the laser interferometer system 76 of FIG. 1.
  • The outputs of the four-axis laser interferometer system 76 are supplied to the wafer stage control unit 78, to the synchronous control unit 80, and to the main controller 50 via the synchronous control unit 80.
  • The synchronous control unit 80 measures the X position of the wafer stage WST based on the average value of the outputs of the two X-axis interferometers that use the laser beams LWX, LWX as measurement axes, and calculates the rotation angle of the wafer stage WST in the XY plane based on the difference between the outputs of these interferometers and the interval D.
  • Similarly, a plurality of laser interferometers are provided on the wafer side; in FIG. 1, these are representatively shown as the laser interferometer system 76. It should be noted that movable mirrors each composed of a plane mirror may be provided instead of the above-described reflection surfaces formed on wafer stage WST.
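  • The measurement geometry just described reduces to two short formulas; the following Python sketch is illustrative only and the variable names (wx_a, wx_b, d) are assumptions.

        def stage_x_and_yaw(wx_a, wx_b, d):
            """X position and in-plane rotation from two parallel X-axis interferometer outputs
            separated by interval d (small-angle approximation)."""
            x_position = (wx_a + wx_b) / 2.0   # average of the two X-axis outputs
            theta_z = (wx_a - wx_b) / d        # rotation angle in the XY plane [rad]
            return x_position, theta_z

        # Example: outputs 100.0010 mm and 100.0000 mm with d = 50 mm
        # -> X = 100.0005 mm, theta_z = 0.0010 / 50 = 2e-5 rad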
  • On the wafer stage WST, a reference mark plate FM whose surface is set at substantially the same height as the surface of the wafer W is provided.
  • Various fiducial marks are formed on the surface of this fiducial mark plate FM. These fiducial marks are used for checking the detection center point of each alignment detection system (calibration), for measuring the distance (baseline) between that detection center point and the projection center of the projection optical system, for checking the position of the reticle R with respect to the wafer coordinate system, and for checking the position in the Z direction of the best imaging plane conjugate to the reticle R pattern surface.
  • a synchronous control unit 80 for synchronously moving reticle stage RST and wafer stage WST is provided in the control system.
  • When moving the reticle stage RST and the wafer stage WST synchronously, particularly during scanning exposure, the synchronous control unit 80 monitors in real time the position information of the reticle R measured by the reticle interferometer 30 and the position information of the wafer W measured by the laser interferometer system 76, and controls the reticle drive system 29 via the reticle stage control unit 33 and the wafer drive system 48 via the wafer stage control unit 78 so that the mutual relationship between the two positions is maintained at a predetermined relationship.
  • The synchronous control unit 80 is controlled by various commands and parameter setting information from the main controller 50. As described above, in the present embodiment, a stage control system for controlling both stages RST and WST (described further later) is configured by the synchronous control unit 80, the reticle stage control unit 33, and the wafer stage control unit 78.
  • The control system is actually constructed as a distributed system comprising a plurality of unit-side computers (microprocessors and the like) that individually control the above-described light source and each unit of the exposure apparatus main body (the illumination unit ILU, reticle stage RST, wafer stage WST, wafer transport system, and the like), and a main controller 50, such as a workstation, that collectively controls these unit-side computers. The plurality of unit-side computers cooperate with the main controller 50 to execute a series of exposure processing on a plurality of wafers.
  • the entire sequence of the series of exposure processing is controlled by the main controller 50 in accordance with a setting file of a predetermined exposure condition called a process program stored in a memory (not shown).
  • The process program is a package of parameters stored under an exposure processing file name created by the operator, and includes information on the wafers to be exposed (the number of wafers to be processed, shot area size, shot area array data, alignment mark arrangement data, alignment conditions, etc.), information on the reticle to be used (pattern type data, arrangement data of each mark, size of the circuit pattern area, etc.), and information on the exposure conditions (exposure amount, focus offset amount, scan speed offset amount, projection magnification offset amount, correction amounts for various aberrations and image distortion, the numerical aperture of the illumination optical system and the set value of the coherence factor (σ value), the numerical aperture set value of the projection optical system, etc.).
  • Main controller 50 decodes the process program instructed to be executed, and sequentially instructs the corresponding unit-side computers, as commands, on the operations of each component required for the wafer exposure processing. When each unit-side computer completes one command normally, it sends a status to that effect to the main controller 50, and the main controller 50, on receiving that status, sends the following command to the unit-side computer.
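  • A minimal sketch of this command/status exchange is shown below; the function and method names (run_process_program, unit.execute) are hypothetical and only illustrate the one-command-at-a-time, wait-for-status pattern described above.

        def run_process_program(commands, unit_computers):
            """commands: sequence of (target_unit, command) pairs decoded from the process program."""
            for target_unit, command in commands:
                unit = unit_computers[target_unit]
                status = unit.execute(command)        # hypothetical unit-side API; blocks until done
                if status != "OK":
                    raise RuntimeError(f"unit {target_unit} failed command {command!r}")
                # only after a normal-completion status is the next command sent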
  • FIG. 3 shows a block diagram of the stage control system 90 of the scanning exposure apparatus 10 according to the present embodiment.
  • the stage control system 90 includes a synchronous control unit 80, a wafer stage control system 92, and a reticle stage control system 94.
  • The wafer stage control system 92 controls the wafer stage WST based on the position command value P for the wafer stage WST that is output from the synchronous control unit 80 in response to instructions from the main controller 50. The wafer stage control system 92 includes a subtractor 52 that calculates the position deviation ΔPw, which is the difference between the position command value P and the position of the wafer stage WST, an adder 54 provided at the output stage of the subtractor 52, a wafer stage controller 56 composed of, for example, a PID controller that performs a (proportional + integral + differential) control operation or a PI controller that performs a (proportional + integral) control operation using the output signal of the adder 54 as its operation signal, and a wafer stage system as the controlled object Wp.
  • a first control system that controls the wafer stage WST according to the positional deviation is configured by the wafer stage controller 56.
  • The controlled object Wp is directly driven by the wafer drive system 48, which drives the wafer stage WST; the position of the wafer stage WST is measured by the laser interferometer system 76, and the measured position information is fed back to the subtractor 52 to form a position control loop. Accordingly, the controlled object Wp is substantially a wafer stage system including the wafer drive system 48 and the wafer stage WST; hereinafter, it is referred to as the wafer stage system Wp (its transfer function is Wp).
  • The output terminal of the subtractor 52 constituting the wafer stage control system 92 is connected, via a circuit opening/closing switch SW1, to the input terminal of an ILC controller 58 serving as the first learning controller, and the output terminal of the ILC controller 58 is connected to the first input terminal of the adder 54. The output terminal of the subtractor 52 is also connected to the second input terminal of the adder 54. Therefore, when the switch SW1 is ON, the adder 54 outputs to the wafer stage controller 56 an operation signal obtained by adding the position deviation ΔPw output from the subtractor 52 and the correction value output from the ILC controller 58. The correction value output from the ILC controller 58 is also output to the synchronous control unit 80.
  • the ILC controller 58 will be briefly described.
  • This ILC controller 58 can be configured to include an ILC integrator and an ILC compensator connected to the output stage of the ILC integrator.
  • The ILC integrator is a memory that stores the tracking error; its transfer function is expressed as z^(-1)/(1 - z^(-1)), where z is the z-transform operator.
  • the tracking (following) error that occurs in each iteration is added to the value in the memory each time.
  • the ILC compensator can be configured to include, for example, a learning gain, a filter, and the like.
  • the learning gain K is a parameter used for adjusting the convergence speed and the convergence stability of the learning.
  • The magnitude of the learning gain K is set, where Gc is the transfer function of the ILC compensator and Gp is the closed-loop transfer function of the control target of the ILC controller 58 (the feedback control system excluding the ILC controller 58), so as to satisfy a convergence condition. Here, with Wc denoting the transfer function of the wafer stage controller, Gp = Wc·Wp / (1 + Wc·Wp).
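  • The specific inequality that K must satisfy is not reproduced in the text above. As an assumption only, a condition commonly used for this kind of iterative learning loop can be written as:

        \[
          \bigl|\, 1 - K \, G_c(e^{j\omega}) \, G_p(e^{j\omega}) \,\bigr| < 1
          \quad \text{for all frequencies of interest}, \qquad
          G_p = \frac{W_c W_p}{1 + W_c W_p}.
        \]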
  • the filter is used for fine adjustment of phase characteristics and gain characteristics, noise removal, and the like.
  • As this filter, for example, a combination of an inverse system having a transfer function that is the inverse of the product of the transfer function of the wafer stage controller and the transfer function of the wafer stage system, and a low-pass filter for removing the high-frequency components output from the inverse system, can be used.
  • a negative dead time element may be used to improve the convergence stability of learning.
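  • A minimal Python sketch of the ILC controller structure described above (an integrator that accumulates the tracking error of each iteration, followed by a compensator consisting of a learning gain and a filter) is given below; the class name, the first-order low-pass filter, and the numeric defaults are illustrative assumptions, not taken from the patent.

        class ILCController:
            def __init__(self, n_samples, learning_gain=0.5, lpf_alpha=0.2):
                self.memory = [0.0] * n_samples   # ILC integrator: accumulated tracking error per sample
                self.k = learning_gain            # learning gain K (convergence speed / stability)
                self.alpha = lpf_alpha            # coefficient of a simple first-order low-pass filter

            def update(self, tracking_error):
                """Add this iteration's tracking-error time series (length n_samples) to the memory."""
                for i, e in enumerate(tracking_error):
                    self.memory[i] += e

            def correction(self):
                """Return the correction value group (time series) to apply on the next iteration."""
                filtered, prev = [], 0.0
                for m in self.memory:
                    prev = self.alpha * (self.k * m) + (1.0 - self.alpha) * prev  # gain + low-pass filter
                    filtered.append(prev)
                return filtered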
  • The reticle stage control system 94 controls the position of reticle stage RST based on the position command value P for reticle stage RST that is output from the synchronous control unit 80. The reticle stage control system 94 includes a subtractor 62 that calculates the position deviation, which is the difference between the position command value P and the position of the reticle stage RST, an adder 64 provided at the output stage of the subtractor 62, a reticle stage controller 66 composed of, for example, a PID controller that performs a (proportional + integral + differential) control operation or a PI controller that performs a (proportional + integral) control operation using the output signal of the adder 64 as its operation signal, and a reticle stage system as the controlled object Rp. In the present embodiment, a second control system that controls the reticle stage RST according to its position deviation is configured including the reticle stage controller 66.
  • The reticle stage RST is driven by the reticle drive system 29, the position information of the reticle stage RST is measured by the reticle interferometer 30, and the measured position information is fed back to the subtractor 62 to form a position control loop. Accordingly, the controlled object Rp is substantially a reticle stage system including the reticle drive system 29 and the reticle stage RST; hereinafter, it is referred to as the reticle stage system Rp.
  • The output terminal of the subtractor 62 constituting the reticle stage control system 94 is connected, via a circuit opening/closing switch SW2, to the input terminal of an ILC controller 68 serving as the second learning controller, and the output terminal of the ILC controller 68 is connected to the first input terminal of the adder 64. The output terminal of the subtractor 62 is also connected to the second input terminal of the adder 64. Therefore, when the switch SW2 is ON, the adder 64 outputs to the reticle stage controller 66 an operation signal obtained by adding the position deviation output from the subtractor 62 and the correction value output from the ILC controller 68.
  • the ILC controller 68 is configured similarly to the above-described ILC controller 58.
  • the correction value output from the ILC controller 68 is also output to the synchronous control unit 80.
  • The synchronous control unit 80 is one of the unit-side computers and has various roles, such as the following.
  • the synchronous control unit 80 outputs the position command value P to the subtractor 52 based on the instruction from the main controller 50.
  • For example, when performing control to make the reticle stage RST follow the wafer stage WST, the synchronous control unit 80 converts the position information of the wafer stage WST output from the wafer stage control system 92 (the outputs of the two X-axis interferometers and the two Y-axis interferometers) into a position command value for the reticle stage based on equation (1), in which the 3-row by 4-column matrix of the first term on the right side is a transformation coefficient matrix and the 3-row by 1-column matrix of the second term on the right side is an offset.
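  • Equation (1) itself is not reproduced in the text above; the following Python sketch therefore uses placeholder coefficient values and assumed variable names, and only illustrates the structure of the conversion (a 3x4 transformation matrix applied to the four interferometer outputs, plus a 3x1 offset).

        import numpy as np

        T = np.zeros((3, 4))        # transformation coefficient matrix (placeholder values)
        offset = np.zeros((3, 1))   # offset vector (placeholder values)

        def reticle_command(wx_a, wx_b, wy_a, wy_b):
            """Map the four wafer-stage interferometer outputs to a 3-element reticle-stage command."""
            w = np.array([[wx_a], [wx_b], [wy_a], [wy_b]])   # 4x1 wafer-stage measurement vector
            return T @ w + offset                            # 3x1 position command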
  • Based on instructions from the main controller 50, the synchronous control unit 80 also outputs a position command value P corresponding to the instruction from the main controller 50, for example when moving the reticle stage RST independently of the wafer stage WST.
  • the synchronous control unit 80 turns on and off (ON / OFF) the switches SW1 and SW2 according to the set conditions.
  • When the switch SW1 is ON, the synchronous control unit 80 sequentially fetches the correction values output from the ILC controller 58 at each repetition and stores the sequentially fetched correction value group as time-series data in a predetermined storage area (buffer memory) of an internal memory (not shown). Similarly, when the switch SW2 is ON, the synchronous control unit 80 sequentially fetches the correction values output from the ILC controller 68 at each repetition and stores the sequentially fetched correction value group as time-series data in the corresponding storage area (buffer memory) of the internal memory.
  • Note that buffer memories are prepared in advance in the internal memory of the synchronous control unit 80 for each operating condition, and the acquisition of a correction value group into each buffer memory is performed individually for each of the plurality of operating conditions of the wafer stage WST.
  • When the switch SW1 is OFF, under the same operating condition of the wafer stage WST as before, the synchronous control unit 80 selects the buffer memory that stores the correction value group acquired in advance, and sequentially inputs the correction value group in that buffer memory to the wafer stage control system 92 as correction values of the position deviation via the third input terminal of the adder 54 (see end point e). Further, when the switch SW2 is OFF and control to make the reticle stage RST follow the wafer stage WST is performed, the synchronous control unit 80 sequentially inputs the correction value group acquired in advance by the learning controller 68 to the reticle stage control system 94 as correction values of the position deviation via the third input terminal of the adder 64 (see end point f).
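  • As a hedged sketch of this per-operating-condition bookkeeping (class and condition names are assumptions), one buffer per operating condition is filled while learning is active and replayed sample by sample during later runs under the same condition:

        class CorrectionBuffers:
            def __init__(self):
                self.buffers = {}   # operating condition -> correction value group (time series)

            def record(self, condition, correction_values):
                """Store the correction value group acquired under this operating condition."""
                self.buffers[condition] = list(correction_values)

            def replay(self, condition):
                """Yield stored correction values one per control sample for this condition."""
                for value in self.buffers.get(condition, []):
                    yield value

        # Usage sketch: buffers.record("scan_then_row_step", learned_values)
        # Later, under the same operating condition:
        #     for u in buffers.replay("scan_then_row_step"): ...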
  • reticle stage control unit 33 and wafer stage control unit 78 are indicated by two-dot chain lines.
  • FIG. 4(A) shows, in a plan view, the relationship between the slit-shaped illumination area ST on the wafer (an area conjugate to the illumination area RA on the reticle R, inscribed in the effective field PL' of the projection optical system PL; hereinafter also referred to as the illumination slit ST) and a shot area S as one partitioned area.
  • FIG. 4 (B) shows the relationship between the stage movement time and the stage speed.
  • exposure is performed by moving the shot area S with respect to the illumination slit ST in the direction opposite to the arrow Y, but in FIG. 4A, the stage movement time and the stage movement shown in FIG. Illumination slit ST is shown to move with respect to shot area S to correspond to the speed relation table.
  • Prior to the exposure of a shot area, the center P of the illumination slit ST is positioned at a position away from the end of the shot area S by a predetermined amount, and acceleration of the wafer stage WST is started. The reticle stage RST is started in the opposite direction at an acceleration equal to the acceleration of the wafer stage WST multiplied by the reciprocal of the projection magnification, and synchronous control of the reticle R and the wafer W is started.
  • The time T1 from the start of acceleration of both stages WST and RST to the start of the synchronous control is called the acceleration time. After the synchronous control is started, tracking control of the reticle stage RST with respect to the wafer stage WST is performed until the displacement error between the wafer and the reticle reaches a predetermined relationship, and then exposure is started. The time T2 from the start of the synchronous control to the start of the exposure is called the settling time, and the time (T1 + T2) from the start of acceleration to the start of exposure is called the pre-scan time. With the acceleration denoted by a, the moving distance during the pre-scan can be expressed as (1/2)·a·T1² + a·T1·T2.
  • The exposure time T3, during which the exposure is performed while moving at the constant scanning speed Vscan, is determined by the length L of the shot area in the scanning direction and the width w of the illumination slit ST in the scanning direction, and can be expressed as T3 = (L + w)/Vscan; when this time has elapsed, the transfer of the reticle pattern to the shot area S ends.
  • Since the reticle R is normally scanned alternately (reciprocating scanning) to sequentially expose the next shot area, it is necessary to move the reticle R further beyond the end of the exposure and then return it to the scanning start position for the exposure of the next shot area; at this time, the wafer (wafer stage) is moved in the scanning direction correspondingly to the reticle (reticle stage). The time required for this consists of the constant-speed overscan time (post-settling time) T4 and the deceleration overscan time T5, and the sum (T4 + T5) is the overscan time.
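  • The timing relations above can be checked numerically. The following Python sketch, using made-up stage parameters rather than values from the embodiment, evaluates the pre-scan distance (1/2)·a·T1² + a·T1·T2 and a simple per-shot time budget T1 + T2 + T3 + T4 + T5.

```python
# Example stage parameters (illustrative only, not values from the embodiment).
a      = 10.0      # wafer stage acceleration [m/s^2]
v_scan = 0.5       # constant scanning speed  [m/s]
T2     = 0.010     # settling time [s]
L      = 0.033     # shot area length in the scanning direction [m]
w      = 0.008     # illumination slit width in the scanning direction [m]
T4     = 0.005     # constant-speed overscan (post-settling) time [s]

T1 = v_scan / a                                  # acceleration time to reach v_scan
pre_scan_dist = 0.5 * a * T1**2 + a * T1 * T2    # distance covered before exposure starts
T3 = (L + w) / v_scan                            # exposure time at constant speed
T5 = v_scan / a                                  # deceleration overscan time (symmetric profile assumed)

per_shot = T1 + T2 + T3 + T4 + T5
print(f"pre-scan distance = {pre_scan_dist*1e3:.2f} mm, time per shot = {per_shot*1e3:.1f} ms")
```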
  • Next, the exposure operation will be described mainly with reference to the flowchart of FIG. 5, which shows the processing algorithm of the CPU in the main controller 50.
  • FIG. 6 shows the trajectory along which the center P of the above-mentioned illumination slit ST passes over each shot area. In this trajectory, the solid line portions indicate the path of the center P of the illumination slit ST (hereinafter also referred to as "point P") at the time of exposure of each shot area, the dotted lines indicate the movement locus of the point P between adjacent shot areas in the same row in the non-scanning direction, and the dashed lines indicate the trajectory of the point P between different rows. In practice, the point P is fixed and the wafer W moves; in FIG. 6, however, in order to facilitate the explanation, the point P (the center of the illumination slit ST) is illustrated as moving on the wafer W.
  • As a premise, the main controller 50 performs, via the unit-side computers, preparatory work such as reticle alignment using a reticle alignment system (not shown), the reference mark plate FM on the wafer stage WST, and a wafer alignment detection system (not shown), baseline measurement of the wafer alignment detection system, and wafer alignment (by the EGA method or the like).
  • Then, a counter n indicating the row number to which the shot area to be exposed belongs and a counter m indicating the shot area number within the row are both initialized to 1 (m ← 1, n ← 1).
  • The various setting information includes control information relating to the position control of the reticle stage and the wafer stage described above, for example, the EGA parameters (for example, the scaling of the wafer in the X and Y directions) obtained by the EGA-type wafer alignment performed prior to exposure.
  • In the next step 106, movement of the reticle stage RST and the wafer stage WST is instructed to the synchronous control unit 80. In response, the synchronous control unit 80 instructs the wafer stage control unit 78 to move the wafer W to the scanning start position (acceleration start position) for the exposure of the first shot on the wafer W, and the wafer stage WST is moved to the above-described acceleration start position by the wafer stage control unit 78 via the wafer drive system 48.
  • Then, the synchronous control unit 80 monitors the measured values of the interferometer system 76 and the reticle interferometer 30 and, via the wafer stage control unit 78 and the reticle stage control unit 33 respectively, controls the wafer drive system 48 and the reticle drive system 29 described above, thereby starting relative scanning in the Y-axis direction between the reticle stage RST and the wafer stage WST.
  • main controller 50 waits in step 108 for the acceleration of both stages RST and WST to the target running speed to be completed.
  • the process proceeds to step 110, and the light emission of the light source is started.
  • the synchronization control unit 80 starts the pre-exposure synchronization settling operation of both stages RST and WST.
  • Here, the light emission of the light source is started before the synchronous settling of both stages RST and WST is completed and the exposure is started; however, the main controller 50 controls the movement of the predetermined blades of the movable reticle blind in synchronization with the reticle stage RST based on the measured values of the reticle interferometer 30, so that the unnecessary portion outside the pattern area of the reticle R is not exposed. This is the same as in a normal scanning stepper.
  • main controller 50 waits for the end of the exposure in step 112.
  • When the exposure ends, the determination in step 112 is affirmed, and the flow advances to step 114 to stop the laser beam irradiation.
  • the light emission of the light source itself may be stopped, or a shutter (not shown) in the light source may be closed.
  • In the next step 116, by referring to the counter m, it is determined whether or not the count value m is the number of the last shot area in the n-th row (here, the first row); this determination is made, for example, based on a shot map.
  • If not, the process proceeds to step 118 to increment the counter m by 1, and then proceeds to step 120, where various setting information required for the exposure of the m-th shot area in the n-th row (here, the second shot area S in the first row, i.e., the second shot) is transmitted to the synchronous control unit 80.
  • The transmission of this setting information is performed while the wafer stage WST and the reticle stage RST are performing the constant-speed overscan (post-settling) operation in the scanning direction immediately after the end of the exposure; therefore, the synchronous control unit 80 can receive the transmitted setting information without difficulty and store it in its internal memory. After transmitting the above setting information, main controller 50 instructs the synchronous control unit 80 in step 122 to perform the movement of both stages RST and WST in the first mode (hereinafter abbreviated as "movement in the first mode"), and then returns to step 108 and waits for the completion of acceleration of both stages RST and WST to the target scanning speed.
  • While main controller 50 waits in step 108, the synchronous control unit 80 executes the movement operation in the first mode.
  • the movement operation in the first mode will be described in detail.
  • The movement operation in the first mode is the movement operation of both stages performed between the exposure of one shot area and the exposure of the next shot area in the same row.
  • In FIG. 8, the speed curve Vy1(t) in the scanning direction of the wafer stage WST related to the movement operation in the first mode is shown by a solid line, and the speed curve Vx1(t) in the non-scanning direction is shown by a dotted line; the horizontal axis represents time (t). The reticle stage RST moves according to a time change curve of a speed equal to the speed curve Vy1(t) multiplied by the reciprocal of the projection magnification, so its description is omitted here.
  • In practice, a speed curve indicating the time change of the speed is sequentially calculated by the synchronous control unit 80, a position command value is generated by the synchronous control unit 80 based on that speed curve, and the wafer stage WST is controlled by the wafer stage control unit 78 via the wafer drive system 48 in accordance with this command value; below, however, in order to make the description easier to understand, the explanation is given with reference to the speed curves of FIG. 8.
  • First, the movement in the scanning direction will be considered. As described above, after the exposure of the shot area S ends, the wafer stage WST continues to move in the +Y direction at the constant speed Vscan for the time T4 from the exposure end time (constant-speed overscan), as shown in FIG. 8, and then decelerates. The wafer stage WST is then accelerated in the −Y direction in the manner described above, with the acceleration start point as the time reference, and reaches the target scanning speed for the exposure of the next shot area at the time shown in FIG. 8.
  • Next, the movement in the non-scanning direction (the stepping operation between shot areas) will be considered. As shown in FIG. 8, immediately after the exposure of the shot area S is completed, the wafer stage WST starts to accelerate in the non-scanning direction and moves according to a predetermined speed change curve, so that when the stepping is completed the X coordinate of the wafer stage has changed by Lx (Lx being the stepping length) and the point P has reached the point C (Lx, Cy) in FIG. 7.
  • The timing is set such that, when a predetermined time has elapsed from the end time of the exposure of the previous shot area, the exposure of the next shot area is started, and such that the acceleration/deceleration in the non-scanning direction ends before the acceleration for the next exposure is completed; that is, the wafer stage is controlled so that the X coordinate Bx of the point B (Bx, By) in FIG. 7, which is the point where the acceleration for the next exposure starts, has already been reached. In this manner, the wafer stage control unit 78 and the synchronous control unit 80 control the movement of the wafer stage WST in each of the X and Y directions.
  • During this period, the speed Vx1(t) in the non-scanning direction changes continuously, and the wafer stage WST is always moving in the non-scanning direction; that is, the wafer stage WST performs the stepping operation in parallel with the approach operation in the scanning direction without stopping halfway. Therefore, the movement operation (including the scanning direction and the non-scanning direction) of the wafer stage WST between shot areas can be performed in almost the shortest time, and the throughput can be improved.
  • Since the settling time T2 included in the pre-scan time is a time for causing the reticle R to follow the wafer W completely, acceleration/deceleration control in the non-scanning direction cannot be performed during this time. Therefore, as is clear from FIG. 8, the wafer stage control unit 78 and the synchronous control unit 80 start the movement of the wafer stage WST in the non-scanning direction during the constant-speed overscan time T4 in the scanning direction that follows the end of the exposure, and control the wafer stage WST so that the acceleration/deceleration in the non-scanning direction is completed before the synchronization control in the scanning direction is started.
  • That is, since the stepping in the non-scanning direction is completed before the start of the synchronization control in the scanning direction, the synchronous control unit 80 only has to perform the synchronization control in the scanning direction during the settling time T2. As a result, the synchronous settling time T2 is shortened and, correspondingly, the overall time required between shot areas is reduced.
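  • To make the combined motion concrete, the sketch below builds a simple scan-direction velocity profile that decelerates, reverses, and re-accelerates, together with a non-scanning-direction step profile that starts immediately after exposure and finishes before the scan-direction reversal is complete. The profile shapes and all numbers are illustrative assumptions and are not the curves of FIG. 8.

```python
import numpy as np

dt = 1e-4                      # control sample time [s] (illustrative)
a_y, v_scan = 10.0, 0.5        # scan-direction acceleration [m/s^2] and speed [m/s]
t_rev = 2 * v_scan / a_y       # time to decelerate from +v_scan and re-accelerate to -v_scan
t_step = 0.8 * t_rev           # X step is finished before the scan reversal completes
step_len = 0.026               # stepping length Lx [m]

t = np.arange(0.0, t_rev + 5 * dt, dt)

# Scan direction: linear ramp from +v_scan down through zero to -v_scan.
v_y = np.clip(v_scan - a_y * t, -v_scan, v_scan)

# Non-scan direction: symmetric triangular velocity profile covering step_len in t_step.
v_peak = 2 * step_len / t_step
v_x = np.where(t < t_step / 2, 2 * v_peak * t / t_step,
      np.where(t < t_step, 2 * v_peak * (t_step - t) / t_step, 0.0))

# Position commands are the integrals of the velocity profiles.
y_cmd = np.cumsum(v_y) * dt
x_cmd = np.cumsum(v_x) * dt
print(f"X travelled: {x_cmd[-1]*1e3:.1f} mm of {step_len*1e3:.1f} mm target")
```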
  • As described above, main controller 50 waits in step 108 for the acceleration of both stages RST and WST to the target scanning speed to be completed.
  • the determination in step 108 is affirmed.
  • the processing (including the judgment) in the loop of steps 110 ⁇ 112 ⁇ 114 ⁇ 116 ⁇ 118 ⁇ 120 ⁇ 122 ⁇ 108 is repeated until the judgment in step 116 is affirmed.
  • In this manner, scanning exposure by alternate scanning (reciprocating scanning) is performed sequentially for the shot areas from the second shot area of the n-th row (here, the second shot area of the first row, i.e., the second shot) to the last shot area of the n-th row (here, the first row).
  • the pattern of the reticle R is sequentially transferred to those shot areas.
  • When the scanning exposure for the last shot area of the first row is completed, the determination in step 116 is affirmed, and the process proceeds to step 124. In step 124, the counter m is initialized to 1, and the counter n is incremented by 1 (m ← 1, n ← n + 1).
  • Thereafter, the process returns to step 108 and waits for completion of acceleration of both stages RST and WST to the target scanning speed.
  • the synchronous control unit 80 executes the movement operation in the second mode.
  • the movement operation in the second mode will be described.
  • The movement operation in the second mode is the movement operation of both stages performed, in the movement trajectory of the point P between different rows indicated by the dashed line in FIG. 6, after the exposure of the last shot area in an odd-numbered row (each row being composed of a plurality of shot areas arranged in the non-scanning direction) — referred to as "shot area A" for convenience — and before the start of the exposure of the first shot area in the next row — referred to as "shot area B" for convenience.
  • The speed curve Vy2(t) in the scanning direction of the wafer stage WST related to the movement operation in the second mode is shown by a solid line, and the speed curve Vx2(t) in the non-scanning direction is shown by a dotted line; the horizontal axis represents time (t).
  • Immediately after the exposure of the shot area A is completed, stepping in the Y-axis direction (hereinafter appropriately referred to as the "Y step") is started. The stepping in the Y-axis direction is performed by moving the wafer stage WST in the Y-axis direction according to a speed change curve similar to that of the above-described stepping between shot areas in the non-scanning direction. The wafer stage WST then starts accelerating in the scanning direction for the exposure of the shot area B, by which time the step in the Y-axis direction has been completed.
  • The reticle stage RST may be moved to the scanning start position while the wafer stage WST moves to the deceleration end position after the exposure of the shot area A described above; in this case, it is sufficient that the stage is stopped there until the acceleration before the exposure of the shot area B is started. In the movement operation in the second mode, the post-exposure period after the exposure of the shot area A may be eliminated.
  • As described above, main controller 50 waits in step 108 for the acceleration of both stages RST and WST to be completed.
  • the determination in step 108 is affirmed.
  • the processing (including the judgment) in the loop of steps 110 ⁇ 112 ⁇ 114 ⁇ 116 ⁇ 118 ⁇ 120 ⁇ 122 ⁇ 108 is repeated until the judgment in step 116 is affirmed.
  • In this manner, scanning exposure by alternate scanning is performed sequentially for the shot areas from the first shot area S of the n-th row (in this case, the second row) to the last shot area of that row.
  • the pattern of the reticle R is sequentially transferred to those shot areas.
  • When the determination in step 116 is affirmative, the process proceeds to step 124. In step 124, the counter m is initialized to 1, and the counter n is incremented by 1 (m ← 1, n ← n + 1).
  • In the next step 136, the main controller 50 instructs the synchronous control unit 80 to perform the movement of both stages RST and WST in the third mode (hereinafter abbreviated as "movement in the third mode"), and then waits until both stages RST and WST finish accelerating to the target scanning speed.
  • the synchronous control unit 80 executes the third mode movement operation.
  • the moving operation in the third mode will be described.
  • The movement operation in the third mode is the movement operation of both stages performed, in the movement trajectory of the point P between different rows indicated by the dashed line in FIG. 6, after the exposure of the last shot area in an even-numbered row (referred to as "shot area C" for convenience) and before the start of the exposure of the first shot area in the next row (referred to as "shot area D" for convenience).
  • The speed curve Vy3(t) in the scanning direction of the wafer stage WST related to the movement operation in the third mode is shown by a solid line, and the speed curve Vx3(t) in the non-scanning direction is shown by a dotted line; the horizontal axis represents time (t).
  • In the third mode, since the acceleration condition of the wafer before the scanning exposure and the acceleration condition of the reticle before the scan need to be matched, it is necessary to stop the wafer stage WST not only in the non-scanning direction but also in the scanning direction.
  • As shown by the speed curve Vy3(t), in the sequence of the movement operation of the wafer stage WST between the shot areas C and D in the scanning direction, the wafer stage WST starts to decelerate after the exposure; when the deceleration time has elapsed, the deceleration after the exposure is completed and the wafer stage WST reaches the scanning start position for the exposure of the shot area D, and from that time it stops moving in the scanning direction. Similarly, the reticle stage RST starts to decelerate after the exposure of the shot area C described above and reaches its scanning start position when a predetermined time has elapsed from the start of the deceleration. The wafer stage WST then remains stopped until its acceleration before the exposure of the shot area D is started.
  • On the other hand, the movement operation of the wafer stage WST in the non-scanning direction is started at the time the exposure of the shot area C is completed, as in the above-described movement operation in the first mode; thereafter, the wafer stage WST is maintained at a constant coordinate position in the non-scanning direction.
  • As described above, main controller 50 waits in step 108 for the acceleration of both stages RST and WST to be completed.
  • When the acceleration ends, the determination in step 108 is affirmed. Thereafter, the processing of step 108 and subsequent steps is repeated until the exposure of the shot areas from the first shot area of the third row (shot area S in FIG. 6) to the last shot area S of the last (Nth) row is completed.
  • scan exposure is performed sequentially and alternately along a path as shown in FIG.
  • In this case, the exposure ends with the shot area S at the lower left of FIG. 6.
  • In the exposure apparatus 10 of the present embodiment, iterative learning control is performed in advance for the movement of the wafer stage WST in the movement operations of the first, second, and third modes and for the follow-up control of the reticle stage RST with respect to the wafer stage WST, and the correction value groups obtained as a result of that learning control are used during the actual exposure operation to correct the positions of the wafer stage WST and the reticle stage RST.
  • Specifically, correction value groups for correcting the position error of the wafer stage WST with respect to its target position are acquired in advance by iterative learning control and stored in the corresponding buffer memories.
  • Similarly, correction value groups for correcting the tracking error of the reticle stage RST with respect to the wafer stage WST are acquired and stored in the corresponding buffer memories.
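  • The iterative learning control referred to here can be summarized by the update law u_{k+1}(t) = u_k(t) + γ·e_k(t + Δt): the correction applied on the next repetition is the previous correction plus a learning gain times the tracking error observed (one sample ahead) on the previous repetition. The Python sketch below demonstrates this on a toy first-order plant; the plant model, gains, and reference trajectory are illustrative assumptions and do not represent the dynamics of the actual stages.

```python
import numpy as np

dt, n = 1e-3, 500
ref = np.linspace(0.0, 1.0, n)            # toy position reference (a ramp)
gamma = 300.0                             # ILC learning gain

def run_trial(u_ff):
    """Simulate one repetition of a toy first-order stage under P feedback
    plus the learned feedforward u_ff; return the tracking error e(t)."""
    kp, tau = 20.0, 0.05
    x, err = 0.0, np.zeros(n)
    for i in range(n):
        err[i] = ref[i] - x
        x += dt * (-x / tau + kp * err[i] + u_ff[i])   # plant + feedback + feedforward
    return err

u = np.zeros(n)
for k in range(10):                       # repeat the same motion ten times
    e = run_trial(u)
    u[:-1] += gamma * e[1:]               # ILC update: carry the correction to the next trial
    print(f"trial {k}: rms error = {np.sqrt(np.mean(e**2)):.2e}")
```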
  • First, the switch SW1 in FIG. 3 is set to "ON" and the switch SW2 is set to "OFF" by the synchronous control unit 80. Then, the synchronous control unit 80 outputs a position command in the Y-axis direction of the wafer stage WST corresponding to the speed curve Vy1(t) shown in FIG. 8. In response, the wafer stage control unit 78 controls the wafer stage system Wp, and a positive scan in which the +Y direction of the wafer stage WST is the scanning direction is performed.
  • During this positive scan, the ILC controller 58 sequentially calculates correction values for causing the positional deviation, which is the difference between the target position of the wafer stage WST in the Y-axis direction and its current position, to asymptotically approach zero, and a series of time-series data composed of the sequentially calculated correction value group is stored in the corresponding buffer memory of the internal memory of the synchronous control unit 80. During this operation, a fixed target position, for example (0, 0, 0), is input to the reticle stage control system, so that the reticle stage RST either remains stopped at that position or returns to the home position and then stops at that position.
  • Next, the synchronous control unit 80 outputs a position command corresponding to the speed curve Vy1(t) in FIG. 8 such that the scanning direction of the wafer stage WST is reversed (negative scan). The wafer stage control unit 78 controls the wafer stage system Wp in response to this position command, and the correction value group for the negative scan of the wafer stage WST is stored in the corresponding buffer memory of the internal memory of the synchronous control unit 80 in the same manner as described above.
  • Next, position commands for the wafer stage WST in the Y-axis direction and the X-axis direction are input from the synchronous control unit 80 to the subtractor 52. In accordance with these commands, the wafer stage control unit 78 moves the wafer stage WST along the same movement trajectory (see FIG. 7) as in the continuous exposure of the two shot areas described above. At this time, the respective correction values stored in the corresponding buffer memory of the internal memory are applied to the third input terminal of the adder 54 in synchronization with the sampling clock, so that the position of the wafer stage WST in the Y-axis direction is corrected. In parallel, the ILC controller 58 sequentially calculates correction values for causing the positional deviation, which is the difference between the target position of the wafer stage WST in the X-axis direction and its current position, to asymptotically approach zero, and a series of time-series data composed of the sequentially calculated correction value group is stored in the corresponding buffer memory of the internal memory of the synchronous control unit 80.
  • Next, a position command is input from the synchronous control unit 80 to the subtractor 52 such that the wafer stage WST is moved by the wafer stage control unit 78 along a movement trajectory that reverses the movement trajectory of FIG. 7. During this movement, the synchronous control unit 80 corrects the position of the wafer stage WST in the Y-axis direction in the same manner as described above, while the ILC controller 58 sequentially calculates correction values for causing the positional deviation, which is the difference between the target position of the wafer stage WST in the X-axis direction and its current position, to asymptotically approach zero; a series of time-series data composed of the sequentially calculated correction value group is stored in the corresponding buffer memory of the internal memory of the synchronous control unit 80.
  • Next, the switch SW1 in FIG. 3 is set to "OFF" and the switch SW2 is set to "ON" by the synchronous control unit 80. Then, the synchronous control unit 80 outputs a position command in the Y-axis direction of the wafer stage WST corresponding to the speed curve Vy1(t) shown in FIG. 8. In response, the wafer stage system Wp is controlled by the wafer stage control unit 78, and a positive scan of the wafer stage WST is performed.
  • At this time, the synchronous control unit 80 first applies the corresponding correction values stored in the corresponding buffer memory of the internal memory to the third input terminal of the adder 54 in synchronization with the sampling clock, thereby correcting the position of the wafer stage WST in the Y-axis direction. At the same time, the synchronous control unit 80 outputs the target position (RY1', RY2', RX') of the reticle stage RST obtained by the above formula (1), and the reticle stage system Rp is controlled accordingly, whereby the reticle stage RST performs a scan in the −Y direction that follows the positive scan of the wafer stage WST.
  • During this scan, the ILC controller 68 sequentially calculates correction values for causing the position deviation, which is the difference between the target position of the reticle stage RST in the Y-axis direction and its current position, to asymptotically approach zero, and a series of time-series data composed of the sequentially calculated correction value group is stored in the corresponding buffer memory of the internal memory of the synchronous control unit 80.
  • Next, the synchronous control unit 80 outputs a position command corresponding to the speed curve Vy1(t) in FIG. 8 such that the scanning direction of the wafer stage WST is reversed, and the wafer stage system Wp is controlled by the wafer stage control unit 78 in the same manner as described above; at the same time, the target position of the reticle stage RST is output from the synchronous control unit 80, and the correction value group for the positive scan of the reticle stage RST is stored in the corresponding buffer memory of the internal memory of the synchronous control unit 80 in the same manner as described above.
  • By the above, the iterative learning in the case of the movement operation in the first mode is completed, and the following are stored in the corresponding buffer memories of the internal memory of the synchronous control unit 80: the correction value groups for the position deviation during the scan operation of the wafer stage WST (corresponding to the positive scan and the negative scan), the correction value groups for the position deviation during the stepping operation in the X-axis direction (corresponding to stepping in the −X direction (hereinafter referred to as the "negative direction step") and stepping in the +X direction (hereinafter referred to as the "positive direction step")), and the correction value groups for the tracking error (synchronization error) of the reticle stage RST with respect to the wafer stage WST (corresponding to the positive scan and the negative scan).
  • Iterative learning control is similarly performed for the movement operations in the second mode and the third mode, and the correction value groups corresponding to (the Y step in the positive direction and the X step in the positive direction), (the Y step in the positive direction and the X step in the negative direction), (the Y step in the negative direction and the X step in the positive direction), and (the Y step in the negative direction and the X step in the negative direction) are stored in the corresponding buffer memories of the internal memory of the synchronous control unit 80.
  • At the time of the actual exposure, the movement of each stage in each mode is executed by the wafer stage control system 92 and the reticle stage control system 94 according to the position commands from the synchronous control unit 80; at this time, the switches SW1 and SW2 are both set to the OFF state. Therefore, when any one of the movement operation in the first mode, the movement operation in the second mode, and the movement operation in the third mode is performed, the synchronous control unit 80 reads out from the internal memory each correction value constituting the correction value group corresponding to the movement direction (the positive or negative direction of each of the scan, the Y step, and the X step) of the wafer stage WST in the corresponding mode.
  • That is, at the time of exposure, the ILC controller 58 is disconnected from the wafer stage control system 92 and the ILC controller 68 is disconnected from the reticle stage control system 94; the switches SW1 and SW2 are both turned OFF by the synchronous control unit 80 in accordance with the set conditions.
  • In this state, the processing according to the above-described flowchart is performed. For example, when the movement operation in the first mode is instructed from the main controller 50, the reticle stage RST and the wafer stage WST are synchronously moved by the wafer stage control system 92 and the reticle stage control system 94 based on the position commands of the synchronous control unit 80, and during this synchronous movement the pattern of the reticle R illuminated by the illumination light IL is transferred to each shot area (see step 110 to step 114).
  • During this scanning exposure, the synchronous control unit 80 sequentially inputs to the wafer stage control system 92 the corresponding correction value group obtained in advance by the ILC controller 58 (referred to as the first correction value group) as correction values for the position deviation, and sequentially inputs to the reticle stage control system 94 the corresponding correction value group obtained in advance by the ILC controller 68 (referred to as the second correction value group) as correction values for the position deviation.
  • This allows the wafer stage control system 92 to correct the position of the wafer stage WST using the first correction value group so that the position deviation of the wafer stage WST gradually approaches zero, and allows the reticle stage control system 94 to correct the position of the reticle stage RST using the second correction value group so that the position deviation of the reticle stage RST (the tracking error of the reticle stage RST with respect to the wafer stage WST, that is, the synchronization error between both stages RST and WST) approaches zero.
  • Accordingly, the scanning exposure is performed in a state where the synchronization error between the two stages RST and WST is effectively reduced; and since the first and second correction value groups are obtained by the iterative learning control performed in advance as described above, the synchronization error between the two stages RST and WST that is unique to the exposure apparatus 10 can be reliably reduced. This makes it possible to transfer the pattern formed on the reticle R onto the wafer W with high accuracy.
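  • A minimal sketch of this store-and-replay arrangement is given below, reusing the same toy stage model as the earlier learning sketch: a correction series is first learned while the learning controller is "connected", and is then fed purely as feedforward while the learning controller is "disconnected", corresponding to the switches SW1 and SW2 being OFF during exposure. All signals and gains are illustrative assumptions.

```python
import numpy as np

dt, n = 1e-3, 400
ref = 0.5 * (1.0 - np.cos(np.linspace(0.0, np.pi, n)))   # smooth toy trajectory
kp, tau, gamma = 20.0, 0.05, 300.0

def run(u_ff):
    """One pass of a toy first-order stage under P feedback plus feedforward."""
    x, err = 0.0, np.zeros(n)
    for i in range(n):
        err[i] = ref[i] - x
        x += dt * (-x / tau + kp * err[i] + u_ff[i])
    return err

# Learning phase (learning controller "connected"): build the correction series.
u = np.zeros(n)
for _ in range(20):
    e = run(u)
    u[:-1] += gamma * e[1:]

# Exposure phase (learning controller "disconnected"): only replay the stored series.
e_plain  = run(np.zeros(n))   # no stored corrections
e_replay = run(u)             # stored corrections injected as feedforward only
print(f"rms error without replay: {np.sqrt(np.mean(e_plain**2)):.2e}")
print(f"rms error with replay   : {np.sqrt(np.mean(e_replay**2)):.2e}")
```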
  • In the present embodiment, during the operation of acquiring the correction value group related to the movement operation in the first mode, which is performed prior to the actual exposure operation as described in procedure B. above, the wafer stage WST is moved along a predetermined path (see the U-shaped movement trajectory shown in FIG. 7) that extends both in the scanning direction and in the non-scanning direction perpendicular to it, in the same manner as in the movement operation between shot areas performed between the exposure of one shot area and the exposure of the next shot area. Iterative learning control is performed by the synchronous control unit 80 and the wafer stage control unit 78 to obtain a correction value group (a correction value group relating to the X step) for causing the position deviation, which is the difference between the target position of the wafer stage WST in the non-scanning direction (X-axis direction) and its current position, to asymptotically approach zero.
  • The correction value group for the step obtained in this way differs from a correction value group obtained by iterative learning control in which the wafer stage WST is moved only in the non-scanning direction: the influence of the movement of the wafer stage WST in the scanning direction, that is, the actual movement (stepping) between shot areas of the wafer stage WST, is taken into account when the correction value group relating to the X step is obtained.
  • At the time of the actual exposure, the synchronous control unit 80 performs the actual movement between shot areas while correcting the position of the wafer stage WST in the non-scanning direction using the correction value group relating to the X step, and the reticle stage RST and the wafer stage WST are synchronously moved in the scanning direction for the scanning exposure before and after the movement between shot areas, whereby the reticle pattern is transferred to each shot area on the wafer W. In this case, when the movement operation between the shot areas of the wafer stage WST is completed and the scanning exposure for the next shot area is started, the positional deviation of the wafer stage WST in the non-scanning direction (X-axis direction) caused by the movement operation between the shot areas has almost certainly been corrected.
  • Also, at the time of the actual exposure, the synchronous control unit 80 corrects the position of the wafer stage WST in the scanning direction using the correction value group relating to the scan, which causes the position deviation, i.e., the difference between the target position of the wafer stage WST in the scanning direction and its current position, to asymptotically approach zero. Since the correction value group relating to the step takes into account the influence of the movement of the wafer stage WST in the scanning direction on the movement in the non-scanning direction, the influence of the position error in the scanning direction on the X step is removed.
  • The correction value group relating to the scan could also be obtained in advance by an experiment (including simulation) or the like; in the present embodiment, however, it is obtained by iterative learning control while moving the wafer stage WST in the scanning direction. For this reason, the influence of the position error in the scanning direction on the step during the stepping between shot areas of the wafer stage WST, which is specific to the exposure apparatus 10, can be reliably removed.
  • As described above, in the present embodiment, the stage control system 90 that controls the wafer stage WST and the reticle stage RST (more precisely, the synchronous control unit 80 constituting the stage control system 90) corrects the position of the wafer stage WST based on the correction value group corresponding to the operating condition (for example, the above-mentioned first to third modes and the positive and negative directions of each of the scan, the Y step, and the X step).
  • That is, a plurality of operating conditions are set as the operating conditions of the wafer stage WST, and for each of the plurality of operating conditions a correction value group for causing the position deviation, which is the difference between the target position of the wafer stage WST and its current position, to asymptotically approach zero is obtained in advance by iterative learning control. The synchronous control unit 80 constituting the stage control system 90 then controls the wafer stage WST in accordance with the operating condition at that time (specified (set) by the main controller 50) while selecting the corresponding correction value group from the plurality of correction value groups and correcting the position of the wafer stage WST based on it.
  • Further, in the present embodiment, the stage control system 90 that controls the reticle stage RST and the wafer stage WST starts the acceleration of both stages RST and WST in the scanning direction, between the exposure of one shot area and the exposure of the next shot area, only after a certain period of time has elapsed from the end of the exposure of the one shot area.
  • Specifically, in the movement operation in the second mode, the synchronous control unit 80 constituting the stage control system 90 manages the lapse of a fixed time (T + T + T + Δ1) from the end of the exposure of the shot area A, and when that fixed time has elapsed, both stages RST and WST start accelerating. Similarly, in the movement operation in the third mode, the synchronous control unit 80 constituting the stage control system 90 waits for a predetermined time (T + T + T + Δ2) from the exposure end time of the shot area C before starting the acceleration.
  • In this way, the vibration generated when both stages are decelerated after the exposure of a shot area is attenuated while the stages are stopped, and the reproducibility of the attenuation of that vibration can be increased. As a result, substantially the same vibration state (within an allowable level) is reproduced each time the wafer stage WST and the reticle stage RST repeat the operation. Since iterative learning control is effective against reproducible phenomena such as this vibration, increasing the reproducibility in this way helps the correction value groups obtained in advance remain effective during the actual exposure.
  • In the above embodiment, before acquiring the correction value group relating to the movement operation in the first mode, the synchronous control unit 80 turns the switch SW1 "ON" to connect the ILC controller 58 to the wafer stage control system 92. With the switch SW2 set to "OFF" and the reticle stage RST stopped at a predetermined position, iterative learning control is performed by the wafer stage control unit 78 together with the movement operation of the wafer stage WST in the first mode, and the correction value group relating to the movement operation of the wafer stage WST in the first mode is stored in the corresponding buffer memory of the internal memory of the synchronous control unit 80 (see procedure A. above).
  • Then, in procedure C., the synchronous control unit 80 turns off the switch SW1 and turns on the switch SW2; that is, the ILC controller 58 is disconnected from the wafer stage control system 92 and the ILC controller 68 is connected to the reticle stage control system 94. In this state, the reticle stage RST and the wafer stage WST are synchronously moved by the stage control system 90. At this time, the data constituting the corresponding correction value group obtained in advance by the ILC controller 58 are sequentially input by the synchronous control unit 80 to the wafer stage control system 92 as correction values for the position deviation, and the wafer stage control system 92 corrects the position of the wafer stage WST using that correction value group so that the position deviation of the wafer stage WST approaches zero. In parallel, the correction value group acquired by the learning control of the ILC controller 68 is stored by the synchronous control unit 80.
  • Alternatively, the synchronous control unit 80 may simultaneously turn on the switches SW1 and SW2, connecting the ILC controller 58 to the wafer stage control system 92 and the ILC controller 68 to the reticle stage control system 94. In this state, the stage control system 90 may synchronously move the reticle stage RST and the wafer stage WST, that is, perform tracking control of the reticle stage RST with respect to the wafer stage WST.
  • In this case, during the follow-up control of the reticle stage RST, the synchronous control unit 80 stores in the corresponding buffer memories of the internal memory both the correction value group for correcting the position deviation of the wafer stage WST acquired by the iterative learning control of the ILC controller 58 and the correction value group for correcting the position deviation of the reticle stage RST obtained by the iterative learning control of the ILC controller 68. That is, the correction value group for correcting the position deviation of the wafer stage WST and the correction value group for correcting the position deviation of the reticle stage RST may be obtained at one time.
  • The correction value groups thus obtained are used at the time of exposure in the same manner as in the above embodiment, so that, in the same manner as in the above embodiment, the positions of both stages can be corrected such that the position deviation of the wafer stage WST and the position deviation of the reticle stage RST (the following error with respect to the wafer stage WST, that is, the synchronization error between both stages) approach zero.
  • In the above embodiment, the case has been described in which correction value groups individually corresponding to the movement operations in the first to third modes (a plurality of operation sequences of the wafer stage WST) are acquired in advance by the above-described iterative learning control, and the stage control system 90 that controls both stages WST and RST corrects the position of the wafer stage WST based on the correction value group, selected from the plurality of correction value groups for causing the position deviation (the difference between the target position of the wafer stage WST and its current position) to asymptotically approach zero, that corresponds to the operating condition.
  • However, the method of classifying the operating conditions for which the correction value groups are prepared is not limited to this; the plurality of correction value groups may include correction value groups individually corresponding to the operation pattern of the wafer stage when continuously exposing a plurality of shot areas in the same row and to the operation pattern of the wafer stage when continuously exposing a plurality of shot areas in different rows.
  • Since the main point is to correct the position of the wafer stage based on the correction value group corresponding to the operating condition, the correction value group may also be obtained in advance for each operating condition by a simulation or the like rather than by iterative learning control.
  • In the above embodiment, the iterative learning control is performed by moving the stages before performing exposure on the wafer; however, the above-described learning control may also be performed while exposure is being performed, and the correction value groups obtained at that time may be stored.
  • The correction value groups obtained by the iterative learning control can, for example, be individually acquired before shipment of each exposure apparatus as information unique to that exposure apparatus and stored in the synchronous control unit; alternatively, the learning control may be performed again at a later time and the correction value groups obtained at that time may be stored.
  • a stage composed of a coarse movement stage and a fine movement stage (hereinafter, referred to as a “coarse / fine movement type stage”) is known.
  • the fine movement stage holds the reticle or the wafer and can be moved only with a relatively short stroke, but its position controllability (including positioning accuracy) is configured to be highly accurate and responsive.
  • the coarse movement stage is configured so that the fine movement stage can be moved over a relatively long distance.
  • the present invention can also be applied to an exposure apparatus having such a coarse / fine movement stage.
  • In this case, the control systems for the coarse movement stage and the fine movement stage can both be configured as feedback control systems based on position command values, and the present invention may be applied to the control system of the fine movement stage that holds the reticle or the wafer, which is the target of the final position control (including positioning).
  • For example, the control system of the reticle stage described in the above embodiment can be used as the control system of the reticle fine movement stage; the reticle fine movement stage may then be subjected to feedback control of its position independently of the wafer stage instead of follow-up control with respect to the wafer stage. Alternatively, the reticle stage control system described in the above embodiment may be used as the control system of the reticle fine movement stage and the wafer stage control system described in the above embodiment may be used as the control system of the wafer fine movement stage; in this case, the reticle fine movement stage is set so that tracking control with respect to the wafer fine movement stage is performed.
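  • As a generic illustration of how such a coarse/fine arrangement is often organized (stated here as an assumption, not as the configuration of the embodiment), the sketch below gives the fine stage a high-bandwidth position loop that receives the position command, while the coarse stage simply follows the fine stage so that the fine stage stays within its short stroke.

```python
import numpy as np

dt, n = 1e-4, 20000
cmd = np.where(np.arange(n) * dt > 0.2, 1.0e-3, 0.0)   # 1 mm step command at t = 0.2 s

x_fine = x_coarse = 0.0           # absolute positions [m]
kp_fine, kp_coarse = 400.0, 20.0  # fine loop is much faster than the coarse loop
rel = np.zeros(n)

for i in range(n):
    # Fine stage: track the position command directly (short stroke, high bandwidth).
    x_fine += dt * kp_fine * (cmd[i] - x_fine)
    # Coarse stage: follow the fine stage so the fine stage returns toward stroke center.
    x_coarse += dt * kp_coarse * (x_fine - x_coarse)
    rel[i] = x_fine - x_coarse    # displacement the fine stage must absorb

print(f"final position error    : {abs(cmd[-1] - x_fine):.2e} m")
print(f"final fine-stage offset : {abs(rel[-1])*1e6:.3f} um")
print(f"peak fine-stage offset  : {np.max(np.abs(rel))*1e3:.3f} mm")
```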
  • In the above embodiment, ultraviolet light having a wavelength of 100 nm or more, such as KrF excimer laser light, ArF excimer laser light, or F2 laser light (wavelength 157 nm), is used as the exposure illumination light.
  • the present invention is not limited to this.
  • For example, emission lines such as the g-line and the i-line, or far ultraviolet (DUV) light such as KrF excimer laser light, can also be used.
  • a harmonic of a YAG laser may be used.
  • It is also possible to use a harmonic obtained by amplifying a single-wavelength laser in the infrared or visible region, oscillated from a DFB semiconductor laser or a fiber laser, with a fiber amplifier doped with, for example, erbium (or both erbium and ytterbium), and converting the wavelength to ultraviolet light using a nonlinear optical crystal.
  • As the single-wavelength oscillation laser, for example, an ytterbium-doped fiber laser can be used.
  • the illumination light for exposure is not limited to light having a wavelength of 100 nm or more, and of course light having a wavelength of less than 100 nm may be used.
  • For example, an EUV (Extreme Ultraviolet) exposure apparatus is being developed that uses EUV light in the soft X-ray region (for example, a wavelength region of 5 to 15 nm) as the exposure illumination light, together with an all-reflection reduction optical system designed for that exposure wavelength (for example, 13.5 nm) and a reflective mask.
  • In such an apparatus, a configuration in which scanning exposure is performed by synchronously scanning the mask and the wafer using arc-shaped illumination is conceivable, so such an apparatus is also included in the scope of application of the present invention.
  • In addition, the present invention can also be applied to an exposure apparatus that uses a charged particle beam such as an electron beam or an ion beam.
  • In a mask projection type electron beam exposure apparatus, for example, a circuit pattern is divided and formed as a large number of subfields of about 250 nm square separated from each other on a mask; the electron beam is sequentially deflected in a first direction on the mask and, in synchronism with the movement of the mask in a second direction perpendicular to the first direction, the wafer is moved relative to the electron optical system that reduces and projects the divided patterns, so that the reduced images of the divided patterns are joined on the wafer to form a composite pattern.
  • In the above embodiment, the case where the present invention is applied to a step-and-scan type reduction projection exposure apparatus (scanning stepper) has been described; however, the present invention can also be applied to, for example, a mirror projection aligner, a proximity type exposure apparatus (for example, an X-ray exposure apparatus), or a scanning type X-ray exposure apparatus that moves a mask and a wafer integrally relative to an arcuate illumination area.
  • the projection optical system not only a reduction system but also an equal magnification system or an enlargement system (for example, an exposure apparatus for manufacturing a liquid crystal display) may be used.
  • the projection optical system may be any one of a refraction system, a reflection system, and a catadioptric system.
  • The types of glass materials and coating materials that can be used for the optical elements (especially the refractive elements) are limited by the wavelength of the illumination light for exposure, and the maximum aperture that can be manufactured differs for each glass material; therefore, one of the refractive, reflective, and catadioptric systems is selected according to the exposure wavelength and the required specifications.
  • When the exposure wavelength is about 190 nm or more, synthetic quartz and fluorite can be used as the glass material, so that not only a reflective system or a catadioptric system but also a refractive system can be adopted relatively easily. At shorter wavelengths, a refractive system can still be used provided the wavelength width is sufficiently narrowed; however, since there is no appropriate glass material other than fluorite and narrowing the wavelength band becomes difficult, it is advantageous to employ a reflective system or a catadioptric system.
  • In an electron beam exposure apparatus, an electron optical system composed of an electron lens and a deflector is used. In addition, the optical path through which the illumination light passes is filled with a gas that reduces attenuation of the illumination light (for example, an inert gas such as nitrogen or helium), or, in the case of EUV light or an electron beam, the optical path is evacuated to a vacuum.
  • Note that a part of the present invention (at least the part that does not include a mask as a constituent requirement) can also be applied to an exposure apparatus that exposes a predetermined pattern directly onto a wafer without using a mask (reticle).
  • The present invention is not limited to an exposure apparatus used for manufacturing semiconductor elements; it can also be widely applied to, for example, an exposure apparatus for liquid crystal displays that transfers a liquid crystal display element pattern onto a square glass plate, and to exposure apparatuses used for manufacturing plasma displays, organic EL displays, and the like.
  • The present invention is also applicable not only to exposure apparatuses for manufacturing microdevices such as semiconductor elements, but also to an exposure apparatus that transfers a circuit pattern onto a glass substrate, a silicon wafer, or the like in order to manufacture the reticles or masks used in light exposure apparatuses, EUV exposure apparatuses, proximity type X-ray exposure apparatuses, electron beam exposure apparatuses, and the like.
  • Here, a transmissive reticle is generally used in exposure apparatuses that use ultraviolet light, and quartz glass, fluorine-doped quartz glass, fluorite, quartz crystal, or the like is used as the reticle substrate. A reflective mask is used in an EUV exposure apparatus, and a transmissive mask (stencil mask, membrane mask) is used in a proximity type X-ray exposure apparatus or a mask projection type electron beam exposure apparatus, with a silicon wafer or the like used as the mask substrate.
  • FIG. 11 shows a flowchart of an example of manufacturing devices (semiconductor chips such as ICs and LSIs, liquid crystal panels, CCDs, thin-film magnetic heads, micromachines, etc.).
  • First, in step 201 (design step), function/performance design of the device (for example, circuit design of a semiconductor device) is performed, and pattern design for realizing the function is performed. In step 202 (mask manufacturing step), a mask on which the designed pattern is formed is manufactured. In step 203 (wafer manufacturing step), a wafer is manufactured using a material such as silicon.
  • In step 204 (wafer processing step), the actual circuits and the like are formed on the wafer by lithography or the like, as described later, using the mask and the wafer prepared in steps 201 to 203. In step 205 (device assembly step), the wafer processed in step 204 is assembled into a device; this step 205 includes processes such as a dicing process, a bonding process, and a packaging process (chip sealing) as necessary.
  • Finally, in step 206 (inspection step), inspections such as an operation confirmation test and an endurance test of the device created in step 205 are performed. After these steps, the device is completed and shipped.
  • FIG. 12 shows an example of the detailed flow of step 204 in the case of a semiconductor device. In step 211 (oxidation step), the surface of the wafer is oxidized. In step 212 (CVD step), an insulating film is formed on the wafer surface. In step 213 (electrode formation step), electrodes are formed on the wafer by vapor deposition. In step 214 (ion implantation step), ions are implanted into the wafer. Steps 211 to 214 constitute the pre-processing steps of each stage of wafer processing and are selected and executed according to the processing required in each stage.
  • When the above pre-processing steps are completed in each stage of the wafer processing, a post-processing step is executed as follows. In step 215 (resist formation step), a photosensitive agent is applied to the wafer. In step 216 (exposure step), the circuit pattern of the mask is transferred onto the wafer. In step 217 (development step), the exposed wafer is developed. In step 218 (etching step), the exposed portions other than the portions where the resist remains are removed by etching. In step 219 (resist removal step), the resist that has become unnecessary after etching is removed. By repeating these pre-processing and post-processing steps, multiple circuit patterns are formed on the wafer.
  • Since the exposure apparatus and the exposure method of the above embodiment are used in the exposure step (step 216), scanning exposure is performed with the synchronization error between the reticle and the wafer reduced as much as possible, whereby the pattern of the reticle can be transferred onto each shot area on the wafer W with high accuracy and with high throughput. Therefore, according to the device manufacturing method of the present embodiment, it is possible to improve the productivity (including the yield) of highly integrated devices.
  • the exposure method and exposure apparatus of the present invention are suitable for transferring a mask pattern onto a photosensitive object. Further, the device manufacturing method of the present invention is suitable for manufacturing electronic devices such as semiconductor elements and liquid crystal display elements.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Exposure And Positioning Against Photoresist Photosensitive Materials (AREA)
  • Exposure Of Semiconductors, Excluding Electron Or Ion Beam Exposure (AREA)

Abstract

An exposure apparatus includes: a first control system (92) having a controller (56) for controlling a first stage according to its position deviation and an ILC controller (58) for acquiring a correction value group for causing the position deviation to gradually approach zero by iterative learning; and a second control system (94) having a controller (66) for controlling a second stage according to its position deviation and an ILC controller (68) for acquiring a correction value group for causing the position deviation to gradually approach zero by iterative learning. A command value based on the actual position of the first stage is given when the two stages are moved synchronously. The apparatus also includes a control unit (80) for successively storing a correction value group acquired by the ILC controller connected to the corresponding control system, and for successively inputting the correction value group acquired in advance by the ILC controller, as a correction value for the position deviation, to a control system to which the corresponding ILC controller is not connected.
PCT/JP2004/014727 2003-10-10 2004-10-06 Methode d'exposition, dispositif d'exposition et son procede de fabrication Ceased WO2005036620A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005514576A JPWO2005036620A1 (ja) 2003-10-10 2004-10-06 露光方法及び露光装置、並びにデバイス製造方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2003-352912 2003-10-10
JP2003352912 2003-10-10

Publications (1)

Publication Number Publication Date
WO2005036620A1 true WO2005036620A1 (fr) 2005-04-21

Family

ID=34431135

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2004/014727 Ceased WO2005036620A1 (fr) 2003-10-10 2004-10-06 Methode d'exposition, dispositif d'exposition et son procede de fabrication

Country Status (2)

Country Link
JP (1) JPWO2005036620A1 (fr)
WO (1) WO2005036620A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10270343A (ja) * 1997-03-24 1998-10-09 Nikon Corp Stage control method and scanning exposure apparatus
JP2000260696A (ja) * 1999-03-09 2000-09-22 Canon Inc Stage control method, exposure method, exposure apparatus, and device manufacturing method
JP2000347741A (ja) * 1999-06-07 2000-12-15 Nikon Corp Stage control device, stage apparatus, and exposure apparatus
JP2001230183A (ja) * 2000-02-16 2001-08-24 Nikon Corp Scanning exposure apparatus, scanning exposure method, and device manufacturing method
JP2002025886A (ja) * 2000-07-03 2002-01-25 Canon Inc Step-and-scan projection exposure apparatus, maintenance method therefor, and semiconductor device manufacturing method and semiconductor manufacturing factory using the apparatus
JP2003264134A (ja) * 2002-03-08 2003-09-19 Nikon Corp Stage control device, exposure apparatus, and device manufacturing method
JP2003264133A (ja) * 2002-03-08 2003-09-19 Nikon Corp Stage control device, exposure apparatus, device manufacturing method, and stage control method

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8675177B2 (en) 2003-04-09 2014-03-18 Nikon Corporation Exposure method and apparatus, and method for fabricating device with light amount distribution having light larger in first and second pairs of areas
US9885959B2 (en) 2003-04-09 2018-02-06 Nikon Corporation Illumination optical apparatus having deflecting member, lens, polarization member to set polarization in circumference direction, and optical integrator
US9678437B2 (en) 2003-04-09 2017-06-13 Nikon Corporation Illumination optical apparatus having distribution changing member to change light amount and polarization member to set polarization in circumference direction
US9164393B2 (en) 2003-04-09 2015-10-20 Nikon Corporation Exposure method and apparatus, and method for fabricating device with light amount distribution having light larger in four areas
US9146474B2 (en) 2003-04-09 2015-09-29 Nikon Corporation Exposure method and apparatus, and method for fabricating device with light amount distribution having light larger and different linear polarization states in an on-axis area and a plurality of off-axis areas
US9760014B2 (en) 2003-10-28 2017-09-12 Nikon Corporation Illumination optical apparatus and projection exposure apparatus
US9140993B2 (en) 2003-10-28 2015-09-22 Nikon Corporation Illumination optical apparatus and projection exposure apparatus
US9423698B2 (en) 2003-10-28 2016-08-23 Nikon Corporation Illumination optical apparatus and projection exposure apparatus
US9423697B2 (en) 2003-10-28 2016-08-23 Nikon Corporation Illumination optical apparatus and projection exposure apparatus
US9140992B2 (en) 2003-10-28 2015-09-22 Nikon Corporation Illumination optical apparatus and projection exposure apparatus
US9244359B2 (en) 2003-10-28 2016-01-26 Nikon Corporation Illumination optical apparatus and projection exposure apparatus
US9146476B2 (en) 2003-10-28 2015-09-29 Nikon Corporation Illumination optical apparatus and projection exposure apparatus
US9885872B2 (en) 2003-11-20 2018-02-06 Nikon Corporation Illumination optical apparatus, exposure apparatus, and exposure method with optical integrator and polarization member that changes polarization state of light
US10281632B2 (en) 2003-11-20 2019-05-07 Nikon Corporation Illumination optical apparatus, exposure apparatus, and exposure method with optical member with optical rotatory power to rotate linear polarization direction
US9164209B2 (en) 2003-11-20 2015-10-20 Nikon Corporation Illumination optical apparatus, exposure apparatus, and exposure method with optical member with optical rotatory power having different thicknesses to rotate linear polarization direction
US10241417B2 (en) 2004-02-06 2019-03-26 Nikon Corporation Polarization-modulating element, illumination optical apparatus, exposure apparatus, and exposure method
US9429848B2 (en) 2004-02-06 2016-08-30 Nikon Corporation Polarization-modulating element, illumination optical apparatus, exposure apparatus, and exposure method
US10007194B2 (en) 2004-02-06 2018-06-26 Nikon Corporation Polarization-modulating element, illumination optical apparatus, exposure apparatus, and exposure method
US9423694B2 (en) 2004-02-06 2016-08-23 Nikon Corporation Polarization-modulating element, illumination optical apparatus, exposure apparatus, and exposure method
US9140990B2 (en) 2004-02-06 2015-09-22 Nikon Corporation Polarization-modulating element, illumination optical apparatus, exposure apparatus, and exposure method
US20130271945A1 (en) 2004-02-06 2013-10-17 Nikon Corporation Polarization-modulating element, illumination optical apparatus, exposure apparatus, and exposure method
US10234770B2 (en) 2004-02-06 2019-03-19 Nikon Corporation Polarization-modulating element, illumination optical apparatus, exposure apparatus, and exposure method
US9429851B2 (en) 2005-05-12 2016-08-30 Nikon Corporation Projection optical system, exposure apparatus, and exposure method
US8854601B2 (en) 2005-05-12 2014-10-07 Nikon Corporation Projection optical system, exposure apparatus, and exposure method
US9310696B2 (en) 2005-05-12 2016-04-12 Nikon Corporation Projection optical system, exposure apparatus, and exposure method
US9360763B2 (en) 2005-05-12 2016-06-07 Nikon Corporation Projection optical system, exposure apparatus, and exposure method
US9891539B2 (en) 2005-05-12 2018-02-13 Nikon Corporation Projection optical system, exposure apparatus, and exposure method
JP2008060458A (ja) * 2006-09-01 2008-03-13 Nikon Corp Exposure apparatus and exposure method
US8140288B2 (en) 2007-04-18 2012-03-20 Nikon Corporation On-machine methods for identifying and compensating force-ripple and side-forces produced by actuators on a multiple-axis stage
US8451427B2 (en) 2007-09-14 2013-05-28 Nikon Corporation Illumination optical system, exposure apparatus, optical element and manufacturing method thereof, and device manufacturing method
US9057963B2 (en) 2007-09-14 2015-06-16 Nikon Corporation Illumination optical system, exposure apparatus, optical element and manufacturing method thereof, and device manufacturing method
US9366970B2 (en) 2007-09-14 2016-06-14 Nikon Corporation Illumination optical system, exposure apparatus, optical element and manufacturing method thereof, and device manufacturing method
US9097981B2 (en) 2007-10-12 2015-08-04 Nikon Corporation Illumination optical apparatus, exposure apparatus, and device manufacturing method
US10101666B2 (en) 2007-10-12 2018-10-16 Nikon Corporation Illumination optical apparatus, exposure apparatus, and device manufacturing method
US8508717B2 (en) 2007-10-16 2013-08-13 Nikon Corporation Illumination optical system, exposure apparatus, and device manufacturing method
US8520291B2 (en) 2007-10-16 2013-08-27 Nikon Corporation Illumination optical system, exposure apparatus, and device manufacturing method
US8462317B2 (en) 2007-10-16 2013-06-11 Nikon Corporation Illumination optical system, exposure apparatus, and device manufacturing method
US9057877B2 (en) 2007-10-24 2015-06-16 Nikon Corporation Optical unit, illumination optical apparatus, exposure apparatus, and device manufacturing method
US8379187B2 (en) 2007-10-24 2013-02-19 Nikon Corporation Optical unit, illumination optical apparatus, exposure apparatus, and device manufacturing method
US9857599B2 (en) 2007-10-24 2018-01-02 Nikon Corporation Optical unit, illumination optical apparatus, exposure apparatus, and device manufacturing method
US9341954B2 (en) 2007-10-24 2016-05-17 Nikon Corporation Optical unit, illumination optical apparatus, exposure apparatus, and device manufacturing method
US9116346B2 (en) 2007-11-06 2015-08-25 Nikon Corporation Illumination apparatus, illumination method, exposure apparatus, and device manufacturing method
US9678332B2 (en) 2007-11-06 2017-06-13 Nikon Corporation Illumination apparatus, illumination method, exposure apparatus, and device manufacturing method
US8456624B2 (en) 2008-05-28 2013-06-04 Nikon Corporation Inspection device and inspecting method for spatial light modulator, illumination optical system, method for adjusting the illumination optical system, exposure apparatus, and device manufacturing method
US8446579B2 (en) 2008-05-28 2013-05-21 Nikon Corporation Inspection device and inspecting method for spatial light modulator, illumination optical system, method for adjusting the illumination optical system, exposure apparatus, and device manufacturing method
US8649885B2 (en) 2008-11-25 2014-02-11 Nikon Corporation Frequency selective iterative learning control system and method for controlling errors in stage movement
JP2010181955A (ja) * 2009-02-03 2010-08-19 Canon Inc Control device
US10061214B2 (en) 2009-11-09 2018-08-28 Nikon Corporation Exposure apparatus, exposure method, exposure apparatus maintenance method, exposure apparatus adjustment method and device manufacturing method
JP2022188045A (ja) * 2018-07-04 2022-12-20 Canon Inc. Control device, exposure apparatus, and article manufacturing method
JP7371190B2 (ja) 2018-07-04 2023-10-30 Canon Inc. Control device, exposure apparatus, and article manufacturing method

Also Published As

Publication number Publication date
JPWO2005036620A1 (ja) 2006-12-28

Similar Documents

Publication Publication Date Title
WO2005036620A1 (fr) Exposure method, exposure apparatus, and device manufacturing method
KR101060982B1 (ko) Exposure method and device manufacturing method, exposure apparatus, and computer-readable recording medium on which a program is recorded
JP2004072076A (ja) Exposure apparatus and stage apparatus, and device manufacturing method
WO2007123189A1 (fr) Exposure apparatus, exposure method, and device manufacturing method
JP2008028392A (ja) Lithographic apparatus and device manufacturing method
US6483571B1 (en) Exposure apparatus and method for transferring a pattern from a plurality of masks onto at least one substrate
US20040157143A1 (en) Exposure method and lithography system, exposure apparatus and method of making the apparatus, and method of manufacturing device
US20090002659A1 (en) Stage apparatus, exposure apparatus, and method of manufacturing device
JP2003068622A (ja) Exposure apparatus, control method therefor, and device manufacturing method
JP2001345243A (ja) Evaluation method, position detection method, exposure method, and device manufacturing method
JP2001338860A (ja) Exposure method and device manufacturing method
JP2005294473A (ja) Exposure apparatus, device manufacturing method, and device
JP2005322720A (ja) Stage control apparatus and method, exposure apparatus and method, and device manufacturing method
JP5045927B2 (ja) Exposure method, exposure apparatus, and device manufacturing method
US6630986B2 (en) Scanning type exposure apparatus and a device manufacturing method using the same
WO2016159200A1 (fr) Exposure apparatus, method for manufacturing flat-panel display, device manufacturing method, and exposure method
WO1999040487A1 (fr) Stage control method, stage using the method, and exposure apparatus
JP2003059806A (ja) Stage driving method, exposure method and exposure apparatus, and device manufacturing method
JP2001135559A (ja) Position measurement method and exposure method
JP6610719B2 (ja) Drive system and drive method, exposure apparatus and exposure method, and device manufacturing method
JP2005123220A (ja) Stage control method, exposure method, stage control apparatus and exposure apparatus, and device manufacturing method
JP2003197504A (ja) Exposure method and device manufacturing method
JP4429296B2 (ja) Lithographic apparatus, projection apparatus, and device manufacturing method
JP2006024674A (ja) Stage control apparatus and method, exposure apparatus and method, and device manufacturing method
JP2005166951A (ja) Exposure method, exposure apparatus, and lithography system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
WWE WIPO information: entry into national phase

Ref document number: 2005514576

Country of ref document: JP

122 EP: PCT application non-entry in European phase