EP4339682A1 - Telescope with at least one viewing channel - Google Patents
- Publication number
- EP4339682A1 (application EP23197401.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- telescope
- camera
- image
- display
- viewing channel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2476—Non-optical details, e.g. housings, mountings, supports
- G02B23/2484—Arrangements in relation to a camera or imaging device
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/02—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices involving prisms or mirrors
- G02B23/10—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices involving prisms or mirrors reflecting into the field of view additional indications, e.g. from collimator
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/02—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices involving prisms or mirrors
- G02B23/04—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices involving prisms or mirrors for the purpose of beam splitting or combining, e.g. fitted with eyepieces for more than one observer
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/12—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices with means for image conversion or intensification
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/16—Housings; Caps; Mountings; Supports, e.g. with counterweight
- G02B23/18—Housings; Caps; Mountings; Supports, e.g. with counterweight for binocular arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Definitions
- the invention relates to a telescope with at least one viewing channel.
- the invention further relates to an observation and image capture system.
- the object of the present invention is to overcome the disadvantages of the prior art and to provide a device with which ease of use and customer benefit can be increased.
- This object is achieved by a telescope of the type mentioned at the outset in that it has at least one display which is visible in the at least one viewing channel, in particular faded into the at least one viewing channel, particularly preferably by reflection.
- the solution according to the invention allows additional information to be displayed for the user together with the image of a distant object generated by the telescope.
- the solution according to the invention therefore enables the user to enrich the generated image with data while viewing the distant object.
- the telescope advantageously has at least one camera.
- the telescope is set up to recognize objects in images recorded by the camera.
- Object recognition makes it possible to display information about the recognized objects to the user and superimpose it on the image of the object.
- the detection can be carried out using artificial intelligence methods such as neural networks.
- the telescope is set up to generate a virtual marking frame and display it on the display, whereby it is further set up to recognize at least one object displayed within the marking frame of the display. This makes it easier to display specific object-related information. If a user wants additional information about an object, for example an animal, in particular a bird, he can specifically capture the object with the telescope and have additional information displayed.
- Object recognition can occur automatically or through an additional action by the user, for example by actuating an actuating element, in particular a switch, or by the object under consideration appearing within the marking frame for a predetermined time.
- Automatic object recognition can be carried out, for example, after the user has pre-selected and activated this function.
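The dwell-time trigger described above, where recognition fires once the object has remained within the marking frame for a predetermined time, can be sketched as follows. The helper callables `object_in_frame` and `run_recognition` and the threshold values are hypothetical names, not taken from the patent:

```python
import time

def dwell_trigger(object_in_frame, run_recognition, dwell_s=1.5, poll_s=0.05):
    """Fire recognition once the object has stayed inside the marking
    frame for dwell_s seconds without interruption."""
    entered = None
    while True:
        if object_in_frame():
            if entered is None:
                entered = time.monotonic()  # object just entered the frame
            elif time.monotonic() - entered >= dwell_s:
                return run_recognition()    # dwell time reached
        else:
            entered = None                  # object left the frame: restart timer
        time.sleep(poll_s)
```

A real implementation would of course run this alongside the camera pipeline rather than in a blocking loop.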
- In addition to analyzing the image content of the captured images, it is possible to include other data in the evaluation. For example, it is advantageous to take location information of the recording site into account (e.g. using GPS), as this is associated with the probabilities of where the object to be detected occurs. It is also conceivable to measure the distance to the object in addition to taking the image; this can be done using a separate or integrated rangefinder. The actual size of the object can then be determined from the distance to the object and the apparent object size (e.g. number of pixels in the image). The object size determined in this way, or information about the date/time of the recording or the current weather situation, can also be taken into account as further parameters in the detection.
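The size estimate described above can be sketched with a simple pinhole-camera calculation; the parameter names and example units below are illustrative assumptions, not taken from the patent:

```python
def object_size_m(distance_m, object_px, focal_length_mm, pixel_pitch_um):
    """Estimate the actual object size from the measured distance and the
    object's extent in pixels on the image sensor (pinhole camera model)."""
    # Extent of the object's image on the sensor, in millimetres:
    sensor_extent_mm = object_px * pixel_pitch_um / 1000.0
    # Similar triangles: size / distance = sensor_extent / focal_length
    return distance_m * sensor_extent_mm / focal_length_mm
```

For example, an object 100 m away that covers 200 pixels on a 5 µm-pitch sensor behind a 50 mm camera lens works out to about 2 m.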
- the telescope has at least one memory with user- and/or topic- and/or location-specific information, in particular information about locally occurring animal species and/or field names and/or mountain names and/or POIs (Points of Interest), and/or the telescope has a data interface for data exchange with at least one external memory with such user- and/or topic- and/or location-specific information.
- the telescope is set up to calculate, based on a detected instantaneous movement of the telescope, the image sharpness that will be achieved for an image recorded with the camera when an actuating element is actuated, and to indicate to the user whether the recording can be made with a required image sharpness and/or whether the achievable image sharpness is suitable for automatic object recognition.
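One way to realize the sharpness prediction described above is to convert the detected angular rate into expected motion blur on the sensor. This is a minimal sketch under a small-angle approximation; the function names, the 2-pixel threshold and all units are assumptions for illustration:

```python
import math

def motion_blur_px(angular_rate_deg_s, exposure_s, cam_focal_mm, pixel_pitch_um):
    """Blur (in pixels) that the current angular rate of the telescope
    would smear across the sensor during the exposure."""
    blur_rad = math.radians(angular_rate_deg_s) * exposure_s
    # Small-angle approximation: arc length on the sensor = angle * focal length
    return blur_rad * cam_focal_mm * 1000.0 / pixel_pitch_um

def recording_possible(angular_rate_deg_s, exposure_s, cam_focal_mm,
                       pixel_pitch_um, max_blur_px=2.0):
    """Indicate to the user whether a recording with the required
    sharpness (here: at most max_blur_px of motion blur) can be made."""
    return motion_blur_px(angular_rate_deg_s, exposure_s,
                          cam_focal_mm, pixel_pitch_um) <= max_blur_px
```

The same threshold logic could feed a second, stricter limit for deciding whether the image is sharp enough for automatic object recognition.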
- the telescope has at least one camera focusing lens and at least one focusing lens arranged in the viewing channel, the telescope being set up to determine a relative position of an image center of a camera image relative to an image center of an image displayed in the at least one viewing channel based on a displacement of the focusing lenses.
- the telescope is set up to correct a deviation of the image centers, for example by shifting the camera image, such that corresponding image sections of the camera image and of the image displayed in the viewing channel come to lie in the center of both images.
- a field of view of the camera is larger than a field of view of the at least one viewing channel, with a field-side image section captured by an image capture sensor of the camera being larger than a field-side image section captured by the at least one viewing channel.
- the telescope is set up to detect at least one change in position of the telescope.
- the telescope can be set up to provide an object imaged in a current position and orientation of the telescope with a virtual marking and to save it.
- This variant of the invention makes it possible to link a specific object, for example a landscape mark, an animal, a mountain peak, a building, etc., with a specific position and orientation of the telescope.
- the telescope is set up to display at least one message when the orientation of the telescope changes relative to the orientation at which the virtual marking was set, the message showing the user the direction in which the virtual marker is located.
- This variant of the invention is particularly advantageous if the telescope is passed on from a first user, for example to a second user standing next to him, since this enables the second user to find and view the marked object very easily.
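The direction hint described above can be sketched as a comparison of the current telescope orientation with the stored marker orientation; the arrow labels, tolerance and angle conventions below are illustrative assumptions:

```python
def arrows_to_marker(current_az, current_el, marker_az, marker_el, tol_deg=1.0):
    """Return which direction arrows to show so the user can turn the
    telescope toward the stored virtual marker. Angles in degrees;
    azimuth wrap-around at +/-180 degrees is handled."""
    d_az = (marker_az - current_az + 180.0) % 360.0 - 180.0  # shortest turn
    d_el = marker_el - current_el
    arrows = []
    if d_az > tol_deg:
        arrows.append("right")
    elif d_az < -tol_deg:
        arrows.append("left")
    if d_el > tol_deg:
        arrows.append("up")
    elif d_el < -tol_deg:
        arrows.append("down")
    return arrows  # empty list: the marker is (nearly) centred
```

For example, with the telescope at azimuth 10° and the marker at 350°, the shortest turn is 20° to the left, so a left arrow would be shown.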
- the telescope can have a mode selection wheel, preferably arranged on a user-side end face of the telescope, for calling up at least one function of the telescope, different functions being called up in different positions of the mode selection wheel.
- This variant of the invention makes it particularly easy to set different modes of the telescope. For example, automatic image recognition can be selected using this mode dial.
- At least one position of the mode selection wheel can be assigned a function that can be selected by a user. This gives the user the opportunity to access preferred functions very quickly.
- a beam path is formed in each of the viewing channels through an objective, a focusing lens, a reversing system and an eyepiece.
- the camera beam path in the camera tube can be formed by a camera lens, a camera focusing lens and a camera eyepiece as well as an image capture sensor, the camera eyepiece being arranged between the camera focusing lens and the image capture sensor.
- the focusing lenses of the viewing channels and the camera focusing lens can be moved together by means of a focusing device.
- a “lens” can be understood to mean both a single lens and a lens system consisting of several lenses.
- a first joint part of the first tube and a second joint part of the second tube are arranged adjacent to a lateral surface of the camera tube.
- the camera tube can be designed with a spring arrangement for generating a pivoting resistance between the first joint part of the first tube and the second joint part of the second tube.
- the spring arrangement is arranged around the camera beam path and in particular comprises at least one wave spring.
- This variant of the invention is characterized by the fact that, despite the spring arrangement, the overall length of the camera channel and thus of the entire device can be kept very short.
- the at least one spring arrangement has at least one opening through which a rod runs for moving a focusing lens of the camera beam path.
- the camera tube is firmly connected to one of the two tubes and the camera tube and the tube connected to it can only be pivoted together relative to the other tube.
- the display can be arranged in at least one of the two tubes, preferably in the first tube that is firmly connected to the camera tube.
- a display can also be arranged in both viewing channel tubes.
- a camera is the combination of a lens with an imaging sensor and evaluation electronics that can convert electromagnetic radiation (e.g. in the UV, visible or IR spectral range) imaged through the lens onto a two-dimensional sensor (e.g. CCD, CMOS, microbolometer) into electrical image information. It is particularly advantageous if the telescope is set up to detect a pivot angle when the interpupillary distance is set by pivoting the first and second tubes relative to one another, and to carry out a position correction of the information shown on the display based on the detected pivot angle.
- the telescope can have a focusing button for adjusting a focusing; a center of gravity of the telescope can preferably lie in the area of the focusing button.
- an observation and image capture system in that it comprises at least one telescope according to the invention and at least one electronic terminal, the at least one telescope and the at least one electronic terminal being coupled to one another at least temporarily via a connection.
- the observation and image capture system has at least one application program that can be transferred from a server to the telescope by means of the terminal device, and/or that the application program, when installed on the telescope, can be accessed by means of the terminal device and/or is executable.
- functions or parameters of the application program can be changed via access with the terminal device to the at least one application program.
- the telescope is set up to show different information on the display depending on the selection of functions and parameters.
- a telescope 1 according to the invention has a viewing channel 2 or two viewing channels 2 and 3 and at least one camera 4.
- the telescope 1 has a display 5 that is visible in the viewing channel 2, in particular faded into the viewing channel 2, particularly preferably by reflection.
- a display is shown in each of the viewing channels 2, 3.
- the display 5 can be controlled by a controller 7 of the telescope 1 via a display driver 6.
- the controller 7 is preferably a programmable circuit, for example in the form of a processor, in particular a microprocessor or signal processor.
- the information shown on the display 5 is reflected in the viewing channel 2 in the embodiment shown.
- A concrete structural design of such a beam path of the viewing channel 2, to which a beam path of the display 5 is coupled for reflecting the display, is shown in Fig. 20.
- the Fig. 20 shows a longitudinal section of the viewing channel 2 of the telescope 1 together with the beam path of an LCoS display forming the display 5.
- the light from an LED 79 used for illumination is collimated via a condenser 80 and falls on a lens array 81.
- Each individual lens of this array produces a (reduced) intermediate image of the LED on a relatively large area that is independent of the LED size.
- Each of these images illuminates the entire display 5.
- the outline of the individual lenses is precisely mapped onto the display 5 and thus defines the illuminated area.
- the display 5 or the display generated on it is coupled into the viewing channel 2 on the mirror surface on the display prism 9 and is thus imaged in its image plane.
- a Schmidt-Pechan prism system with additional prisms for display reflection is provided.
- the individual LED images form a (relatively large) area in which the display 5 is visible; a large exit pupil, so to speak.
- When looking through an eyepiece 8 of the viewing channel 2, a user sees the superimposition of an image of a distant object and the display generated by the display 5.
- the display 5 is reflected via a display prism 9 ( Fig. 20 ) into a beam path of the viewing channel 2 and thus into the eye of a user.
- the light rays coming from the display 5 can be deflected by 90° by reflection at a diagonally extending interface 10 of the display prism 9, which is designed as a beam splitter cube, and thus directed in the direction of the eyepiece 8 and into the beam path of the viewing channel 2.
- the display 5 can be illuminated with a lighting device.
- the illumination device can have a light source, the light of which is directed onto the display 5, whereby it can first be focused/collimated by an illumination lens and polarized by a polarizer.
- Liquid crystal molecules of the display 5 can be aligned by means of an electrical voltage so that the light is reflected in the desired direction with the desired brightness. The alignment of the liquid crystal molecules required to generate an image on the display 5 is effected by the controller 7.
- Both the viewing channel 2 and the camera channel can have a cover glass on the object side or just the camera channel alone.
- the viewing channel 2 has an objective lens 82, a focusing lens 83, a reversing system 84 formed by prisms, a field lens 85, and the eyepiece lens 8.
- the above-mentioned optical elements form a first beam path in the viewing channel 2 for the enlarged display of a distant object.
- a second beam path is formed in the camera channel.
- optical elements include, adjoining the cover glass 74, an objective lens 75, a focusing lens 76, an eyepiece lens 77 ( Fig. 17 ) and a camera module 78 or the camera 4.
- the objective lens 75, the focusing lens 76 and the eyepiece lens 77 of the camera channel can together form an afocal lens system.
- the camera module or camera 4 is preferably formed as a unit with an electronic image capture sensor, its own lens and with an integrated autofocus function. If the telescope 1 has a second viewing channel 3, this can be optically constructed in the same way as the viewing channel 2.
- the telescope 1 can have a display unit 10 and/or several illuminated display segments, for example to display a charge status of an energy storage device 11 of the telescope 1.
- the display unit 10 can be illuminated in different colors, with different operating states of the long-range optical device 1 being able to be visualized.
- operational readiness or switching on of the telescope 1 can be signaled by lighting the display unit 10 with a first color.
- one or more electronic control elements 12, 13, 14, 15, for example control buttons, can be provided.
- the telescope can include several sensors 16, 17, 18, 19, 20, such as a geoposition detection sensor 16, in particular a GPS, GLONASS, Galileo or Beidou receiver. Furthermore, it has proven to be particularly advantageous if the telescope also has a brightness sensor 17, an electronic compass 18, an inclination and/or gyro sensor 19, for example a gyroscope, and a sensor 20 for detecting a linear displacement of the focus lenses 76, 83 ( Fig. 17 , 20 ).
- a geoposition detection sensor 16 in particular a GPS, GLONASS, Galileo or Beidou receiver.
- the telescope also has a brightness sensor 17, an electronic compass 18, an inclination and/or gyro sensor 19, for example a gyroscope, and a sensor 20 for detecting a linear displacement of the focus lenses 76, 83 ( Fig. 17 , 20 ).
- the Fig. 21 shows a detail of the viewing channel 3 of the telescope 1 according to the exemplary embodiment Fig. 12 , partially cut and shown in perspective.
- the sensor 20 formed by a linear sensor is soldered to a circuit board. On the one hand, it is aligned with the housing of the viewing channel 2 via two dowel pins and is attached to the housing with a screw. On the other hand, a plunger of the linear sensor engages in a groove in the mount of the focusing lens 83 and is taken along during the focusing movement. The relative displacement can be read out electronically from the sensor 20.
- the sensor 20 is formed by a linear sliding potentiometer.
- the beam paths of the camera channel and the viewing channel 2 or the viewing channels 2, 3 run approximately parallel, so that the center of the camera image when focusing on a distant object corresponds to the center 30 of the image of the viewing channel 2, 3 ( Fig. 3 ).
- the positions of the image centers between the viewing channel 2 and the camera channel are compared with each other.
- Fig. 22 shows two images of the field of view as they appear to a user when looking through the viewing channel 2 of the telescope 1. Superimposed on these images are, on the one hand, a field of view edge 86 of the viewing channel 2, indicated by a circle, and, on the other hand, a field of view edge 87 of the image capture sensor of the camera 4. In contrast to the situation "object at infinity" (left illustration), in the situation "object at close range" (right illustration) there is a shift of the field of view edge 87 of the image capture sensor relative to the field of view edge 86 of the viewing channel 2. In addition, an edge 88 of the display 5 is also shown.
- The closer the viewed object, the greater this deviation; it is caused by the design-related offset of the two channels relative to one another (parallax).
- the corresponding deviation of the image centers leads to application errors if, for example, images are centered and recorded using displayed markings.
- a correction is also necessary in the case of autofocus limited to a specific image area or exposure correction or in applications which superimpose or assign object-related information to the viewed image by displaying it in the viewing channel 2, 3.
- For this purpose, the telescope 1 is set up to determine a relative position of an image center of a camera image relative to the image center 30 of an image displayed in the at least one viewing channel 2 based on a displacement of the focusing lenses 83, 76 ( Fig. 17, 20, 21 ) and to correct an image center deviation (parallax correction).
- From the position of the focusing lenses 83, 76 it is quite easy to infer the distance from the telescope 1 to the object being viewed and focused on, and to calculate the parallax-related displacement from this.
- an area of the camera image can be shifted so far that the center of the shifted camera area again corresponds to the center of the image in the viewing channel 2, 3.
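The parallax correction described above can be sketched in two steps: map the measured focusing-lens position to object distance, then project the fixed offset between the channels onto the camera sensor. The calibration callable `pos_to_distance_m` and all parameter names are illustrative assumptions:

```python
def parallax_shift_px(lens_pos_mm, pos_to_distance_m, baseline_mm,
                      cam_focal_mm, pixel_pitch_um):
    """Parallax shift (in pixels) of the camera image centre relative to
    the viewing channel, derived from the focusing-lens displacement.
    pos_to_distance_m maps lens position to object distance in metres."""
    distance_mm = pos_to_distance_m(lens_pos_mm) * 1000.0
    # Channel offset seen from the object, projected through the camera lens:
    shift_mm = baseline_mm * cam_focal_mm / distance_mm
    return shift_mm * 1000.0 / pixel_pitch_um
```

With an assumed 30 mm channel offset, a 50 mm camera lens, a 5 µm pixel pitch and an object at 10 m, the correction would amount to about 30 pixels; at infinity it tends toward zero, matching the behaviour shown in Fig. 22.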
- the telescope 1 can have at least one interface 21 for data transmission to an external device, in particular a mobile radio device or to a second telescope 1.
- the interface 21 is preferably an interface for wireless data transmission.
- the telescope 1 can also have a data interface for wired data exchange, for example a USB interface.
- the telescope 1 can have a WLAN module and/or a mobile radio module, for example a GSM module, and/or a Bluetooth module or an NFC module. Parameters and/or functions can be transmitted from the electronic terminal to the telescope 1 and vice versa via a wireless connection between the telescope 1 and an electronic terminal.
- the telescope 1 can have one or more memories 22, which the controller 7 can access.
- images can be stored in a sub-area of this memory 22, while application programs can be stored in other sub-areas, which can be loaded into a main memory of the controller 7 if necessary.
- Partial areas of the memory 22 can also contain data recorded by the sensors 16, 17, 18, 19, 20.
- user and/or topic and/or location-specific information in particular information about locally occurring animal species and/or field names and/or mountain names and/or POIs, can be stored in the memory 22.
- the telescope 1 retrieves this information from an external memory, for example a server, via a data connection, for example a wireless data connection, using the interface 21.
- the telescope 1 can be set up to recognize objects in images recorded by the camera 4.
- a corresponding image recognition program can be executed by the controller 7. For example, it can be determined which object it is based on data stored in the memory 22.
- the controller 7 controls the display of the information shown on the display 5. As shown in Fig. 2, as soon as an object is recognized, the name 23 of this object can be shown on the display 5. Furthermore, icons 24, 25, 26 can also be displayed for currently activated settings or status displays.
- As can also be seen from Fig. 2, only an edge area of the display 5 is used to display additional information in order to allow the user to observe as unhindered as possible.
- a boundary 27 of this edge area of the display 5 is only indicated by a frame shown in dashed lines.
- the telescope 1 is set up so that image recognition can be triggered as soon as an object is within this inner boundary 27 of the display 5. This is achieved, for example, by actuating one of the operating elements 12 to 15. Alternatively, the image recognition can of course also be triggered automatically as soon as an object is within the boundary 27.
- the information shown on the display 5 can depend on the currently selected and executed function of the telescope 1.
- information 28 regarding the inclination of the telescope and compass data 29 can be shown on the display 5 and superimposed on the image of the object being viewed.
- the user has the opportunity to call up and change different settings directly on the telescope 1.
- For this purpose, a menu is shown to him on the display 5 ( Fig. 2 ).
- Via a menu item 32, for example, he can select that he would like to change the compass settings.
- The entry of a declination can then, for example, be selected at a menu item 34.
- the declination is advantageously entered via a corresponding input field 35 of a smartphone 36 coupled to the telescope 1.
- the use of the smartphone 36 is advantageous in that it makes it easier to enter more complex character sequences.
- An example of the detection and recognition of an object 38, in the form of a falcon, lying partially in a frame 37 arranged within the boundary 27 is shown.
- the icon 24 displayed above the object 38 shows the user that the wireless connection function (WLAN, Bluetooth, etc.) is activated.
- the icon 25, in the shape of a bird, means that a bird detection mode is currently active.
- a charge level of the energy storage 11 is displayed via the icon 26.
- the border 27 and the frame 37 make it easier for the user to move the telescope 1 so that the object 38 comes to rest in the center of the image.
- an actuating element for example one of the actuating elements 12 - 15, in particular in the form of a button, the user can activate different functions.
- Which function is carried out depends on the duration of the actuation and the force with which the actuating element is actuated. For example, by lightly pressing the actuating element, the frame 37 can be displayed. When the pressure is increased, a photo can then be taken, for example, or a video recording can be started by pressing twice. Another possibility would be that object detection is triggered depending on the pressure on, and duration of actuation of, the actuating element.
- the actuation element can have a first measuring device, with a first function being able to be carried out with a first actuation duration and a second function different from the first function being able to be carried out with a second actuation duration that is different from the first actuation duration.
- the actuating element comprises a measuring device, wherein a first function can be carried out for a sequence of a first actuation duration, at least a first time interval and at least a second actuation duration, and a second function different from the first function can be carried out for a sequence of a third actuation duration, at least a second time interval and at least a fourth actuation duration.
- the actuation element comprises a further measuring device, with a first function being able to be carried out with a first actuation force and a second function different from the first function being able to be carried out with a second actuation force which is different from the first actuation force.
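The duration- and sequence-based dispatch described above can be sketched as a small classifier over measured press durations and inter-press gaps; the thresholds and function labels are illustrative assumptions, not values from the patent:

```python
def classify_press(press_durations_s, gaps_s=(), long_threshold_s=0.8,
                   double_gap_s=0.4):
    """Map a sequence of button press durations (seconds) and the gaps
    between presses to one of the example functions in the text."""
    # Two presses separated by a short gap: double press -> start video
    if len(press_durations_s) == 2 and gaps_s and gaps_s[0] <= double_gap_s:
        return "start_video"
    if len(press_durations_s) == 1:
        if press_durations_s[0] >= long_threshold_s:
            return "take_photo"   # long/firm press
        return "show_frame"       # light/short press
    return "ignore"
```

An actuation-force variant would work the same way, classifying the measured force against one or more thresholds instead of (or in addition to) the duration.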
- the object detection is carried out, triggered by the user or automatically with the appropriate presetting.
- Fig. 6 shows what the user sees through the eyepiece of the viewing channel 2, 3 after the object has been recognized.
- the name 23 of the bird is displayed below object 38, while a status bar with the icons 24, 25, 26 is displayed above the object.
- Fig. 7 and Fig. 8 show an embodiment of the invention, in which the object 38, which lies partially within an area 39 around the center of the image 30, is provided with a virtual marking 40.
- the spatial orientation of the telescope 1 is recorded and stored in a current position in which it images the object 38.
- the virtual marking 40 can thus be assigned a specific position (alignment and location) of the telescope 1. If the marked object 38 is outside the center of the image 30 or the area 39, in particular outside the viewer's field of vision, the offset between the virtual marking 40 and the center of the image 30 can be indicated by means of display elements 41, in particular in the form of arrows.
- the current orientation of the telescope is determined via sensors built into the telescope, which are suitable for determining the current orientation and inclination of the optical axis of a viewing channel, in particular the viewing channel with display reflection. This can be done, for example, using an electronic compass and an inclination sensor.
- an electronic compass and an inclination sensor typically have inaccuracies in the range of ±5-10° and are therefore only of limited use.
- For relative orientation measurement, combined sensors are more suitable; these provide much more precise results from a fusion of several different sensor units, for example from the fusion of a three-axis gyro sensor with a three-axis acceleration sensor.
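As a simple stand-in for the sensor fusion mentioned above, one update step of a complementary filter can be sketched: the gyro rate is trusted over short time scales, the accelerometer-derived inclination over long ones. This is an illustrative example only; a real device might use a Kalman or Madgwick filter instead:

```python
def complementary_filter(prev_angle_deg, gyro_rate_deg_s, accel_angle_deg,
                         dt_s, alpha=0.98):
    """One fusion step: integrate the gyro rate for short-term accuracy,
    then blend toward the drift-free accelerometer angle."""
    gyro_angle = prev_angle_deg + gyro_rate_deg_s * dt_s  # short-term: gyro
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle_deg
```

Called once per sensor sample, the filter suppresses both gyro drift and accelerometer noise, which is what makes the fused orientation precise enough for placing and re-finding virtual markings.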
- a field of view of the camera 4 or the camera channel can be larger than a field of view of the viewing channel 2, 3.
- a field-side image section captured by the image capture sensor of the camera 4 is larger than a field-side image section captured by the viewing channel 2, 3.
- Observation and image capture system 44 shown includes the telescope 1 with application programs 46, 47, 48, 49 installed thereon, and an external terminal 45, for example in the form of a smartphone.
- the external terminal 45 can access some of the application programs 46, 47, 48, 49 installed on the telescope 1 via a connection, in particular a wireless connection, and vice versa.
- Application programs 50 and 51 are also installed on the terminal 45.
- the application programs 50, 51 installed on the terminal 45 interact with application programs 46 and 48 installed on the telescope 1.
- the application programs 50 and 46 as well as 51 and 48 each form a combined application program 52, 53.
- parameters and/or functions of the application programs 46, 48 can be created or edited on the terminal 45 using the application programs 50, 51, with parameters and/or functions being transferable from the terminal 45 to the telescope 1 and vice versa.
- programming interfaces are provided for access from the application programs 50, 51 of the external terminal 45 to the application programs 46, 47, 48, 49 of the telescope 1.
- a grouping of several programming interfaces can be provided for specific use cases.
- the transmission technology required for the respective use of the telescope 1 for communication between the telescope 1 and the external terminal 45 can also be specified.
- the application program 50, 51 on the external terminal 45 also has a key which regulates which set of programming interface groups and associated transmission technology it is allowed to access. Communication is intended to take place primarily via BLE (Bluetooth Low Energy) in order to save power.
- the observation and image capture system 44 of the telescope 1 is in particular designed to provide common functionalities to the outside (to the application programs 50, 51 of the external terminal 45). Depending on the application program 46, 47, 48, 49 of the telescope 1 that has just been started, a corresponding functionality is provided and the application program 50, 51 of the external terminal 45 is notified of this.
- the observation and image capture system 44 of the telescope 1 also ensures authorization management for the sets of programming interface groups and associated transmission technology. Appropriate connection management (WiFi or BLE) is also carried out.
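The key-based authorization of programming interface groups, each tied to a transmission technology (WiFi or BLE), could look like the following sketch. Group names, the key registry and the bandwidth assignments are hypothetical assumptions:

```python
# Hypothetical API groups of the telescope, each with an associated
# transmission technology: WiFi for bandwidth-heavy use, BLE to save power.
API_GROUPS = {
    "live_stream": {"transport": "wifi"},
    "image_sync":  {"transport": "wifi"},
    "status":      {"transport": "ble"},
    "settings":    {"transport": "ble"},
}

# Assumed registry mapping an application key to its permitted groups.
KEY_REGISTRY = {
    "key-bird-app": {"status", "settings", "image_sync"},
}

def authorize(key: str, group: str):
    """Return the transport to use for this group, or None if the
    application key does not grant access to the group."""
    if group not in KEY_REGISTRY.get(key, set()):
        return None
    return API_GROUPS[group]["transport"]
```

An app holding `key-bird-app` could thus sync images over WiFi but would be denied access to the live-stream group entirely.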
- the controller 7 is programmed in such a way that after successful transmission of an image to the external terminal 45, this image is automatically deleted from the memory 22. No separate user intervention is required. This allows the memory 22 to be used very economically. In addition, it can also be provided that - if there is already a connection to the external terminal 45 during recording - the image is automatically transmitted to the latter. This functionality can also be provided for several clients at the same time (multiple terminals 45).
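The transmit-then-delete behaviour of the controller 7 can be sketched as follows, where `transmit` stands in for a hypothetical transfer callback that returns True only on confirmed receipt by the terminal 45:

```python
def sync_images(storage: dict, transmit) -> list:
    """Transmit each stored image to the terminal; delete from the
    memory only those images whose transfer was confirmed."""
    sent = []
    for name in list(storage):            # copy keys: we delete while iterating
        if transmit(name, storage[name]): # True only on confirmed receipt
            del storage[name]             # free the memory 22 automatically
            sent.append(name)
    return sent
```

Images whose transmission fails remain in storage and can be retried on the next connection, so no recording is lost to a dropped link.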
- Fig. 11 shows a screen of the mobile terminal 45 with several application programs 50, 51, which together with application programs 46, 47 installed on the telescope 1 form the combined application programs 52, 53.
- parameters 54, 55 of one of the combined application programs 52, 53 can then be changed, for example.
- parameter selection and/or function settings can also be transmitted from a terminal 45 to a plurality of telescopes 1 and vice versa.
- the function settings can be, for example, switching on the telescope 1, switching off the telescope 1, coupling the telescope 1 with the terminal 45, downloading recorded images or videos from the telescope 1, taking an image or recording an image sequence or a video, which are executed or started by actuating the actuating element, etc.
- the selection of a plurality from the group of parameters and functions can take place immediately one after the other.
- the program execution of a number of selected functions can take place largely in parallel or with a time delay.
- a first combined application program 52 can be a first mobile application which offers the functions of live streaming, image management and importing updates for the firmware of the telescope 1.
- Live streaming involves a real-time transmission of an image or video recorded by the camera 4 to the coupled terminal device 45. It is also possible to link not just a single terminal device but several terminal devices at the same time, so that several people can view the live stream simultaneously.
- a second combined application program 53 can be an identification application for birds. Based on an image of a bird recorded by the camera 4, which is transmitted to the electronic terminal 45, the type of bird can be recognized using an image database (not shown) and an image recognition algorithm. Subsequently, the type of bird can be output on a display device of the electronic terminal 45. It is also conceivable that additional information, such as a description of the species, a bird song and/or a representation of the geographical occurrence, is output on the mobile terminal 45.
- Such a program for bird recognition can also be an application program 47, 49 that is only present on the telescope 1 and which functions autonomously and independently of the external terminal 45 and carries out the bird recognition.
- a third mobile application 51, 52, which can likewise be implemented by means of a combined application program or by means of an autonomously functioning application program 47, 49 installed on the telescope, can be an identification application for mountains, in which the names of the mountain peaks are output on the basis of a recorded image of the mountains.
- in a fourth mobile application, which can be implemented using a combined application program 50, 51, it is also conceivable that a recorded image, an image sequence or a video is shared with a second operator, with a transmission taking place to a second electronic terminal (not shown).
- the application programs 50, 51 of the terminal 45 and the application programs 46, 47, 48, 49 of the telescope 1 can preferably also be installed by downloading from an external server.
- the telescope 1 has a mode selection wheel 56, preferably arranged on a user-side end face of the telescope 1, for calling up at least one function of the telescope 1.
- Different functions are called up in different positions of the mode selection wheel 56.
- the function whose icon is located at a defined reference point after turning the mode selection wheel 56 is the one currently called up.
- in another position, another function is called up.
- at least one position of the mode selection wheel 56 is intended for a function that can be freely selected by a user.
- the user can, for example, access a program for assigning functions via the terminal 45 and assign a function he prefers to the freely assignable position on the mode selection wheel 56.
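The freely assignable wheel position can be sketched as a position-to-function map with one slot configurable from the terminal 45. The position numbers and function names are assumptions for illustration:

```python
# Hypothetical mapping of mode-selection-wheel positions to functions;
# one position ("custom") is freely assignable from the paired terminal.
wheel_map = {
    0: "off",
    1: "photo",
    2: "video",
    3: "bird_id",
    4: "custom",      # freely assignable position
}
custom_function = "mountain_id"   # assumed assignment made via the terminal 45

def function_at(position: int) -> str:
    """Resolve a wheel position to the function actually called up."""
    name = wheel_map[position]
    return custom_function if name == "custom" else name
```

Reassigning `custom_function` from the terminal changes what position 4 does without touching the fixed positions.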
- a (colored) status LED 91 for displaying different operating modes of the telescope 1 is arranged on the user-side end face of the telescope 1, adjacent to the mode selection wheel 56. This makes it possible for a user to be able to recognize operating states even without looking through the viewing channels 2, 3.
- the telescope 1 can be designed as a binocular with a first tube 57 and a second tube 58, with the first viewing channel (designated 2 in Fig. 1) running through the first tube 57 and the second viewing channel (designated 3 in Fig. 1) running through the second tube 58.
- the two tubes 57, 58 are connected to one another by an articulated bridge 59. To set the distance between the eyes, the two tubes 57, 58 can be pivoted about a joint axis of the joint bridge 59.
- a camera tube 60 containing the camera beam path is provided.
- the camera tube 60 forms the joint axis of the joint bridge 59.
- for focusing, the telescope 1 has a focusing ring 70.
- the center of gravity of the telescope 1 is advantageously located in the area of the focusing ring 70, which means that particularly good handling can be achieved.
- the joint bridge 59 has a first joint part 61, firmly connected to the first tube 57, and a second joint part 62, firmly connected to the second tube 58.
- the two joint parts 61 and 62 are connected to one another by the camera tube 60 shown in Fig. 15.
- the first joint part 61 of the first tube 57 and the second joint part 62 of the second tube 58 rest on a lateral surface of the camera tube 60.
- a geometric bending axis of the two tubes 57 and 58 runs within the camera tube 60.
- the bending axis and an optical axis of the camera beam path are arranged coaxially to one another.
- the camera tube 60 is firmly connected to the tube 57 and can be pivoted together with it.
- the display 5 is arranged in one of the two tubes 57, 58, preferably in the tube connected to the camera tube 60.
- the camera tube 60 can be designed with a spring arrangement 63, as shown in Figures 15 and 16.
- the spring arrangement 63 is arranged around the camera beam path and includes a wave spring 64.
- the spring arrangement 63 has an opening 65 through which the push rod (reference number 67 in Figs. 17 and 18) for moving a focusing lens (reference number 76 in Fig. 17) of the camera beam path passes.
- a section of the spring arrangement 63, protruding in the direction of the lens from an element surrounding the camera tube 60 with a passage opening for the push rod, is supported in the mounted state against a section of the joint part 61 and generates a pivoting resistance when the two tubes 57, 58 are bent.
- Fig. 17 shows the optical system of the camera channel.
- where lenses such as an eyepiece lens, an objective lens or a focusing lens are discussed below and the term “lens” is used in the singular, this should not be understood restrictively: a system of several lenses may also be meant. This is common practice in technical optics to avoid or compensate for imaging errors.
- the camera channel has a cover glass 74 on the object side and, connected to the cover glass 74, a lens 75 and a focusing lens 76 as well as an eyepiece 77 and a camera module 78.
- the objective 75, the focusing lens 76 and the eyepiece 77 of the camera channel together form an afocal lens system.
- the camera module 78 is preferably formed as a unit with an electronic image capture sensor, its own lens and with an integrated autofocus function.
- the focusing lens 76 can be moved using the push rod 67.
- Fig. 18 shows parts of the adjustment mechanism 66 for moving the focusing lenses of the camera beam path and the viewing beam paths 2, 3.
- the adjustment mechanism 66 has a push rod 67 and two drivers 68, 69 for jointly moving the focusing lenses of the viewing channels 2, 3 and the focusing lens of the camera channel.
- the push rod 67 of the adjustment mechanism 66 is coupled to corresponding displaceable lens mounts 71, 72, 73 of the viewing channel 2 or 3 and the camera channel by means of the first driver 68 and the second driver 69.
- the focusing ring 70 can act on the adjustment mechanism 66 via corresponding control grooves (not shown) in such a way that the push rod 67 is moved parallel to the optical axes of the viewing channels 2, 3 and the camera channel.
- the focusing lenses of the viewing channels 2, 3 on the one hand and the focusing lens of the camera channel on the other hand are ultimately displaced in the axial direction.
- the focusing ring 70, the push rod 67 and the two drivers 68, 69 thus form a focusing device by means of which the focusing lenses of the viewing channels 2, 3 and the focusing lens of the camera channel can be moved together.
- the joint displacement of the focusing lenses also causes an axial displacement of the image planes of the distant object in the beam path of the camera channel.
- This shift of the image planes in the camera channel has the effect of a presetting or a rough setting of the image sharpness of the camera channel.
- a subsequent fine adjustment of the image sharpness is finally brought about by an autofocus function of the camera 4 or the camera module 78.
- the lens whose position can be changed by the autofocus function of the camera module 78 is automatically adjusted so that a sharp image of the distant object is formed on the light-sensitive sensor surface.
- the automatic focusing of the image in the camera channel with the autofocus function of the camera 4 or the camera module 78 is preferably started immediately after the actuation element is actuated to trigger an image recording.
- the activation of the autofocus function of the camera 4 can also be triggered programmatically by the controller 7.
- movements of the focusing lens can be monitored by the controller 7 with the help of sensors that may be provided.
- the autofocus function can be triggered.
- the automatic triggering of autofocusing after completion of manual focusing also has the advantage that when an image/video recording is triggered by pressing the control button, another autofocusing can be omitted. This significantly speeds up the entire shooting process, as the time between the shutter release and the actual image capture is noticeably shortened.
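One way to detect the completion of manual focusing from the sensor-monitored focusing-lens positions is a settling criterion: trigger autofocus once the position has stopped changing for a few consecutive readings. The sample-based criterion and its parameters are assumptions for illustration:

```python
def autofocus_trigger(positions, settle_samples=3):
    """Return the sample index at which autofocus would be started:
    the first index where the focusing-lens position has been unchanged
    for `settle_samples` consecutive readings (end of manual focusing).
    Returns None while the user is still focusing."""
    unchanged = 0
    for i in range(1, len(positions)):
        unchanged = unchanged + 1 if positions[i] == positions[i - 1] else 0
        if unchanged >= settle_samples:
            return i
    return None
```

Because autofocus has already run by the time the release button is pressed, the capture itself needs no further focusing pass, which is the speed-up described above.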
- the image quality or the image sharpness is also influenced by the state of motion of the telescope 1 at the time the image is taken.
- this so-called freehand limit is only a rough guideline; the image quality is also influenced by other factors, such as the camera resolution, the size of the pixels, the lens quality and the hand-holding technique, i.e. how still and stable the telescope 1 is held by the user during the recording.
- the holding technique for binocular users can vary greatly between beginners and professionals.
- the holding technique is the most important influencing factor.
- it is therefore essential to match the exposure time to the hand tremor.
- the movement of the device is detected in the telescope 1 with an acceleration sensor or with the gyro sensor 19 (Fig. 1).
- the movement status of the device can thus be measured in the millisecond range as well as in the sub-millisecond range and the image recording is monitored on this basis.
- a further application program is provided in the controller 7, which carries out the necessary evaluations of the sensor signals from the acceleration sensor or the gyro sensor 19.
- based on the results of comparisons of the exposure time with limit values at which images with sufficient image sharpness can be expected, the user receives appropriate information.
- These instructions or warnings from the image recording monitoring program provide the user with assistance in making the decision to trigger an image recording.
- the amplitude of the movement caused by hand tremor should remain below the linear extent of two pixels of the image capture sensor of the camera 4 for the duration of the exposure (±1 pixel).
- this corresponds to 13.1″ (arc seconds).
- the maximum exposure time when panning the binoculars, if no motion blur caused by panning is desired, is, for example, 1/200 s at a panning movement of 0.727°/s during exposure.
- the current movement is detected (e.g. as a maximum over a past interval of 1 s) and, based on this and the desired image sharpness, the value of the maximum exposure time is calculated.
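This calculation follows directly from the figures given in the text: with a blur limit of 13.1 arc seconds (about two sensor pixels) and a detected panning rate of 0.727 °/s, the maximum exposure time comes out at roughly 1/200 s. A sketch of the arithmetic:

```python
def max_exposure_s(angular_velocity_deg_s: float,
                   blur_limit_arcsec: float = 13.1) -> float:
    """Maximum exposure time so that the image moves by less than the
    blur limit (here ~2 sensor pixels = 13.1 arc seconds) during the
    exposure: t_max = blur_limit / angular_velocity."""
    return (blur_limit_arcsec / 3600.0) / angular_velocity_deg_s
```

Doubling the detected movement halves the permissible exposure time, which is why the monitoring program re-evaluates the limit against the current motion before each shot.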
- a symbol appears on the display to show the user whether the picture can be taken with sufficient exposure time.
- a warning is displayed to the user if the exposure time is insufficient.
- an exposure time is specified and when the shutter button is pressed, the image sharpness to be achieved is calculated based on the detected instantaneous movement.
- a symbol appears on the display to show the user whether the picture can be taken with the desired image sharpness. The user is thereby informed, for example, whether the captured image will be suitable for automatic object recognition.
- object recognition is completely prevented in such a case by the monitoring program for image recording.
- a series of images are taken in quick succession.
- the movements detected during this can then form the basis so that, for example, only the image with the greatest sharpness is saved. Or the images are ranked according to the value of the image sharpness achieved.
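The selection and ranking of burst frames by the motion recorded during each exposure can be sketched as follows; the frame representation (a dict with a per-frame motion amplitude) is an assumption:

```python
def rank_burst(frames):
    """Rank burst frames by the motion amplitude recorded during each
    exposure: smaller motion, hence expected sharper image, comes first."""
    return sorted(frames, key=lambda f: f["motion_deg_s"])

def best_frame(frames):
    """The frame taken during the least device movement."""
    return rank_burst(frames)[0]
```

Keeping only `best_frame(...)` implements the "save only the sharpest image" variant; storing the full `rank_burst(...)` order implements the ranking variant.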
- the state of movement is checked after pressing the release button and the image is only recorded when the movements are small enough for the selected exposure time.
- a maximum possible delay is preferably provided (e.g. 0.5 s). Above all, the negative influence of the hand tremor caused by pressing the release button can thereby be avoided.
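The delayed release can be sketched as waiting, within the allowed delay window, for the first motion sample below the threshold. Treating the samples as a discrete list and falling back to the end of the window when the device never settles are assumptions of this sketch:

```python
def delayed_release(motion_samples, threshold_deg_s, max_delay_samples):
    """Index of the first motion sample (within the allowed delay window)
    small enough for the selected exposure time; if the device never
    settles, fall back to the last sample of the window."""
    for i, m in enumerate(motion_samples[:max_delay_samples]):
        if m <= threshold_deg_s:
            return i
    return max_delay_samples - 1  # assumed fallback: record at maximum delay
```

With a sample rate of, say, 100 Hz, a window of 50 samples corresponds to the 0.5 s maximum delay mentioned above.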
- the telescope 1 is set up to detect a tilt angle of the tube with the reflected display relative to the horizontal (arising when the interpupillary distance is set by pivoting the first tube 57 and the second tube 58 relative to one another, or when the entire device is slightly tilted) and to carry out, based on the detected angle, a position correction of the information 23, 24, 25, 26 shown on the display 5.
- the controller 7 can be set up to rotate a representation on the display, based on data received from an inclination sensor detecting the tilt angle, in such a way that an upright display is shown to a user looking through the viewing channel 2, 3, even when the interpupillary distance is changed or the entire device is slightly tilted.
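The position correction amounts to rotating the overlay coordinates by the negative of the detected roll angle, so the reflected information stays upright. The coordinate convention (points relative to the display centre, angle in degrees) is an assumption:

```python
import math

def correct_overlay(points, roll_deg):
    """Rotate display overlay coordinates (relative to the display centre)
    by the negative of the detected roll angle, keeping the reflected
    information upright for the observer."""
    a = math.radians(-roll_deg)
    c, s = math.cos(a), math.sin(a)
    return [(x * c - y * s, x * s + y * c) for x, y in points]
```

A point drawn to the right of centre, for example, moves downward on the panel when the tube rolls 90° counter-clockwise, so that it still appears to the observer's right.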
- Fig. 23 shows a perspective cross-section of the telescope 1 according to Figs. 12 and 13.
- a cable harness 89 is shown as a detail of the internal line connection of the telescope 1.
- the energy storage 11 (battery), as the largest component, is housed in the viewing channel 2 (left tube), while the mainboard (controller 7), as the second largest component, is placed in the viewing channel 3 (right tube).
- a total of ten watts of power must be transmitted.
- five signal-transmitting lines are used.
- on the one hand the bending movement and, on the other hand, the central camera tube 60 must be taken into account structurally.
- the camera tube 60 stands in the way of the cable harness 89 being passed through directly.
- the wiring harness 89 is therefore routed around the camera tube 60.
- a change in the length of the cable over the extent of the bending movement of the two housing parts must also be taken into account or compensated for.
- sufficient sealing of the cable bushings in the two housing tubes must also be ensured.
- Highly flexible cables are selected for the cable harness 89, with the sheathing of the wires being made of an adhesive material.
- a cable channel 90 is formed, in which a section of the cable harness 89 forming a loop can be stowed with minimal kinking.
- the two live lines are divided into four lines to achieve more flexibility.
- These now nine lines of the cable harness 89 are covered in the outer area by a thin, highly flexible shrink tube.
- the strands are guided into the two tubes through sealing bushings. To do this, the strands are first cast tightly in the sealing bushings. These sealing bushings are then inserted into the housing with the cable from the outside to the inside and sealed tightly.
- the wiring harness 89 is screwed into both tubes and thus strain-relieved. As thermal protection, a section of the wiring harness 89 that runs above an IC is protected against high temperatures (>80 °C) by a braided hose.
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| ATA50715/2022A AT526577B1 (de) | 2022-09-16 | 2022-09-16 | Fernrohr mit zumindest einem Sichtkanal |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP4339682A1 true EP4339682A1 (fr) | 2024-03-20 |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5963369A (en) * | 1996-06-03 | 1999-10-05 | Steinthal; Gregory | Digital solid-state binoculars |
| US20020034004A1 (en) * | 2000-07-14 | 2002-03-21 | Stereovision Imaging, Inc. | Optically multiplexed hand-held digital binocular system |
| US20030063209A1 (en) * | 2001-09-28 | 2003-04-03 | Asahi Kogaku Kogyo Kabushiki Kaisha | Optical viewer instrument with photographing function |
| US20120098972A1 (en) * | 2010-10-22 | 2012-04-26 | Flir Systems, Inc. | Infrared binocular system |
| US20120162775A1 (en) * | 2010-12-23 | 2012-06-28 | Thales | Method for Correcting Hyperstereoscopy and Associated Helmet Viewing System |
| EP3037863A1 (fr) * | 2014-12-23 | 2016-06-29 | Carl Zeiss Sports Optics GmbH | Appareil optique numerique comprenant un pont coude |
| WO2016157923A1 (fr) * | 2015-03-30 | 2016-10-06 | ソニー株式会社 | Dispositif de traitement d'informations et procédé de traitement d'informations |
Legal Events

| Date | Code | Title |
|---|---|---|
| | PUAI | Public reference made under Article 153(3) EPC to a published international application that has entered the European phase |
| | AK | Designated contracting states: AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR |
| 2024-09-13 | 17P | Request for examination filed |
| 2025-11-14 | 17Q | First examination report despatched |