
WO2024121597A1 - Acquisition optical device - Google Patents

Acquisition optical device

Info

Publication number
WO2024121597A1
Authority
WO
WIPO (PCT)
Prior art keywords
optical
reflecting
reflecting element
input point
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2022/061842
Other languages
French (fr)
Inventor
Sabino Pisani
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Visutek 3d GmbH
Original Assignee
Visutek 3d GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Visutek 3d GmbH filed Critical Visutek 3d GmbH
Priority to PCT/IB2022/061842 priority Critical patent/WO2024121597A1/en
Publication of WO2024121597A1 publication Critical patent/WO2024121597A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B17/00Systems with reflecting surfaces, with or without refracting elements
    • G02B17/02Catoptric systems, e.g. image erecting and reversing system
    • G02B17/06Catoptric systems, e.g. image erecting and reversing system using mirrors only, i.e. having only one curved mirror
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/18Arrangements with more than one light path, e.g. for comparing two specimens
    • G02B21/20Binocular arrangements
    • G02B21/22Stereoscopic arrangements
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/56Accessories
    • G03B17/565Optical accessories, e.g. converters for close-up photography, tele-convertors, wide-angle convertors
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/08Stereoscopic photography by simultaneous recording
    • G03B35/10Stereoscopic photography by simultaneous recording having single camera with stereoscopic-base-defining system
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/218Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Studio Devices (AREA)

Abstract

An acquisition optical device comprises an optical unit (2) and a single optical sensor (3). The optical unit (2) comprises in turn a pair of input points for respective optical signals (10a, 10b), and a plurality of reflecting elements configured to guide the optical signals (10a, 10b) on the optical sensor (3). The optical sensor (3) is configured to simultaneously acquire all the optical signals (10a, 10b) guided by the optical unit (2).

Description

DESCRIPTION
ACQUISITION OPTICAL DEVICE
Field of the art
The present invention relates to the technical field of optical devices.
In more detail, the present invention refers to an acquisition optical device, specifically implementable for acquiring images.
State of the art
As is known, the neurophysiological ability of "stereoscopic" or three-dimensional vision in human beings results from specific physiological characteristics of the optical system and from how the nervous system reads and processes the visual signals perceived by each of the two eyes. On the other hand, the "artificial" image generation and display devices historically created (technological but also artistic) were mainly limited to depicting two-dimensional images, in static or "dynamic" form (i.e., as a flow of sequential images reproducing scenes which evolve over time), because of the difficulty of replicating the generation and perception of a sufficiently accurate and realistic three-dimensional image by a human being.
However, the possibility of acquiring images and reproducing them so as to maintain the three-dimensional perception typical of the human visual system is an issue which has always met with huge interest in a wide variety of application scopes.
An example is in particular the health field, in which situations very often occur which require, for diagnostic, surgical or study reasons, the analysis/display of samples (parts/systems of the human body, tissues, specimens) which are not observable with the naked eye.
Similarly, in the industrial sector, the use of a wide variety of machines in remote mode is known, whereby an operator controls at a safe distance a machine which is (or could be) operating in conditions that are hazardous or harmful for humans.
In the contexts exemplified above, the possibility of obtaining a fully three-dimensional display of the filmed scenes and/or objects would allow the real situation being observed to be better and more accurately understood, thus allowing a more accurate and more precise control of the operations being carried out.
Obviously, such examples are similarly applicable in other scopes in which it is generally useful or necessary to collect and display images for managing a process or understanding a situation.
In order to overcome the above-illustrated problem, that is, the lack of accuracy of the information acquired by traditional systems, it is known to implement acquisition means which try to recreate a three-dimensional representation of what is displayed.
Thereby, an attempt is made to provide more realistic and complete information, able to allow an observer to evaluate more precisely the spatial/dimensional characteristics of what he/she is observing.
However, such systems are notoriously complex and less performing, in particular because generating a three-dimensional (or stereoscopic) image, or a video formed by a sequence of such images, requires acquiring multiple distinct images from different acquisition points which are subsequently combined and processed.
Specifically, the images to be acquired must be identifiable as representative of a scene as it would be observed by the eyes of the human visual system, thus providing image acquisition points which are close but not overlapping and suitably directed.
Therefore, the acquisition of the different required images is performed in the prior art by installing several distinct acquisition devices, generating a remarkable increase in costs as well as introducing remarkable technical complications due to the need to synchronize and interface the single acquisition devices in order to ensure that the images being acquired and jointly processed actually correspond to the same instant of time.
In other words, it is necessary to ensure that the resulting stereoscopic image has actually been produced using the correct two-dimensional images, that is, images referring to the same instant and acquired by respective, correctly positioned and directed sensors.
Object of the invention
In this context, the technical task behind the present invention is to propose an acquisition optical device which overcomes at least some of the above-mentioned drawbacks of the prior art.
Specifically, it is an object of the present invention to provide an acquisition optical device able to acquire the information required for generating a stereoscopic image in a particularly effective and efficient manner.
The clarified technical task and the specified objects are substantially achieved by an acquisition optical device comprising the technical characteristics set forth in one or more of the attached claims.
In detail, the acquisition optical device comprises an optical unit and a single optical sensor.
The optical unit comprises in turn input points, intermediate reflecting elements, and an output reflecting element.
The input points comprise a first input point for a first optical signal and a second input point for a second optical signal.
The first input point and the second input point are placed side by side at a distance between 1 cm and 80 cm.
The output reflecting element has a first reflecting portion and a second reflecting portion which are mutually placed side by side, contacting and inclined.
The intermediate reflecting elements are interposed between the input points and the output reflecting element so as to define a first optical path and a second optical path configured to guide the first optical signal onto the first reflecting portion and the second optical signal onto the second reflecting portion, respectively.
The optical sensor faces the output reflecting element and is configured to simultaneously acquire the optical signals reflected on the first and the second reflecting portions.
Advantageously, the device described here allows the images required for generating a stereoscopic image to be acquired simultaneously by a single optical sensor. Thereby, the overall production and installation costs of the device are reduced, as a single optical sensor is implemented and the expensive interfacing and synchronizing operations between distinct sensors are not required.
Brief description of the drawings
Further characteristics and advantages of the present invention will become clearer from the following indicative, and therefore non-limiting, description of a preferred, but not exclusive, embodiment of an acquisition optical device, as illustrated in the attached drawings, wherein:
- figure 1 shows a possible embodiment of the device according to the present invention;
- figure 2 shows in detail one of the constitutive components of the device.
Detailed description of preferred embodiments of the invention
In the attached figures, reference number 1 generally refers to an acquisition optical device in accordance with the present invention, which will simply be referred to hereinafter as device 1.
Such device 1 is specifically configured to acquire optical signals, which may be images (of an area being monitored, a sample, a scene...) specifically usable for creating a resulting stereoscopic image.
In other words, the device 1 described here is configured to acquire the images required for generating a representation perceivable as three-dimensional by the human visual system. Structurally, the device 1 essentially comprises an optical unit 2 and a single optical sensor 3. Advantageously, the specific structure of the optical unit 2 allows all the information required for generating the resulting stereoscopic image to be acquired by a single sensor 3, thereby reducing the overall costs of the device 1 as well as avoiding the expensive need to interface and synchronize a plurality of distinct optical sensors, as is instead the case in the prior art.
In more detail, the optical unit 2 comprises in turn multiple input points for distinct optical signals, an output reflecting element 5 and a plurality of intermediate reflecting elements 6 interposed between such output element and the distinct input points.
For the purposes of the present application, the term "interposed" specifically means that the intermediate reflecting elements 6 are positioned between the input points and the output reflecting element 5, in particular with respect to the optical path followed by the optical signals inside the unit 2 itself.
Therefore, the optical unit has openings which define, or contribute to define, respective input points through which the optical signals enter the optical unit 2, then interact with the intermediate reflecting elements 6 and finally reach the output reflecting element 5, which is intended, as will be elucidated below, to provide them to the single optical sensor 3.
Generally, in order to create an image which may be perceived as three-dimensional by the human visual system, there is a need for at least a first image and a second image, which represent the same scene as viewed by the left eye and the right eye, respectively.
Therefore, the optical unit 2 specifically comprises a first input point 4a and a second input point 4b.
Through the first input point 4a, a first optical signal 10a can enter the optical unit 2 and, correspondingly, through the second input point 4b, a second optical signal, distinct and different from the first one, can enter the optical unit 2.
Advantageously, the first and the second input points 4a, 4b are placed at a mutual distance between 1 cm and 80 cm.
In other words, each input point is intended to reproduce the positioning of a respective eye for acquiring an optical signal (in particular an image), and therefore it can be observed that the device 1 has input points which allow an image to be acquired inside the optical unit 2 according to modes corresponding to those of the human visual system.
Operatively, using the reference numbers shown in the attached figures, it may thus be identified by analogy that the first input point 4a performs the function of the "right eye", while the second input point 4b performs the function of the "left eye" of the device 1. Consequently, the first optical signal 10a represents a scene, image or object as it would be seen/acquired by the right eye, while the second optical signal represents the same scene, image or object as it would be seen/acquired by the left eye.
The output reflecting element 5 has instead a first reflecting portion 5a and a second reflecting portion 5b.
Such portions are mutually inclined and placed side by side; that is, the output reflecting element 5 is a single continuous surface having a first reflecting portion 5a placed next to and adjoining a second reflecting portion 5b, such portions 5a, 5b defining and marking the sides of an angle.
Specifically, the first reflecting portion 5a is intended to receive and reflect the first optical signal 10a, while the second reflecting portion is configured to receive and reflect the second optical signal.
Structurally, the output reflecting element 5 may be made in one piece (for example by a prism) or alternatively comprise a first reflecting component and a second reflecting component (for example two planar mirrors) mutually abutting and inclined with respect to each other so as to have their own reflecting surface directed towards the respective optical path and facing the optical sensor 3.
In use, the optical signals are directed onto the output reflecting element 5, specifically onto its portions 5a, 5b, through the plurality of intermediate reflecting elements 6, which, as said, are interposed between the input points and such output reflecting element 5.
Specifically, the plurality of intermediate reflecting elements 6 is adapted and configured to define a first optical path and a second optical path.
Through the first optical path, the first optical signal 10a is transmitted from the first input point 4a to the first reflecting portion 5a, while through the second optical path the second optical signal is transmitted from the second input point 4b to the second reflecting portion 5b.
Thus, the reflecting portions 5a, 5b reflect the optical signals received from the respective input points 4a, 4b through the intermediate reflecting elements 6, allowing them to be acquired by the single optical sensor 3.
Indeed, such single optical sensor 3 has its visual field focused on the output reflecting element 5, such that one half thereof acquires the image (specifically the optical signal) reflected by the first reflecting portion 5a, while the other half acquires the image (specifically the optical signal) reflected by the second reflecting portion 5b.
Thereby, a single optical sensor 3 is able to acquire simultaneously, thus in the same instant of time, two distinct images representing the same scene as observed through the first input point 4a (the right eye of the device 1) and the second input point 4b (the left eye of the device 1).
Therefore, the optical sensor 3 receives the optical signals reflected by and on the reflecting portions 5a, 5b, thereby acquiring both images required for generating the resulting stereoscopic image in a single step and with a single component.
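By way of illustration only, the following minimal Python/NumPy sketch shows this half-and-half acquisition by splitting a single sensor frame into the two half-images reflected by the portions 5a and 5b. The frame size, the vertical split and the assignment of each half to the right-eye or left-eye view are assumptions made for the example and are not prescribed by the present description.

```python
import numpy as np

def split_stereo_frame(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split a single sensor frame into the two half-images reflected by
    portions 5a and 5b (illustrative left/right assignment)."""
    height, width = frame.shape[:2]
    half = width // 2
    right_eye_view = frame[:, :half]   # half of the field looking at portion 5a (assumed)
    left_eye_view = frame[:, half:]    # other half looking at portion 5b (assumed)
    return right_eye_view, left_eye_view

# Example: a dummy 1080x1920 grayscale frame standing in for one sensor acquisition
frame = np.zeros((1080, 1920), dtype=np.uint8)
right_view, left_view = split_stereo_frame(frame)
assert right_view.shape == left_view.shape == (1080, 960)
```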
Further, in an optimal configuration, wherein the number of components and thus the overall dimensions of the device 1 are minimized, the plurality of intermediate reflecting elements 6 comprises exactly a first and a second intermediate reflecting element. Specifically, the first intermediate reflecting element is interposed between the first input point 4a and the first reflecting portion 5a so as to reflect the first optical signal 10a thereon, and similarly the second intermediate reflecting element is interposed between the second input point 4b and the second reflecting portion 5b so as to reflect the second optical signal thereon.
Preferably, regardless of the number of elements forming the plurality of intermediate reflecting elements 6, the optical unit 2 has overall a symmetry plane interposed between the first input point 4a and the second input point 4b, in particular a vertical symmetry plane.
In other words, the optical unit 2 is symmetrical/specular with respect to its own centreline interposed between the input points 4a, 4b, thereby ensuring an optimal use of the spaces and a greater display coherence of the acquired images, as both the first optical signal 10a and the second optical signal 10b pass through respective structurally and functionally identical optical paths.
Further, generally, each intermediate reflecting element 6 is installed in the optical unit 2 so as to form an angle between 35° and 65° with a lying plane of the input points 4a, 4b.
In other words, both input points lie in the same plane and the intermediate reflecting elements 6 are inclined with respect to such plane so as to define an angle between the above-mentioned values.
Similarly, the first and the second reflecting portions 5a, 5b are mutually inclined so as to define the sides of an angle between 50° and 110° having its vertex facing the optical sensor.
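To make the folding geometry concrete, the following sketch applies the standard mirror-reflection relation d' = d − 2(d·n)n to a ray entering through an input point. The coordinate frame, the 45° inclination (chosen within the 35°–65° range mentioned above) and the downward incoming direction are assumptions used only for illustration.

```python
import numpy as np

def reflect(direction: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Reflect a ray direction across a planar mirror with the given normal."""
    normal = normal / np.linalg.norm(normal)
    return direction - 2.0 * np.dot(direction, normal) * normal

# Intermediate mirror inclined at 45 deg to the (assumed horizontal) lying plane
# of the input points: its normal lies in the x-z plane, 45 deg from vertical.
theta = np.radians(45.0)
mirror_normal = np.array([np.sin(theta), 0.0, np.cos(theta)])

# A signal entering through an input point and travelling straight down...
incoming = np.array([0.0, 0.0, -1.0])
folded = reflect(incoming, mirror_normal)
print(folded)   # ~[1, 0, 0]: the 45 deg mirror folds the ray horizontally
```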
Advantageously, the intermediate reflecting elements 6 are movable in rotation and/or translation with respect to the output reflecting element 5, thereby making it possible to modify and control the specific propagation direction of the respective optical signals 10a, 10b inside the unit 2 and therefore to control how the latter are reflected by the respective reflecting portions 5a, 5b to be acquired by the optical sensor 3.
Furthermore, the first and the second reflecting portions 5a, 5b are also rotatable about a centreline of the output reflecting element 5, so as to modify the angle of inclination therebetween, that is, so as to vary the width of the angle enclosed therebetween.
The various angles formed by the reflecting elements of the optical unit 2, together with their mutual distances, contribute to defining the parallax and disparity content of the pair of acquired stereo images and the distance at which the zero-disparity condition occurs in the scene observed through the device 1.
Specifically, the distance between the intermediate reflecting elements 6 affects the extent of:
- the low disparity of the stereoscopic scene, where disparity means the difference in horizontal position between two corresponding points in the object images which are projected on the optical sensor 3, and
- the parallax in the same scene, where parallax means the displacement of a reference in an image due to a change of perspective or even to a change of the observation point (for example due to a change of position of one of the two channels of the device 1 with respect to the other), as sketched numerically below.
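The following minimal sketch illustrates these quantities numerically using standard converged-stereo approximations (axis-crossing distance and small-angle disparity), which are not taken from the present description; the toe-in angle, the focal length in pixels and the example depths are assumed values.

```python
import numpy as np

def zero_disparity_distance(baseline_m: float, toe_in_deg: float) -> float:
    """Distance at which the two optical axes cross (zero-disparity plane)."""
    return (baseline_m / 2.0) / np.tan(np.radians(toe_in_deg))

def disparity_px(depth_m: float, baseline_m: float, focal_px: float,
                 zero_disparity_m: float) -> float:
    """Small-angle approximation of the horizontal disparity on the sensor."""
    return focal_px * baseline_m * (1.0 / depth_m - 1.0 / zero_disparity_m)

b = 0.06          # 6 cm baseline, within the 1-80 cm range of the device (assumed value)
z0 = zero_disparity_distance(b, toe_in_deg=2.0)
print(round(z0, 2))                                  # ~0.86 m convergence distance
print(round(disparity_px(0.5, b, 1500.0, z0), 1))    # closer point: positive disparity
print(round(disparity_px(2.0, b, 1500.0, z0), 1))    # farther point: negative disparity
```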
In other words, the possibility of moving (translating and rotating) the intermediate reflecting elements 6 and of modifying the inclination between the two portions 5a, 5b of the output reflecting element 5 allows an optimal acquisition of the optical signals 10a, 10b (thus of the images) to be ensured, in order to adapt to the distance between the device 1 and the object observed thereby, or to manipulate disparity and parallax in the acquired images.
In this way, the device 1 is able to acquire optimal images and thus to provide the best possible information for generating the resulting stereoscopic image, even when it is useful or necessary to change the focus point of the optical sensor 3 and the zero-disparity distance in the stereoscopic scene.
Advantageously, the device 1 may comprise automatic moving means (one or more actuators such as, for example, servomotors) operatively connected to the various reflecting elements 5, 6 and configured to control their positioning and movement.
Such automated control allows the reflection angles of the various reflecting elements 5, 6 to be controlled in a particularly accurate and precise manner, ensuring an optimized acquisition of the optical signals 10a, 10b in question.
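A possible software-side sketch of such automatic moving means is given below; it only models the clamping of angle commands to the ranges mentioned in the present description (35°–65° for the intermediate reflecting elements 6, 50°–110° for the included angle of the portions 5a, 5b). The MirrorActuator class and its names are hypothetical, and the actual servomotor drive call is hardware-specific and therefore omitted.

```python
from dataclasses import dataclass

@dataclass
class MirrorActuator:
    """Hypothetical wrapper around one servomotor driving a reflecting element."""
    name: str
    min_deg: float
    max_deg: float
    angle_deg: float = 45.0

    def move_to(self, target_deg: float) -> float:
        # Clamp the command to the mechanically allowed range; the actual
        # servo drive call (not shown) would follow.
        self.angle_deg = max(self.min_deg, min(self.max_deg, target_deg))
        return self.angle_deg

# Ranges taken from the description: 35-65 deg for the intermediate mirrors,
# 50-110 deg for the included angle of the output reflecting portions.
intermediate_left = MirrorActuator("intermediate element 6 (left channel)", 35.0, 65.0)
output_element = MirrorActuator("included angle of portions 5a/5b", 50.0, 110.0, 90.0)

intermediate_left.move_to(50.0)     # within range: accepted as-is
output_element.move_to(120.0)       # out of range: clamped to 110 deg
print(intermediate_left.angle_deg, output_element.angle_deg)
```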
In accordance with a possible aspect of the present device 1, the intermediate reflecting elements 6 and/or the output reflecting element 5 (thus the reflecting portions 5a, 5b) are made of/defined by respective mirrors or by suitable prism surfaces.
For example, the optical unit 2 may comprise a first prism with a parallelepiped-shaped section, wherein a first inclined surface defines the first intermediate reflecting element and the inclined surface parallel thereto defines the first reflecting portion 5a, so as to make the first optical path overall; similarly, in this context the optical unit 2 may comprise a second prism, structurally identical and symmetrical to the first one, which defines the second optical path in a specular manner.
Generally, that is, regardless of how they are made, the intermediate 6 and output reflecting elements are/define surfaces of first reflection.
That is, the optical signals 10a, 10b always interact directly with the respective reflecting elements 5, 6 without any additional element, such as glass, coatings or other, interposed therebetween.
Thereby, particularly accurate images are obtained, as undesired distortive and artificial effects in the optical signals 10a, 10b as acquired by the optical sensor 3 are suppressed/avoided.
Advantageously, such optical sensor 3 may be or comprise a sensor able to acquire optical signals 10a, 10b inside the visible spectrum and/or outside it.
Specifically, the optical sensor 3 may comprise a hyperspectral camera, and/or a thermal camera, and/or an infrared camera.
Additionally, the device 1 may also comprise a beam splitter 7 positioned at the first and the second input points 4a, 4b.
In other words, the beam splitter 7 is positioned upstream of the input points 4a, 4b with respect to the propagation direction of the optical signals 10a, 10b along the optical unit 2.
Operatively, the beam splitter 7 is configured to separate both the first and the second optical signals into a respective main component 11a, 11b and a respective auxiliary component 12a, 12b.
The main component is directed/conveyed inside the optical unit 2 (through the respective input points 4a, 4b), while the auxiliary component 12a, 12b is instead directed/conveyed along an auxiliary optical path.
Preferably, the beam splitter 7 is a non-polarizing beam splitter, as this makes it possible to avoid it introducing artifacts and, especially, creating different images along the main optical path (the one leading to the optical unit 2) and the auxiliary one. Indeed, specular reflections are often polarized, so with a polarizing beam splitter they would appear more in one path than in the other.
Further, polarizing beam splitters are configured to operate within a limited wavelength range (typically 420-680 nm) and therefore represent a non-optimal selection.
Thus, the beam splitter 7 allows the optical signals 10a, 10b to be divided into a main component 11a, 11b, which is acquired by the single optical sensor 3, while the auxiliary component 12a, 12b may be addressed, for example, towards eyepieces in order to allow a user to directly view them or to further process them.
Alternatively, a further optical unit 2 (identical in structure and functions to the above-described one) and a further corresponding optical sensor 3, which however may be different (that is, have different acquisition characteristics) from the optical sensor 3 receiving the main component 11a, 11b, may be positioned along the auxiliary optical path.
For example, in this context, the two optical sensors 3 could be configured to acquire optical signals 10a, 10b at different wavelengths.
Thereby, there would still be a device 1 able to acquire the information required for generating stereoscopic images at different wavelengths, using a single optical sensor 3 for each wavelength in question and/or a single optical sensor 3 for each operation or acquisition condition.
Preferably, the beam splitter 7 is configured to convey along the auxiliary optical path an auxiliary component 12a, 12b having an optical power between 25% and 75% of the optical power of the first and the second optical signals 10a, 10b.
Such a division is optimal in order to ensure a correct acquisition of the optical signals 10a, 10b by the optical sensor 3 and, at the same time, to convey along the auxiliary optical path an optical power sufficient to be correctly viewed even by a user, if eyepieces for direct vision by the above-mentioned user are positioned along it, as suggested above.
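Purely as a numeric aid, the following sketch computes this power budget; the 10 mW input power and the 50/50 ratio are arbitrary example values, while the 25%–75% bound is taken from the preceding paragraph.

```python
def split_power(input_power_mw: float, auxiliary_fraction: float) -> tuple[float, float]:
    """Divide an optical signal's power between the main and auxiliary paths."""
    if not 0.25 <= auxiliary_fraction <= 0.75:
        raise ValueError("auxiliary fraction outside the 25%-75% range given above")
    auxiliary = input_power_mw * auxiliary_fraction
    main = input_power_mw - auxiliary
    return main, auxiliary

# Example: a 50/50 non-polarizing split of a 10 mW signal (assumed values)
main_mw, aux_mw = split_power(10.0, 0.50)
print(main_mw, aux_mw)   # 5.0 mW to the sensor path, 5.0 mW to the eyepiece path
```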
Further, the device 1 may comprise, along the auxiliary optical path, along the main optical path or upstream of the beam splitter 7, one or more optical components such as lenses or foils configured to interact with the optical signals 10a, 10b in order to modify their characteristics.
Specifically, the device 1 may comprise one or more lenses configured to match, focus, or similarly modify the optical signals 10a, 10b as they are provided to the optical sensor 3, so as to optimize the acquisition quality thereof.
In order to optimize the overall strength and the ease of transport/installation, the device 1 also comprises a containment body defining a chamber adapted to house the optical unit 2 and, when present, the beam splitter 7.
Such containment body may further also contain the optical sensor 3 or be coupled thereto (preferably in a reversible manner), for example through a ring coupling. That is, the optical unit 2 (together with the beam splitter 7, when present) may be made and provided as an independent and individual element applicable at a later time to the single optical sensor 3.
Implementing an optical unit 2 provided as already coupled (possibly in a reversible manner) to the optical sensor 3 allows a correct and optimal coupling between the two to be obtained, thereby ensuring the best possible quality for acquiring the optical signals 10a, 10b.
Similarly, the containment body may further comprise coupling means through which it is possible to reversibly constrain the device 1 to a further device such as a microscope (an aspect which will be elucidated below), so as to allow its easy installation and possible replacement in case of failure or due to maintenance being performed.
In order to generate the desired resulting stereoscopic image, it is necessary to suitably process the optical signals 10a, 10b acquired by the optical sensor and representative of the images as "seen" through the two input points 4a, 4b of the device 1.
To this end, the device may comprise or be connected to a processing unit suitably configured to generate a resulting stereoscopic image as a function of the first and the second optical signals 10a, 10b.
If the processing unit is comprised in the device 1, it may be enclosed inside the containment body.
Further, the device 1 may comprise a transmission module coupled to the optical sensor and configured to transmit the first and the second optical signals, preferably by a wireless communication protocol.
Such a solution is particularly efficient if the device 1 is connected to a remote processing unit, since the optical signals 10a, 10b to be used by it for generating the resulting stereoscopic image may be transmitted thereto through the transmission module.
Instead, if the control unit is comprised in the device 1 (or locally connected, for example through a wired connection), the transmission module may be further configured to directly transmit the resulting stereoscopic image so as to allow it to be stored (for example by transmitting it to a database such as a computing cloud) or displayed (by transmitting it, for example, to a screen or similar displaying device).
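The transmission protocol is likewise left open; as a minimal, hedged sketch, the following snippet pushes one encoded frame over a TCP connection (which may run over a wireless link) using Python's standard socket module. The host, the port and the 4-byte length-prefix framing are illustrative choices only.

```python
import socket

def send_frame(frame_bytes: bytes, host: str = "192.168.1.50", port: int = 9000) -> None:
    """Send one encoded frame to a (hypothetical) remote processing unit.
    A 4-byte big-endian length prefix lets the receiver delimit frames."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall(len(frame_bytes).to_bytes(4, "big") + frame_bytes)

# Example: transmit a placeholder payload standing in for an encoded stereo frame
# send_frame(b"\x00" * 1024)   # uncomment with a real receiver listening
```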
Advantageously, the present invention achieves the proposed ob j ect s overcoming the drawbacks lamented in the prior art by providing to the user an acquisition optical device which allows to implement a s ingle optical sensor for acquiring all the information required for generating a resulting stereoscopic image . Such device 1 is thus more compact , inexpensive , and efficient than known-type devices .
A stereoscopic microscope is another ob j ect of the present invention .
The stereoscopic microscope comprises in particular an optical device 1 comprising the characteristics set forth above and specifically equipped with a beam splitter 7 .
The microscope further comprises a pair of eyepieces , a sample holder table and at least a lens .
The pair of eyepieces is suitably positioned along the auxiliary optical path so as to allow a user to directly display the auxiliary component 12a, 12b of the optical signals 10a, 10b specifically with the first optical signal 10a sent to a first eyepiece and the second optical signal sent to a second eyepiece .
The main component 11a, l ib is instead sent as usual to the optical unit 2 and acquired through it by the single optical sensor 3 in order to allow the storing and the proces sing thereof in order to generate the resulting stereoscopic image .
Such resulting stereoscopic image may be in turn stored and/or directly displayed on a screen so as to allow multiple users to simultaneously see the sample being studied by the microscope .
Such sample may be in particular housed on the sample holder table and be magnified or focused by suitably selecting a lens ( or similar optical element ) which is selectively interposable between the sample holder table and the acquisition optical device 1 .
In more detail , the device 1 may be at least partially enclosed in the above-mentioned containment body, which has coupling means , such as ring coupling means , by which it may be constrained, for example screwed, to the structure of the microscope , by interposing it between the eyepieces and the lens .
In the context of the present invention, a further application context wherein the device 1 may be advantageously employed may be explicitly identified . Specific reference to the technical field of earthmoving equipment , such as for example a bulldozer, is made .
Thus, the present invention may also specifically relate to a stereoscopic camera for earth-moving equipment.
Such a camera comprises in particular the device 1 having the technical characteristics set forth above, and specifically an optical unit 2 and a single optical sensor 3.
Thus, in this context, it is possible to drive such a bulldozer remotely by acquiring images of its work environment and recreating a three-dimensional representation thereof.
Such a representation makes the equipment control easier and more efficient while requiring only a single camera and, more specifically, a single optical sensor 3.
Thereby, costs may be remarkably reduced, as the number of cameras/optical sensors 3 to be purchased and installed on the equipment is halved, and the image acquisition system is simplified, since synchronizing and interfacing different cameras with each other is no longer required.
Indeed, all the images required for generating the resulting stereoscopic image are simultaneously acquired by a single sensor.
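The description above does not prescribe how the three-dimensional representation of the work environment is computed; as one hedged illustration, a disparity map could be derived from the two half-images of the single sensor frame with a standard block-matching algorithm, assuming the two views have been rectified beforehand. OpenCV's StereoBM is used here only as an example of such an algorithm, not as the method of the invention.

```python
import cv2
import numpy as np

# Hypothetical left/right grayscale views, e.g. obtained by splitting the
# single sensor frame in two and converting each half to 8-bit grayscale;
# real views would also need to be rectified before matching.
left_gray = np.zeros((480, 640), dtype=np.uint8)
right_gray = np.zeros((480, 640), dtype=np.uint8)

# Classic block matching: numDisparities must be a multiple of 16 and
# blockSize must be odd.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

# With a calibrated setup, depth is proportional to
# focal_length * baseline / disparity, where the baseline corresponds to
# the distance between the two input points 4a, 4b.
```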

Claims

1. Acquisition optical device comprising:
- an optical unit (2) comprising:
- a first input point (4a) for a first optical signal (10a) and a second input point (4b) for a second optical signal, said second input point (4b) being placed side by side to the first input point (4a) at a distance between 1 and 80 centimetres;
- an output reflecting element (5) having a first reflecting portion (5a) and a second reflecting portion (5b) mutually inclined and placed side by side;
- a plurality of intermediate reflecting elements (6) interposed between the input points and the output reflecting element (5), said plurality of intermediate reflecting elements (6) being adapted to define a first optical path and a second optical path configured to guide respectively the first optical signal (10a) on the first reflecting portion (5a) and the second optical signal on the second reflecting portion (5b);
- a single optical sensor (3) facing the output reflecting element (5) and configured to simultaneously acquire the optical signals (10a, 10b) reflected on the first and the second reflecting portions (5a, 5b).
2. Device according to claim 1, wherein the optical unit (2) has a vertical symmetry plane interposed between the first input point (4a) and the second input point (4b).
3. Device according to claim 1 or 2, wherein the plurality of intermediate reflecting elements (6) only comprises:
- a first intermediate reflecting element (6) interposed between the first input point (4a) and the first reflecting portion (5a);
- a second intermediate reflecting element (6) interposed between the second input point (4b) and the second reflecting portion (5b).
4. Device according to any of the preceding claims, wherein each intermediate reflecting element (6) forms an angle between 35° and 55° with a lying plane of the input points (4a, 4b) and the first and the second reflecting portions (5a, 5b) are mutually inclined to form an angle between 50° and 110° with the vertex facing the optical sensor (3).
5. Device according to any of the preceding claims, wherein the intermediate reflecting elements (6) are movable in rotation and/or translation with respect to the output reflecting element (5).
6. Device according to any of the preceding claims, wherein the first and the second reflecting portions (5a, 5b) are rotatable about a centreline of the output reflecting element (5) so as to modify an angle of inclination between said first and second reflecting portions (5a, 5b).
7. Device according to any of the preceding claims, wherein the intermediate reflecting elements (6) and the output reflecting element (5) define surfaces of first reflection.
8. Device according to any of the preceding claims, comprising a plurality of mirrors adapted to define a respective reflecting element (5, 6), or a plurality of prisms having surfaces adapted to define a respective reflecting element (5, 6).
9. Device according to any of the preceding claims, wherein the optical sensor (3) comprises a hyperspectral camera and/or a thermal sensor and/or an infrared sensor.
10. Device according to any of the preceding claims, comprising a transmission module coupled to the optical sensor (3) and configured to transmit the first and the second optical signals, preferably by a wireless communication protocol.
11. Device according to any of the preceding claims, comprising or connected to a processing unit configured to generate a resulting stereoscopic image as a function of the first and the second optical signals (10a, 10b).
12. Device according to any of the preceding claims, comprising a beam splitter (7) positioned at the first and the second input points (4a, 4b) and configured to separate the first and the second optical signals into a main component (11a, 11b) conveyed in the optical unit (2) and an auxiliary component (12a, 12b) conveyed along an auxiliary optical path.
13. Device according to claim 12, wherein the beam splitter (7) is a non-polarized beam splitter (7).
14. Device according to claim 12 or 13, wherein the beam splitter (7) is configured to convey along the auxiliary optical path an auxiliary component (12a, 12b) having an optical power between 50% and 75% of the optical power of the first and the second optical signals (10a, 10b).
15. Device according to any of the preceding claims 12 to 14, comprising a containment body defining a chamber adapted to house the optical unit (2) and the beam splitter (7).
16. Device according to claim 15, wherein the containment body is reversibly coupled to the optical sensor (3), preferably by a ring coupling.
17. Stereoscopic microscope comprising:
- an acquisition optical device according to any of the preceding claims 12 to 16;
- a pair of eyepieces positioned along the auxiliary optical path;
- a sample holder table;
- at least one lens selectively interposable between the sample holder table and the acquisition optical device.
PCT/IB2022/061842 2022-12-06 2022-12-06 Acquisition optical device Ceased WO2024121597A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2022/061842 WO2024121597A1 (en) 2022-12-06 2022-12-06 Acquisition optical device

Publications (1)

Publication Number Publication Date
WO2024121597A1 true WO2024121597A1 (en) 2024-06-13

Family

ID=85037186

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2022/061842 Ceased WO2024121597A1 (en) 2022-12-06 2022-12-06 Acquisition optical device

Country Status (1)

Country Link
WO (1) WO2024121597A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5532777A (en) * 1995-06-06 1996-07-02 Zanen; Pieter O. Single lens apparatus for three-dimensional imaging having focus-related convergence compensation
US5825532A (en) * 1993-10-04 1998-10-20 Nhk Engineering Services, Inc. Microscopic system integrated with wide-screen television
WO2009051013A1 (en) * 2007-10-19 2009-04-23 Mitaka Kohki Co., Ltd. Head-mounted binocular loupe device
US20130222897A1 (en) * 2010-10-29 2013-08-29 Mitaka Kohki Co., Ltd. Operation microscope
JP2016020977A (en) * 2014-07-15 2016-02-04 キヤノン株式会社 Stereo image pickup device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22847270

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22847270

Country of ref document: EP

Kind code of ref document: A1