WO2007119951A1 - Sighting device using virtual camera - Google Patents
- Publication number
- WO2007119951A1 (application PCT/KR2007/001709)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- camera
- projectile
- parallel
- trajectory
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G1/00—Sighting devices
- F41G1/38—Telescopic sights specially adapted for smallarms or ordnance; Supports or mountings therefor
Landscapes
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Image Processing (AREA)
Abstract
A sighting device for a projectile such as a gun's bullet, a laser beam, or a strong water jet. The viewing line of the virtual-camera sighting device coincides with the trajectory of the projectile, so it is not necessary to adjust the viewing line, where the virtual camera is a conceptual camera whose output image is synthesized from the images of real physical cameras by an image processing portion.
Description
SIGHTING DEVICE USING VIRTUAL CAMERA
Technical Field
[1] The present invention relates to a sighting device, such as a telescope mounted on a gun, and more particularly to a sighting device using a virtual camera whose viewing line coincides with the trajectory of the bullet.
[2]
Background Art
[3] In general, it is necessary to observe the target with a sighting device in order to hit a distant target. However, the line of view of a conventional sighting device, such as a telescope or a physical camera mounted on a gun, does not coincide exactly with the trajectory of the bullet, so the viewing line of the sighting device must be corrected. If the distance between the gun and the target changes, the line of view must be carefully readjusted. Such adjustment skill requires expensive training of the shooter.
Disclosure of Invention
Technical Problem
[4] To solve this problem, it is an object of the present invention to provide a virtual-camera sighting device that requires no adjustment of its line of view.
Technical Solution
[5] To achieve this object, there is provided a sighting device using a virtual camera, comprising: a projectile portion for firing a projectile such as a bullet, a laser, or a strong water jet; a camera portion for capturing an image of the target; and an image processing portion for synthesizing the image of a virtual camera whose line of view coincides with the trajectory of the projectile, where the virtual camera is a conceptual camera as used in image-based rendering technology.
[6]
Advantageous Effects
[7] By using the sighting device of the present invention, it is not necessary to adjust the line of view, and even a very distant target can be hit easily. For example, a laser gun on a satellite could hit a small target on the earth without any manual adjustment of the line of view.
Brief Description of the Drawings
[8] Figure 1 is the gun with telescope and target.
[9] Figure 2 is the gun with telescope whose line of view is adjusted.
[10] Figure 3 is the gun with the telescope and periscope whose line of view coincides with the trajectory of the bullet of gun.
[11] Figure 4 is the gun with virtual camera by two physical cameras.
[12] Figure 5 shows synthesized images of a human face rendered from different viewpoints.
[13] Figure 6 is the image captured by the 1st physical camera (C1) in Figure 4.
[14] Figure 7 is the image captured by 2nd physical camera (C2) in Figure 4.
[15] Figure 8 is the synthesized image of the virtual camera (CV).
[16] Figure 9 is the relation among the stereo images and its difference image.
[17] Figure 10 is the graph of function of overlap and the parallel shift of stereo images.
[18] Figure 11 is the relation among the parallel shifted input stereo images and the synthesized image.
[19] Figure 12 is the flow chart for synthesizing the image of virtual camera.
[20] <Description of symbols in drawings>
[21] G : gun L : line of view
[22] S : trajectory of bullet A : sighting telescope
[23] CV : virtual camera M1,M2 : mirrors
[24] E : display device Pu : center of 1st image
[25] Pd : center of 2nd image Prj : projected image
[26] p6 : 1st image p7 : 2nd image
[27] h : distance of parallel translation p6(h) : parallel translated 1st image
[28] p7(-h) : parallel translated 2nd image Syn : synthesized image
[29] SU : upper half portion of synthesized image
[30] SD : lower half portion of synthesized image
[31]
Best Mode for Carrying Out the Invention
[32] Fig.1 shows a conventional sighting device mounted on a gun. The viewing line (L) of the telescope sight (A) and the trajectory of the bullet are parallel, but there is an interval (D) between the two parallel lines, so the bullet hits a point (O) beneath the center of the target (P) even when the shooter aims at the center of the target (P) with the sighting device. The problem arises because the viewing line (L) of the telescope sight (A) and the trajectory (S) of the bullet do not coincide. To solve it, the viewing line of the telescope sight must be changed so that the viewing line and the trajectory of the bullet meet at the target, as shown in Fig.2. Such an adjustment depends on the distance between the gun and the target. On a real battlefield there is rarely time to adjust the viewing line of a sighting device, so good shooting skill is obtained only after time-consuming, expensive training. The object of the present invention is to provide a virtual-camera sighting device whose viewing line coincides with the trajectory of the bullet, so that no adjustment of the viewing line is required. Fig.3 shows an example of a virtual camera. In Fig.3, a camera (C) with an image sensor and a display device (E) for showing the image captured by the camera (C) are mounted on the gun like the telescope (A) in Fig.1. Two mirrors (M1,M2) arranged like a periscope are placed in front of the camera (C) and the gunpoint, so that the shooter can watch the target (O) on the display (E) while the shooter's viewing line (L) coincides with the trajectory (S) of the bullet. In this case, the image captured by the real physical camera (C) through the mirrors (M1,M2) is the same as the image that would be captured by an imaginary camera (CV) inside the gun. That imaginary camera (CV) is the virtual camera. In Fig.3, the virtual camera (CV) is drawn with a dotted line and the physical camera with a solid line. The mirror (M2) must be removed from the trajectory of the bullet before firing and must return to its original position after firing for the next sighting. The present invention has no such moving parts. Instead, it contains a virtual camera whose output image is synthesized from the images of two or more physical cameras around the barrel of the gun. The image synthesis is performed by an image processing program in a digital signal processor or microprocessor.
[33]
[34] Fig.4 shows the virtual-camera sighting device with physical cameras attached above and below the barrel of the gun, like a binocular telescope whose viewing line is parallel to the barrel. The physical cameras may instead be placed to the right and left of the barrel, and the number of physical cameras may be greater than two. Such configurations are all equivalent; the embodiment of Fig.4 is the simplest example among them. For simple image processing it is recommended to use identical cameras, to attach them symmetrically above and below the gun, and to align the lenses of the upper and lower cameras in the same vertical plane. The two images of the two cameras (C1,C2) are used as input for the image processing portion, which synthesizes the output image. The image processing portion contains an image grabber, which captures dynamic images from two or more cameras simultaneously, and an image processing program running in a digital signal processor or microprocessor. The synthesized output image is the one that would be captured by the virtual camera (CV) whose viewing line coincides with the trajectory of the bullet. Such image synthesis has been studied in the field of image-based rendering and is described in the book 'Image-Based Rendering' (ISBN 0-387-21113-6). Image-based rendering was used in the SF movie The Matrix to rotate the viewing direction of a virtual camera around the main character, Neo. Such synthesis of the virtual camera's image is a kind of interpolation between the images of the physical cameras. A detailed description of the technology can be found at http://www.cse.ucsc.edu/~tao/ps/paper086.pdf, and Fig.5 shows an example of synthesized images. Algorithms for synthesizing the image of a virtual camera continue to be developed, and any of them can be used for the image processing portion of the present invention. Fig.6 is the image captured by the 1st physical camera (C1) of Fig.4, and Fig.7 is the image captured by the 2nd physical camera (C2) of Fig.4. Fig.8 is the output image synthesized by the image processing portion from the input images of Fig.6 and Fig.7; in other words, the image of Fig.8 is the image captured by the virtual camera (CV) in Fig.4. The shooter watches the synthesized output image on the display (E) to aim at the target. With the sighting device of the present invention, the shooter does not have to manually adjust the sight as in Fig.2, where the telescope (A) must be adjusted so that its viewing line (L) and the trajectory (S) of the bullet meet at the target (O). The display (E) for the output image can be a small LCD monitor mounted on the gun or a wearable display such as an HMD (head mounted display) or NED (near eye display). [35] The following is a detailed description of the steps for synthesizing the output image of Fig.8 from the 1st input image of Fig.6 and the 2nd input image of Fig.7; the steps are represented as a flow chart in Fig.12. Assume that the 1st image (p6) of Fig.6 and the 2nd image (p7) of Fig.7 are images on the films of an analog camera, and align the two films as shown in Fig.9, where the films are parallel to the xy plane and the z axis passes through the centers of the films (Pd,Pu) in a right-handed coordinate system. Translate the images (p6,p7) in the y direction (by h and -h) and observe the projected image (Prj) formed by projecting light parallel to the z axis. Find the best parallel translation distance (hmin) at which the projected image (Prj) is clearest, i.e. best overlapped, where good overlap means that most pixels of the difference image of the two images (p6,p7) have very small values. It is recommended to make the translation distances of the two images equal and opposite (h,-h) because the two cameras (C1,C2) are positioned symmetrically about the trajectory (S) of the bullet. If the two cameras (C1,C2) are positioned to the left and right of the trajectory (S), then the parallel translation must be in the x direction. It is also possible to use four cameras, up, down, left and right, in the shape of a '+', for a better synthesized output image. The difference image of two input images is a well-known concept in image processing, defined as the image whose pixel value is the absolute value of the difference of the corresponding pixels of the two input images. For example, the pixel value DiffImg[x][y] at (x,y) of the difference image is the absolute value of the difference of FirstShiftImg[x][y] and SecondShiftImg[x][y], where FirstShiftImg[x][y] is the pixel value at (x,y) of the parallel translated 1st image and SecondShiftImg[x][y] is the pixel value at (x,y) of the parallel translated 2nd image. In other words:
[36]
[37] DiffImg[x][y] = | FirstShiftImg[x][y] - SecondShiftImg[x][y] |
[38]
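As a minimal sketch of this definition (the patent itself specifies no implementation; NumPy and the name difference_image are illustrative assumptions), the difference image of two equally sized grayscale images could be computed as follows:

```python
import numpy as np

def difference_image(first_shift_img: np.ndarray, second_shift_img: np.ndarray) -> np.ndarray:
    """DiffImg[x][y] = | FirstShiftImg[x][y] - SecondShiftImg[x][y] |."""
    # Promote to a signed type so the subtraction cannot wrap around for uint8 input.
    a = first_shift_img.astype(np.int32)
    b = second_shift_img.astype(np.int32)
    return np.abs(a - b)
```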
[39] The degree of overlap is a function f(h) of the parallel translation distance h, defined as the sum of all pixel values of the difference image divided by the total number of pixels of the difference image, where the difference image is defined only on the intersection region of the two input images. Such a degree-of-overlap function f(h) is shown in Fig.10. The best overlapping state corresponds to the point (hmin, Fmin) in Fig.10 at which the degree of overlap is minimum. The image processing portion finds this best overlapping point (hmin, Fmin) by calculating the degree of overlap for every possible parallel translation distance h and searching for the minimum of the resulting graph. The output image is then synthesized by translating the input images (p6,p7) by the best distance (hmin) in the y direction and compositing them as shown in Fig.11. Compositing the parallel translated input images means that the pixels of the lower half region (SD) of the synthesized output image (Syn) are copied from the 2nd translated image (p7(-h)) and the pixels of the upper half region (SU) are copied from the 1st translated image (p6(h)), where p7(-h) is obtained by translating the 2nd image (p7) in the y direction by -h and p6(h) is obtained by translating the 1st image (p6) in the y direction by h. The 1st (p6) and 2nd (p7) images before translation are drawn with dotted lines in Fig.11, and the translated images (p6(h),p7(-h)) with solid lines. The upper half region of the synthesized output image (Syn) in Fig.11, drawn in gray tone, is copied from the translated 1st image (p6(h)); the upper and lower half regions are divided by the horizontal line passing through the center of the synthesized output image (Syn). It is recommended to blend pixels from the two input images near this horizontal line to make the output image look more natural. The data flows are summarized as follows:
[40] 1st camera -> 1st image -> 1st image parallel translated by distance hmin in the y direction
[41] 2nd camera -> 2nd image -> 2nd image parallel translated by distance -hmin in the y direction
[42] -> synthesized output image
[43] The flow chart of these steps is drawn in Fig.12.
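A minimal Python sketch of the same pipeline follows, assuming equally sized grayscale NumPy arrays, symmetric shifts (h,-h) for vertically stacked cameras, and an exhaustive search over candidate distances; the names shift_y, degree_of_overlap, find_hmin, composite, h_max, and blend are illustrative assumptions, not part of the patent:

```python
import numpy as np

def shift_y(img: np.ndarray, h: int) -> np.ndarray:
    """Parallel translate an image by h pixels along y (rows), zero-filling the rest."""
    out = np.zeros_like(img)
    if h > 0:
        out[h:] = img[:-h]
    elif h < 0:
        out[:h] = img[-h:]
    else:
        out[:] = img
    return out

def degree_of_overlap(p6: np.ndarray, p7: np.ndarray, h: int) -> float:
    """f(h): mean pixel value of the difference image of the two shifted inputs,
    evaluated only on the intersection region where both shifted images are defined."""
    a = shift_y(p6.astype(np.int32), h)   # 1st image translated by +h
    b = shift_y(p7.astype(np.int32), -h)  # 2nd image translated by -h
    top, bottom = abs(h), p6.shape[0] - abs(h)
    if bottom <= top:                     # no intersection for this h
        return float("inf")
    return float(np.abs(a[top:bottom] - b[top:bottom]).mean())

def find_hmin(p6: np.ndarray, p7: np.ndarray, h_max: int) -> int:
    """Exhaustive search for hmin, the translation distance minimizing f(h)."""
    return min(range(h_max + 1), key=lambda h: degree_of_overlap(p6, p7, h))

def composite(p6: np.ndarray, p7: np.ndarray, h: int, blend: int = 8) -> np.ndarray:
    """Upper half (SU) from the shifted 1st image, lower half (SD) from the shifted
    2nd image, with a small linear blend around the central horizontal seam."""
    a, b = shift_y(p6, h), shift_y(p7, -h)
    out = b.copy()
    mid = out.shape[0] // 2
    out[:mid] = a[:mid]                   # rows above the center come from p6(h)
    for row in range(mid - blend, mid + blend):
        w = (row - (mid - blend)) / (2 * blend)
        out[row] = ((1 - w) * a[row].astype(np.float64)
                    + w * b[row].astype(np.float64)).astype(out.dtype)
    return out
```

For example, `hmin = find_hmin(p6, p7, h_max=64)` followed by `syn = composite(p6, p7, hmin)` reproduces the data flow above; h_max should stay well below half the image height so the intersection region remains non-empty.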
[44] In a real situation the target may have a convex or concave surface, and the synthesized output image may then contain some distortion; a more sophisticated image-based rendering algorithm can be used to remove such distortion. The algorithm of Fig.12 is just one example of image-based rendering for the present invention, and any other algorithm may be used in the image processing portion. If the target is far enough away, the distortion can be ignored, because the object of the present invention is not an exact representation of the shape of the target but simply sighting.
[45] The sighting device of the present invention can be used not only for a gun but also for an arbitrary projectile system, such as a cannon, a missile, a laser gun, a laser knife for surgical operation, or the strong water jet of a machine tool.
Claims
[1] A sighting device comprising: a projectile portion for firing a projectile at a target; a camera portion for capturing an image of the target; and an image processing portion for outputting the target image of a virtual camera whose viewing line coincides with the trajectory of the projectile.
[2] The sighting device of claim 1, wherein the camera portion comprises cameras positioned around the trajectory of the projectile for capturing the image of the target, and the image processing portion synthesizes the output target image by checking the correlation between the images captured by the cameras of the camera portion.
[3] The sighting device of claim 2, wherein the image processing portion synthesizes the output target image by image-based rendering technology.
[4] The sighting device of claim 2, wherein the image processing portion synthesizes the output target image by selecting and copying pixels from the parallel translated input images, where the parallel translation distance is obtained by observing the degree of overlap, which is the sum of all pixel values of the difference image of the input images divided by the number of pixels of the difference image.
[5] The sighting device of claim 2, wherein the camera portion comprises a 1st camera positioned above the trajectory of the projectile and a 2nd camera positioned below the trajectory of the projectile, and the image processing portion calculates the degree of overlap by parallel translating the 1st camera's image upward and the 2nd camera's image downward.
[6] A method of synthesizing the output target image of a virtual camera whose viewing line coincides with the trajectory of a projectile, given input images from physical cameras positioned around the trajectory of the projectile, comprising: step 1 of calculating the degree of overlap for each possible parallel translation distance of the input images; step 2 of finding the best parallel translation distance at which the degree of overlap of step 1 is minimum; step 3 of obtaining the parallel translated images by translating the input images by the best parallel translation distance obtained in step 2; and step 4 of synthesizing the output image by copying pixels from the parallel translated images of step 3.
[7] A method according to claim 6, characterized in that the degree of overlap of step 1 is the sum of the degrees of overlap of all possible pairs of two input images, and the degree of overlap of a pair of two input images is the sum of all pixel values of the difference image of the two input images divided by the number of pixels of that difference image.
[8] A method according to claim 6, characterized in that the number of cameras is two, the 1st camera being positioned above and the 2nd camera below the trajectory of the projectile; the parallel translation of the input images in step 1 is in the vertical direction; the upper half region of the output image of step 4 is obtained by copying pixels from the parallel translated image of the 1st camera; and the remainder of the output image of step 4 is obtained by copying pixels from the parallel translated image of the 2nd camera.
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR20060034560 | 2006-04-17 | ||
| KR10-2006-0034560 | 2006-04-17 | ||
| KR1020070034468A KR20070102942A (en) | 2006-04-17 | 2007-04-08 | Aim device using virtual camera |
| KR10-2007-0034468 | 2007-04-08 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2007119951A1 (en) | 2007-10-25 |
Family
ID=38609687
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2007/001709 (WO2007119951A1, Ceased) | Sighting device using virtual camera | | 2007-04-09 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2007119951A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101833252A (en) * | 2009-03-09 | 2010-09-15 | 施乐公司 | Reusable paper media and printers with incompatible media sensors |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4786966A (en) * | 1986-07-10 | 1988-11-22 | Varo, Inc. | Head mounted video display and remote camera system |
| GB2375385A (en) * | 2001-03-09 | 2002-11-13 | Sagem | Weapon fire control system |
| US20040031184A1 (en) * | 2000-01-14 | 2004-02-19 | Hope Richard W. | Optical imaging device for firearm scope attachment |
| KR200410513Y1 (en) * | 2005-12-26 | 2006-03-07 | 주식회사 엘림시스 | Military Remote Precision Observer |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 07745872; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | EP: PCT application non-entry in European phase | Ref document number: 07745872; Country of ref document: EP; Kind code of ref document: A1 |