Chapter 4. Materials and Methods
4.8. Surface Splitting
The problem here is to find the intersection between the waterline and the individual NURBS curves trimming each of the surfaces comprising the hull (Figure 4.12). I consider the waterline to divide the affected surface in two. Two new surfaces are then formed with a new sequence of trimming curves, the waterline being one of them.
If any waterline point is back-projected onto a surface, that surface is considered to be a “wetted surface entity” (Figure 4.13). I then check whether the waterline crosses the surface boundary an even number of times. If it does, the splitting algorithm is executed; otherwise the surface is ignored.
Intersections with a surface boundary are determined as follows. The trimming curves of the surface are evaluated to obtain a vector of points forming a closed polygon. The function inpolygon in the MATLAB Mapping Toolbox returns all waterline points inside the trimming curve. This set of points is then extended by a few points in the directions of the first derivative at the leftmost and rightmost points (in case the waterline terminates just before the boundary). The MATLAB function polyxpoly is then used to find the intersections of the waterline with the trimming curve. It is now straightforward to construct two new boundary polygons defining the original boundary polygon split by the waterline. This gives a new set of trimming curves for each of the resulting split surfaces. The whole process is illustrated in Figure 4.14. To determine which of the divided surfaces is wet, the mid-point of the waterline inside the surface is selected in image space and moved a small step downwards. I back-project this point to parameter space and use inpolygon again to determine which of the two new surfaces the point belongs to. This surface is then re-meshed to obtain a new set of nodes and triangles. This polygon formulation is sufficient for area estimation.
At this stage the new trimming curve is defined as the polygon given by a vector of boundary points. In order to plot the waterline on different camera views using the IGES2MATLAB plot package, the waterline vector must be converted to a NURBS curve. This is achieved by treating each vector point as a NURBS control point. An approximation is involved here, since NURBS curves do not usually pass through their control points. However, I use a dense vector of points, so the error should be negligible. Although any surface that is not split makes no contribution to the wetted area, all back-projected waterline points are plotted in the 3D model shown in this thesis and can be displayed to the operator.
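A rough sketch of this conversion, using nrbmak from the NURBS Toolbox [9]; the clamped cubic knot vector and the example point set below are my own illustrative choices, not necessarily those used in the actual code.

    % Dense 3-by-N set of waterline points (illustrative example data)
    pts   = [linspace(0,1,50); 0.1*sin(linspace(0,pi,50)); zeros(1,50)];
    n     = size(pts, 2);
    order = 4;                                   % cubic B-spline
    knots = [zeros(1,order-1), linspace(0,1,n-order+2), ones(1,order-1)];
    wline = nrbmak(pts, knots);                  % points used directly as control points
    % nrbplot(wline, 200);                       % e.g. evaluate and plot at 200 points

Because the control points are dense, the curve hugs them closely, which is why the approximation error is negligible in practice.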
Figure 4.12. Surface splitting. The waterline treated as a boundary of a trimmed surface (for plotting and area measurement).
[Plot: triangles containing waterline control points in parameter space]
Figure 4.13 Illustration of the surface before the splitting algorithm. Parameter space plot of the surface and back-projected waterline. In red, the triangles enclosing waterline points; in black, the waterline points; in blue, the mesh of the trimmed surface.
Figure 4.14 Illustration of the splitting algorithm. (a) Surface boundary in parameter space. (b) Back-projected waterline points within the boundary. (c) Waterline points extended to surface boundary. (d) Intersection of boundary with waterline points. (e) Polygon A: boundary of first split surface. (f) Polygon B: boundary of second split surface.
[Plot: mesh result of Entity number 405, TRIMMED SURFACE]
Figure 4.15 Mesh result of one of the split surfaces.
Algorithm 4.3. Split surface algorithm
Description: Given the selected wetted surface, including its trimming boundaries, its triangulation matrix T, the waterline points in parameter space, and a selected wetted point, the idea is to divide the surface in two, select the wetted half, trimmed by a new set of curves, and re-mesh it, obtaining a new set of mesh nodes and a new triangulation matrix.
Require:
1. Check whether the surface is trimmed or untrimmed
2. Read the trimming boundaries
3. Extend the waterline to the surface boundary
4. Construct new trimming boundaries for the wetted surface *
5. Check in which of the two new surfaces the selected wetted point lies, and store it
6. Re-mesh, obtaining a new set of nodes and triangles for the wetted surface
Return:
(*) Note: the trimming boundary has a predefined orientation defined in the IGES standard
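Step 5 of the algorithm can be sketched in MATLAB as follows, where polyA and polyB stand for the two new boundary polygons, waterlineImage for the traced waterline in image coordinates, and backProject is a hypothetical stand-in for the back-projection routine; all names and the step size are illustrative.

    mid    = waterlineImage(round(end/2), :);    % waterline mid-point in image space
    dstep  = 5;                                  % small step, in pixels
    mid(2) = mid(2) + dstep;                     % move downwards (image y grows down)
    uv = backProject(mid, srf, H);               % hypothetical back-projection call
    wetIsA = inpolygon(uv(1), uv(2), polyA(:,1), polyA(:,2));  % otherwise polyB is wet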
4.9. Area calculation
Given a wetted surface and its corresponding triangle mesh, area estimation consists of adding the areas of the constituent triangles. The area of each triangle is half the magnitude of the vector cross-product of two of its sides; alternatively it can be obtained from the orthogonal-triangular (QR) decomposition [39].
The total area of each surface is the sum of the individual triangle areas, and the total wetted area is obtained by adding the areas of the individual wetted surfaces.
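A minimal sketch of this summation in MATLAB, using the cross-product variant and assuming nodes is an N-by-3 array of 3-D mesh vertices and T an M-by-3 triangulation matrix (illustrative names):

    v1 = nodes(T(:,2),:) - nodes(T(:,1),:);       % first edge of every triangle
    v2 = nodes(T(:,3),:) - nodes(T(:,1),:);       % second edge
    a  = 0.5 * sqrt(sum(cross(v1, v2, 2).^2, 2)); % half the cross-product magnitude
    totalArea = sum(a);                           % area of this wetted surface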
Chapter 5
Test examples – Results
5.1. Introduction
A variety of tests for different aspects of the research are considered here: registration
accuracy, fidelity of waterline delineation, and precision of the back-projection and splitting algorithms. Finally, complete system test results are given for the experimental model hull
together with the test camera images. Here the wetted surface area estimates obtained using
my proposed methods are compared with the manual estimates made by an SSPA expert. I
also give some timing estimates for different operations, although it should be stressed that
no systematic optimisation has been carried out.
5.2. Image registration from the underwater camera
Description: Image registration was done using correspondences between selected points in
the camera image and the 3D model. Point matching was carried out manually. Registration
of the underwater camera image was evaluated in two ways. Firstly, by perspective
projection of the RHINO registration points onto the underwater camera image. The
coordinates of these registration points were supplied by SSPA. This gives the registration
error in pixels. Secondly, by back-projection of the image registration points onto the
RHINO hull. This gives the registration error in millimetres.
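The first evaluation can be sketched in MATLAB as follows, assuming X is a 4-by-N matrix of homogeneous model coordinates of the registration points and p a 2-by-N matrix of the operator-marked image points (illustrative names):

    x = H * X;                                   % perspective projection with estimated H
    x = x(1:2,:) ./ repmat(x(3,:), 2, 1);        % normalise homogeneous coordinates
    err    = sqrt(sum((x - p).^2, 1));           % pixel error per registration point
    rmsErr = sqrt(mean(err.^2));                 % rms error over all points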
In Figure 5.2, registration points marked by the operator are circled in white, while the points projected using the estimated transformation matrix are marked with red dots. Boundaries of the boat are projected into the image. The error in pixels and on the model surface for 10 registration points is given in Figure 5.3. The rms error is 1.8 pixels, corresponding to 2.4 mm on the surface. Distortion of the hull boundaries above the waterline is due to refraction.
Comment: Results were very satisfactory. The rms deviation of only 1.8 pixels was very low. The standard deviation, 0.8 pixels, is lower than the mean, suggesting a random error distribution. Errors in mm are more variable, suggesting that factors other than operator and painting errors, such as camera aberrations, could be contributing.
[Plot: results from the image registration]
Figure 5.1. Registration points represented in the 3D model. The bluish-red points are the registration points used for the estimation of H.
Figure 5.2. Registration results of the underwater camera. White circles are the operator’s input. Red
points are the projected registration points using the estimated H. Red lines are the boundaries of
some of the surfaces. The black points and the blue waterline are not relevant for this section.
[Plot: pixel error and model error (error in mm/pixels) at each of the 10 registration points]
Figure 5.3 Pixel error and its corresponding error in the model in millimetres.
5.3. Image registration from the stern camera
Description: The registration points were chosen to be spread as widely as possible over the model
(Figure 5.4). Figure 5.5 shows the results. Figure 5.6 shows the projected boundaries of the
hull onto the picture (red), as well as the waterline detected from the underwater camera
(blue). The boat is severely foreshortened by perspective errors in the transformation matrix.
I then tried another set of registration points (Figure 5.7), which gave much better results
(Figure 5.8). Here the rms error is 2 pixels. The errors were too large for back-projection to
yield points lying on the surface, so the rms error in mm is not available.
Comment: Why is registration using the stern camera so much less reliable than registration
using the underwater camera? There are different possible explanations. Firstly, stern
camera registration is very sensitive to small errors in registration points along the side of
the hull, i.e. at surfaces almost parallel to the camera direction. Secondly, this set of
registration points was not provided by SSPA, so I attempted to determine them myself with RHINO, leading to possible errors in the 3D specification. Finally, it
should be noted that determination of the waterline end point appears to be correct, even if
features further from the camera are distorted. However this experience suggests that
multiple-view registration needs detailed investigation.
[Plot: results from the image registration]
Figure 5.4. Registration point distribution used for the stern camera. The bluish-red points are the registration points used for the estimation of H.
Figure 5.5. Registration results of the stern camera. White circles are the operator’s input. Red points are the
projected registration points using the estimated H.
Figure 5.6. Projection of the surface boundaries and the detected waterline using the underwater camera. In blue the waterline; in red the boundaries of the surfaces the waterline crosses. In this case the registration was wrong, and therefore also the projection.
Figure 5.7 Second set of registration points for the stern camera.
Figure 5.8 Projection of the surface boundaries and the detected waterline from the underwater camera using the second set of calibration points.
5.4. Waterline Estimation
Description: Figure 5.9 shows the underwater image at zero speed, with no bubble interference. The red line is the waterline marked by an SSPA expert. The black line shows the waterline determined by the snake algorithm. The rms difference between them is 3.7 pixels, approximately 28 mm, allowing for the scale factor at the bottom of the hull. The mean difference was 1.6 pixels, approximately 13 mm. Closer inspection shows that the snake estimate often runs parallel to the manual estimate, so there could be systematic differences which the semi-automatic algorithm needs to take into account. This mean error translates into a fraction of a square metre of error in the wetted surface.
The test was repeated for each towing speed (Figure 5.9, Figure 5.12, Figure 5.14, Figure 5.15). Figure 5.11 and Figure 5.13 are the weight maps using a hot colour map. Average computation time was 73 seconds, including the time it took me to select the points and the region.
Comment: It is not certain that the manual estimates are completely accurate. However, the difference between the semi-automatic and the manual estimates seems acceptable. Further investigation of the differences would be worthwhile.
Figure 5.9 Waterline for 0 knots. In red, manual estimate of the waterline. In black, automated estimate of the waterline.
Figure 5.10 Histogram of the pixel deviation between the manual and the snake waterline, showing the percentage of pixels at each pixel distance.
Figure 5.11. Weight map for 20 knots. Yellow circles are the operator’s input. Lines are the weights using a hot colour map.
Figure 5.12. Waterline for 20 knots. In red the waterline.
Figure 5.13. Weight map for 25 knots.
Figure 5.14. Waterline for 25 knots.
Figure 5.15. Waterline for 30 knots.
TABLE 5.1. TIME RESULTS
Speed (knots)   Time (seconds)
20              79.4
25              78.1
30              61.7
5.5. Back-projection tests
Overview: The purpose is to estimate the accuracy of the back-projection and splitting algorithms, indicating how much error is introduced at this stage. Synthetic data is used here. The perspective projection matrix was derived from the registration of the underwater camera image. However, it is irrelevant which matrix is used; I chose this one so that the results overlap the experimental images and give a clear indication of what is happening.
Description: I modified the digital model in Rhinoceros and created a line on one of the surfaces of the boat. I named it “water” in order to recover it in MATLAB (Figure 5.16). I projected this line into the picture with the estimated H (the projection matrix) together with the boundaries of the surface of the boat (Figure 5.17). The line is back-projected from image space to the parameter space of the surface (Figure 5.18). The zoomed section (Figure 5.19) shows waterline points and their enclosing triangles. Figure 5.20 shows the operations of the splitting algorithm in parameter space, where the two resulting surfaces are shown (k1 and k2). The wetted surface (k1) is re-meshed with the new contour line (Figure 5.21), and finally the area of this surface is estimated and plotted in Figure 5.22. The areas estimated using Rhinoceros and MATLAB are compared in TABLE 5.2.
Comment: This result suggests that the back-projection and splitting algorithms introduce negligible error into the system. Errors will therefore come from the registration and from the waterline estimate in the image analysis.
TABLE 5.2 ESTIMATED AREA FROM AN IDEAL BOAT
Method       Estimated area          Difference (%)
Rhinoceros   230718.2                –
MATLAB       230551.784 (±0.019)     0.0721
Figure 5.16. Rhinoceros model of the boat and an artificial waterline.
Figure 5.17. Projection of the waterline.
[Plot: triangles containing waterline control points in parameter space]
Figure 5.18. Parameter space plot of the surface. In red, the back-projected points of the waterline.
Figure 5.19. Parameter-space zoom of the surface, showing some waterline points and their enclosing triangles (red).
Figure 5.20 Illustration of the splitting algorithm. (a) Surface boundary in parameter space. (b) Back-projected waterline points within the boundary. (c) Waterline points extended to surface boundary. (d) Intersection of boundary with waterline points. (e) Polygon A: boundary of first split surface. (f) Polygon B: boundary of second split surface.
[Plot: mesh result of Entity number 354, B-NURBS SRF]
Figure 5.21. Mesh result of the wetted surface.
[Plot: calibration points in the model space (3D)]
Figure 5.22. Wetted surface of the hull with hand-drawn waterline.
5.6. Complete system tests
Description: The waterline is traced in each camera image, taken at different speeds, and projected back onto the model, where it divides each wetted surface. Figure 5.24, Figure 5.28 and Figure 5.32 show the traced waterline together with the projection of the surface boundaries. Figure 5.30 shows the estimated wetted area graphically. The area of the stern was not considered. The manual and semi-automated area estimates are compared in TABLE 5.3, which also shows the percentage difference between these estimates at different towing speeds. Registration errors at individual registration points are given in Figure 5.23, Figure 5.27, and Figure 5.31. Wetted area estimates were compared for different registrations and waterline tracking operations at zero towing speed. For the same registration, the two estimates were 70.2 and 71.8 m², a difference of 2.8%. For the same waterline but different registrations, the estimates were 71.6 and 72.2 m², a difference of 0.8%. The average of these estimates is given in TABLE 5.3.
Comment: There seems to be a systematic difference with towing speed, but there are too few results to support any hypothesis. It would have been helpful to back-project the waterline from the stern camera image and use this when tracing the waterline in the underwater-camera images. Unfortunately this could not be done. On the other hand, the projections of the underwater-camera waterlines on the stern camera images (Figure 5.26, Figure 5.29 and Figure 5.33) suggest that no great errors were made at the point where the waterline meets the stern.
Further comments and recommendations: I believe that the above-water cameras should be mounted higher on the platform, giving a better view of the waterline, less obscured by waves. I also believe that using the same side for the underwater camera images would be better than the current arrangement, even though the waterline is assumed to be symmetric (see Figure 4.2).
TABLE 5.3 COMPARISON OF AREA ESTIMATES WITH MANUAL ESTIMATES
Speed (knots)   Measured area by thesis method (m²)   Manual measurements (m²)   Difference (%)
0               71.45                                 68.16                      4.7
20              71.98                                 71.94                      0
25              67.00                                 69.95                      -4.2
30              60.57                                 62.94                      -3.8
Test for 20 knots
[Plots: pixel deviation in x and y at each registration point]
Figure 5.23 Results of the registration. (a) Registration error for the underwater picture.
(b) Registration error for the stern camera.
Figure 5.24 Waterline (black dotted line). Boundaries of the affected surfaces (red).
[Plot: calibration points in the model space (3D)]
Figure 5.25 Three-dimensional plot of the wetted surface.
Figure 5.26 Re-projection of the waterline in the stern view.
Test for 25 knots
[Plots: pixel deviation in x and y at each registration point]
Figure 5.27 Registration results: (a) Underwater view. (b) Stern view.
Figure 5.28 Waterline estimate (black dotted points) and the boundaries of the surfaces (red).
Figure 5.29 Re-projection of the waterline on the stern view.
[Plot: calibration points in the model space (3D)]
Figure 5.30. Graphical representation of the wetted surface estimated at 20 knots. (View from the underwater camera.)
Test for 30 knots
[Plots: pixel deviation in x and y at each registration point]
Figure 5.31 Registration results: (a) Underwater view. (b) Stern view.
Figure 5.32 Waterline estimate (black dotted points) and the boundaries of the surfaces (red).
Figure 5.33 Re-projection of the waterline on the stern view.
5.7. Timing results
Description: The times taken for the whole operation of estimating the wetted surface at three different towing speeds, including my own contribution as an operator, are given in TABLE 5.4.
TABLE 5.4. TIMING RESULTS
Speed (knots)   Time (seconds)
20              196
25              256
30              245
Chapter 6
Discussion
Alternative Approaches
At the beginning of this thesis I considered the alternative strategies of importing image analysis facilities into RHINO, or implementing 3D geometry facilities in MATLAB. It must be said that the chosen approach proved considerably more difficult than anticipated. Understanding the IGES2MATLAB software was essential to this thesis, and it was the hardest and most time-consuming part of the research. There were also low-level tasks, e.g. curve projection, meshing and surface splitting, which turned out to be difficult. On the other hand, carrying out the research has given me a geometrical insight which I would never have obtained had I relied on the RHINO CAD system. Moreover, the range of MATLAB facilities I have exploited in the research is so wide that I suspect the problems would have been equally hard with the alternative strategy of importing image analysis facilities into RHINO.
My solution has two important advantages. Firstly, it provides a framework for using
improved image analysis techniques. The best example is the possibility of projecting points from other viewpoints to improve waterline tracing. Also, in a dynamically changing situation, a changing waterline could possibly be followed automatically over time. All this would be much harder to test in an environment other than MATLAB, at least for research and development purposes. Secondly, my method offers an interface to CAD systems other than Rhinoceros.
Development Status
A system has been developed which could be used for experimental testing. A large body of
code is available for further experimental work. There is a good basis to build on, and the
research is anchored in well-understood computer technology (NURBS, snakes, etc.).
Programming issues
Where execution time is considered excessive, MATLAB operations could be speeded up using the “embedded code” facility. The splitting and meshing algorithms are already implemented in C/C++, so they could readily be incorporated.
Future Work
Overview: This research focused on the geometric and image analysis aspects of the problem. Other aspects were tackled and discussed, but there is scope for further improvement and development.
Image registration
Unlike the current manual methods, the computer-assisted method is extremely sensitive to
accurate registration of each camera image used to estimate the wetted surface of the hull.
The method used in this research was registration of individual camera images with the
Rhinoceros hull specification. This method needs to be investigated further, using custom-painted registration marks on the hull, and correcting for possible camera aberrations. If
accuracy is still unsatisfactory, multiple-view registration will need to be considered, in
which the unknowns are the position and attitude of the hull, the timing error of the
underwater camera, and possibly the sight-line directions for the above-water camera.
Registration points on the model hull could almost certainly be detected automatically,
particularly if they were chosen to stand out on one of the colour separations of the camera
images.
Image analysis and waterline editing
From the point of view of image analysis for waterline delineation, the snake algorithm could be refined to take account of discontinuities in direction where the waterline crosses surface boundaries. Information from other views could be used to constrain the snake to pass through certain points.
Geometry
The surface splitting algorithm could be refined by executing it in image space, allowing for multiple intersections. An adaptive meshing algorithm should also be incorporated into the system (for example, the one used in OpenSG).
Implementation
There is an opportunity to build a GUI using the set of functions programmed in this project.
Chapter 7
Conclusions
General: This research has developed an effective method to compute the area of a 3D
surface given camera images and the CAD model. Both the method of computer-assisted
waterline tracing proposed here, and the geometrical interface between camera images and
CAD model of the hull proved successful. The research has shown how a small number of
points on the waterline selected by the operator can be combined with edge detection and a
snake algorithm to delineate waterline sections. The waterline is stored as a NURBS curve defined on the surface of the vessel, so it can be projected, and modified by moving its control points, from any camera viewpoint.
Wetted area estimates using the thesis method have been compared with manual estimates
by SSPA showing a few percent difference. Part of this difference seems to be due to
waterline tracing, and part to image registration – an operation which is less critical for the
manual estimates.
Registration
Registration of the underwater images was accurate enough for successful back-projection from camera image to hull surface. Registration of the above-water (stern) camera image allowed the waterline to be plotted, but sufficiently accurate registration points were not available for back-projection.
Image analysis
The waterline tracing algorithm proved robust but needs testing on a wider range of images.
Many variants of the snake approach were investigated in the course of the research, and
some of these deserve further investigation.
Geometry
The method of back-projecting the waterline onto the hull surface, using the underwater camera images, proved to be fast and reliable. Even though it is an approximation, dependent on the size of the triangles used in meshing, there was good agreement between the Rhinoceros and MATLAB measurements. While the method was demonstrated to be very accurate using synthetic data, there was a ±5% difference between experimental area estimates and the estimates made manually by SSPA.
The current surface splitting algorithm does not take into account many possible hull
shapes, although these are unlikely in these types of vessels. Alternative algorithms have
been considered, but it was not possible to implement them in the research time available.
References
[1] SSPA. [Online]. http://www.sspa.se/research/projects/hydro-testing-alliance-hta
[2] Hydro Testing Alliance - Network of Excellence. [Online]. http://www.hta-noe.eu/
[3] Les Piegl and Wayne Tiller, The NURBS Book.: Springer-Verlag, 2nd Edition, 1997.
[4] A. Ryberg, A-K Christiansson, and K. Eriksson, "Accuracy Investigation of a Vision
Based System for Pose Measurements".
[5] A. Ryberg, A-K. Christiansson, K. Eriksson, and B. Lennartson, "A new Camera Model
and Algorithms for improved Accuracy and Convergence in Pose Calculations".
[6] A. Ryberg, A-K. Christiansson, B. Lennartson, and K. Eriksson, "A new Camera Model
for Higher Accuracy Pose Calculations," 2006.
[7] Adrian Biran, Ship Hydrostatics and Stability.: Elsevier, 2003.
[8] Per Bergström. (2008) Mathworks. [Online].
http://www.mathworks.com/matlabcentral/fileexchange/loadFile.do?objectId=13253&
objectType=file
[9] Mark Spink. NURBS Toolbox for Matlab. [Online].
http://www.aria.uklinux.net/nurbs.php3
[10] Milan Sonka, Vaclav Hlavac, and Roger Boyle, Image Processing, Analysis, and Machine
Vision, 3rd Edition.: Thomson, 2008.
[11] Malcolm Slaney. (2009, February) Malcolm Slaney Homepage. [Online].
http://cobweb.ecn.purdue.edu/~malcolm/interval/1995-017/
[12] Peter Hultqvist and Hamidreza Taghavi, "Hull waterline detection. Image analysis
project HT2007," Göteborg, 2007.
[13] JPR8, "Wetted Surface Estimation," in Minutes, JPR8, Holland, 2007.
[14] McNeel. Rhinoceros NURBS Modelling for windows. [Online].
http://www.rhino3d.com/
[15] D. F. Rogers, An introduction to NURBS - with Historical Perspective.: Morgan Kaufmann
Publishers, 2001.
[16] Elaine Cohen, Richard F. Riesenfeld, and Gershon Elber, Geometric modeling with splines: an introduction.: AK Peters, 2001.
[17] US PRO. (1996) What is IGES5.x for download. [Online].
http://www.iges5x.org/archives/version5x/
[18] Per Bergström, "Computational Methods for ON-Line Shape Inspection," Luleå
University of Technology, Luleå, Licentiate 2009.
[19] Ming Xie, Fundamentals of Robotics. Linking perception to action. Singapore-MIT Alliance
& Nanyang Technological University, Singapore: World Scientific, 2002.
[20] Richard Hartley and Andrew Zisserman, Multiple view geometry in computer vision. 2nd
Ed.: Cambridge, 2003.
[21] C.-K. Shene. A User Guide to the Surface Subsystem of DesignMentor.
[Online]. http://lumimath.univ-mrs.fr/~jlm/cours/DMmanuals/surface/
[22] The IGES 5.x Preservation Society (IPS) HomePage. [Online]. http://www.iges5x.org/
[23] Wikipedia. (2008, July) IGES. [Online]. http://en.wikipedia.org/wiki/IGES
[24] Les A. Piegl and Wayne Tiller, "Geometry-based triangulation of trimmed NURBS
surfaces," Computer-Aided Design, vol. 30, no. 1, 1996.
[25] Bernd Hamann and Po-Yu Tsai, "A tessellation algorithm for the representation of trimmed NURBS surfaces with arbitrary trimming curves," Computer-Aided Design, vol. 28, no. 6/7, 1996.
[26] Jonathan Richard Shewchuk. A Two-Dimensional Quality Mesh Generator and
Delaunay Triangulator. [Online]. http://www.cs.cmu.edu/~quake/triangle.html
[27] Darren Engwirda. MATLAB CENTRAL. [Online].
http://www.mathworks.com/matlabcentral/fileexchange/10307
[28] The University of Alabama at Birmingham Birminghan. Enabling Technology
Laboratory at UAB. [Online]. http://me-wiki.eng.uab.edu/etlab/?page_id=261
[29] Ákos Balázs, Michael Guthe, and Reinhard Klein, "Efficient trimmed NURBS
tessellation," vol. 12, no. 1-3, 2004.
[30] (2009, Jan) OpenSG. [Online]. http://opensg.vrsource.org/trac
[31] Alan Watt and Fabio Policarpo, 3D Games, Real-time Rendering and Software Technology. Volume One and Two.: Addison-Wesley, 2003.
[32] Anil. K. Jain, Fundamentals of digital Image processing.: Prentice Hall, 1988.
[33] Amir A. Amini, Terry E. Weymouth, and Ramesh C. Jain, "Using Dynamic
Programming for Solving Variational Problems in Vision," vol. 12, no. 9, 1990.
[34] J.B. Antoine Maintz and Max A. Viergever, "A survey of medical image registration,"
Medical Image Analysis, vol. 2, no. 1, pp. 1-36, 1998.
[35] Emanuele Trucco and Alessandro Verri, "Camera Parameters from the Projection
Matrix," in Introductory Techniques for 3-D Computer Vision.: Prentice Hall, 1998, ch. 6,
pp. 132-133.
[36] Richard Hartley and Andrew Zisserman, "Basic equations," in Multiple View Geometry,
ch. 7, pp. 178-179.
[37] (2009, Feb.) Wolfram. [Online]. http://mathworld.wolfram.com/NewtonsMethod.html
[38] Les Piegl and Wayne Tiller, "Point inversion and Projection for Curves and Surfaces,"
in The NURBS Book,2nd Edition, Springer-Verlag, Ed., 1997, ch. 6, pp. 229-234.
[39] Dirk-Jan Kroon. (2007, September) Matlab Central. [Online].
http://www.mathworks.com/matlabcentral/fileexchange/16448
[40] A. Criminisi, I. Reid, and A. Zisserman. A Plane Measuring Device. [Online].
http://www.robots.ox.ac.uk/~vgg/presentations/bmvc97/criminispaper/planedev.html
[41] Tomas Akenine-Möller and Eric Haines, Real-Time Rendering, 2nd Edition. Natick,
Massachusetts: AK Peters, 2002.
[42] 2007 Robert McNeel & Assoc. OpenNURBS initiative. [Online].
http://www.opennurbs.org/
[43] Per Bergström and Ove Edlund, "Robust Surface Registration for Optical Single-Shot
Measurements".
[44] Per Bergström, Ove Edlund, and Inge Söderkvist, "A method suitable for reiterated
matching of surfaces," 2007.
[45] Mangesh P. Bhandarkar, Blair Downie, Martin Hardwick, and Rakesh Nagi,
"Migrating from IGES to STEP: one to one translation of IGES," Computers in Inudustry,
vol. 41, no. 261-277, 2000.
[46] A. Criminisi, I. Reid, and A. Zisserman, "Single View Metrology".
[47] Donald Hearn and M. Pauline Baker, Computer Graphics, C Version, 2nd Ed.: Prentice-Hall International, Inc., 1997.
[48] Akemi Gálvez and Alberto Iglesias, "Numerical-Symbolic MATLAB Toolbox for
Computer Graphics and Differential Geometry," vol. Volume 3482, 2005.
[49] Gilles Simon, Andrew W. Fitzgibbon, and Andrew Zisserman, "Markerless Tracking using Planar Structures in the Scene," Proc. International Symposium on Augmented Reality, pp. 120-128, 2000.
[50] Sankarappan Gopalsamy, Douglas H. Ross, Yasushi Ito, and Alan M. Shih, "Structured
Grid Generation over NURBS and Facetted Surface Patches by Reparametrization,"
University of Alabama at Birmingham, AL, U.S.A,.
[51] David Jonsson, "Visualization of CAD Models - NURBS vs. Subdivision," Umeå, 8th
July 2005.
[52] McNeel. Rhino Scripting Resources. [Online]. http://blog.rhino3d.com/2008/05/rhino-
scripting-resources.html
[53] Perspective Transform Estimation. [Online].
http://alumni.media.mit.edu/~cwren/interpolator/
[54] Leslie A. Piegl and Arnaud M. Richard, "Tessellating trimmed NURBS surfaces," Computer-Aided Design, vol. 27, no. 1, 1995.
[55] Bruce Simpson. How efficient are Delaunay Refined Meshes? An Empirical Study.
[56] Joe F. Thompson, Bharat K. Soni, and Nigel P. Weatherill, Handbook of grid generation.:
CRC Press, 1999.
[57] TzuYi Yu and Bharat K. Soni, "Application of NURBS in numerical grid generation,"
Computer-Aided Design, vol. 27, no. 2, 1995.
[58] TranscenData. (2008, March) TranscenData. [Online].
www.transcendata.com/support/cadfix/faq/IGESexport.pdf
Appendix A
Estimation of the camera matrix
The relation between the 3-D coordinates (X, Y, Z) of a point in space and the 2-D coordinates (x', y') of its projection on the image plane can be written as [35]:

$(x,\, y,\, w)^T = H\,(X,\, Y,\, Z,\, 1)^T$   (A.1)

with

$x' = x/w = \dfrac{h_{11}X + h_{12}Y + h_{13}Z + h_{14}}{h_{31}X + h_{32}Y + h_{33}Z + h_{34}}$   (A.2)

$y' = y/w = \dfrac{h_{21}X + h_{22}Y + h_{23}Z + h_{24}}{h_{31}X + h_{32}Y + h_{33}Z + h_{34}}$   (A.3)

where H, a 3×4 matrix, is called the projection matrix. The matrix is defined up to an arbitrary scale factor. Assume that we have a number n of point correspondences. Then the entries of the H matrix can be determined through a linear system formed by writing Eq. A.1 for the space–image point matches (we need at least 6 space–image point matches). This leads to the following equation:

$A\,h = 0,$   (A.4)

with $h = (h_{11}, h_{12}, \ldots, h_{34})^T$ and

$A = \begin{pmatrix} X_i & Y_i & Z_i & 1 & 0 & 0 & 0 & 0 & -x'_i X_i & -x'_i Y_i & -x'_i Z_i & -x'_i \\ 0 & 0 & 0 & 0 & X_i & Y_i & Z_i & 1 & -y'_i X_i & -y'_i Y_i & -y'_i Z_i & -y'_i \end{pmatrix}_{i=1,\ldots,n}$   (A.5)

The vector h can be found using the singular value decomposition of the matrix A, as the singular vector corresponding to the zero (in practice the smallest) singular value of A. The entries of H are obtained up to an unknown scale factor.
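A minimal MATLAB sketch of this estimation, assuming XYZ is an n-by-3 matrix of model points and xy an n-by-2 matrix of the matched image points, with n ≥ 6; names are illustrative and this is the standard DLT construction, not necessarily the exact code used in the thesis.

    n = size(XYZ, 1);
    A = zeros(2*n, 12);
    for i = 1:n
        P = [XYZ(i,:) 1];                           % homogeneous model point
        A(2*i-1,:) = [P, zeros(1,4), -xy(i,1)*P];   % row obtained from Eq. A.2
        A(2*i,  :) = [zeros(1,4), P, -xy(i,2)*P];   % row obtained from Eq. A.3
    end
    [~, ~, V] = svd(A);
    h = V(:, end);                   % singular vector of the smallest singular value
    H = reshape(h, 4, 3)';           % 3-by-4 projection matrix, up to scale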
Appendix B
Camera Specification
All three cameras attached to the platform were analogue, and the acquired pictures were scanned with a FUJI PHOTO FILM CO., Ltd. SP-2000:

Pixel dimension X, Y      1840, 1232
Resolution (inch)         72, 72
Color space               sRGB
FileSource                DSC
Orientation               normal
Date                      2007-02-03 T 13.51
Model                     Fdi V 4.5/Frontier 350/370 7.6

Underwater camera5        Nikon D1X

Pixel dimension X, Y      3008, 1960
Resolution (inch)         300, 300
Cell size                 –
Focal length              35 mm (lens)
F-stop                    f/11 (lens)
Maximum aperture value    f/4.4
Compress bits per pixel   –
Exposure program          Manual
Sensor size               23.7 x 15.5 mm (3.67 cm²)
Pixel density             1.4 MP/cm²

5 http://www.dpreview.com/reviews/specs/Nikon/nikon_d1x.asp