
Rotation



Availability

  • Not available in MIL-Lite

  • Available in MIL

There are two ways to search for a model that can appear at different angles using the MpatFindModel() function:

  • Search for rotated versions of the model.

  • Search for models taken from the same region in rotated images.

The following describes how to define models to perform these types of searches.

To implement the first type of search, allocate an M_NORMALIZED model with MpatAllocModel(). Then, enable the angular search and specify the angular range in which the model can appear, using MpatSetAngle(). When you call MpatPreprocModel(), it internally creates different models by rotating the original model at the required angles, assigning "don't care" pixels to regions that have no corresponding data in the original model.

This method should only be used when the pixels surrounding the model follow no predictable pattern (for example, when searching for loose nuts and bolts lying on a conveyor).
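For example, the first approach might be set up as in the following sketch. The identifiers (MilSystem, MilModelSource) and the offsets, sizes, and angular range are placeholder values, and the call sequence assumes the usual MIL 10 Mpat function signatures; adapt it to your application.

   #include <mil.h>

   /* Sketch: allocate a normalized model and enable an angular search.
      MilSystem and MilModelSource are assumed to be valid, previously
      allocated identifiers; offsets, sizes, and angles are examples. */
   MIL_ID AllocRotatableModel(MIL_ID MilSystem, MIL_ID MilModelSource)
   {
      MIL_ID MilModel = M_NULL;

      /* Extract a 128x128 model at offset (200, 150) from the source image. */
      MpatAllocModel(MilSystem, MilModelSource, 200, 150, 128, 128,
                     M_NORMALIZED, &MilModel);

      /* Enable the angular search and allow +/- 15 degrees around 0 degrees. */
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_MODE,      M_ENABLE);
      MpatSetAngle(MilModel, M_SEARCH_ANGLE,           0.0);
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_DELTA_NEG, 15.0);
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_DELTA_POS, 15.0);

      /* Preprocessing internally creates the rotated versions of the model. */
      MpatPreprocModel(MilModelSource, MilModel);

      return MilModel;
   }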

To implement the second type of search, first allocate an M_NORMALIZED + M_CIRCULAR_OVERSCAN model with MpatAllocModel(); this extracts the model as well as circular overscan data from the model source image (specifically, MIL extracts the region enclosed by the circle that circumscribes the model). Second, enable the angular search and specify the angular range in which the model can appear, using MpatSetAngle(). When you call MpatPreprocModel(), it extracts the different orientations of the model from the overscanned model.

This type of model should only be used when the model's distinct features lie in the center of the region, so that they are included in all rotated versions of the model. Therefore, it is recommended that an M_NORMALIZED + M_CIRCULAR_OVERSCAN model be as square as possible: the more elongated the rectangle, the smaller the number of consistent central pixels in every rotated model.

As mentioned, a larger region than the one defined will be fetched from the model source image. Therefore, the model must not be extracted from a region too close to the edge of the model source image.

The pixels surrounding the model should be relevant to the positioning of the pattern (that is, the model should appear in the target image with the same overscan data). An example is the image of an integrated circuit.
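A corresponding sketch for the second approach is shown below; again, the identifiers, offsets, sizes, and angular range are placeholders rather than required values.

   #include <mil.h>

   /* Sketch: allocate a model with circular overscan for an angular search.
      The model region is kept roughly square and away from the edges of the
      source image so that the circumscribing circle can be extracted. */
   MIL_ID AllocOverscanModel(MIL_ID MilSystem, MIL_ID MilModelSource)
   {
      MIL_ID MilModel = M_NULL;

      /* Extract a 100x100 model; MIL also fetches the circular overscan data. */
      MpatAllocModel(MilSystem, MilModelSource, 300, 240, 100, 100,
                     M_NORMALIZED + M_CIRCULAR_OVERSCAN, &MilModel);

      /* Enable the angular search over the full 360 degree range. */
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_MODE,      M_ENABLE);
      MpatSetAngle(MilModel, M_SEARCH_ANGLE,           0.0);
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_DELTA_NEG, 180.0);
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_DELTA_POS, 180.0);

      /* Preprocessing extracts the different orientations from the
         overscanned model. */
      MpatPreprocModel(MilModelSource, MilModel);

      return MilModel;
   }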

Both methods find the position and match score of the model in a target image.

Finally, note that MpatFindModel() performs an angular search significantly faster with an M_NORMALIZED + M_CIRCULAR_OVERSCAN model than with an M_NORMALIZED model.
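Whichever model type is used, the search itself and the retrieval of the position, score, and angle might look like the following sketch; the identifiers are assumed to have been allocated and preprocessed elsewhere, and the single-occurrence result buffer is an example choice.

   #include <mil.h>

   /* Sketch: search a target image with a preprocessed model and read back
      the position, match score, and angle of the best occurrence. */
   void FindRotatedModel(MIL_ID MilSystem, MIL_ID MilTarget, MIL_ID MilModel)
   {
      MIL_ID     MilResult = M_NULL;
      MIL_INT    NumFound  = 0;
      MIL_DOUBLE PosX = 0.0, PosY = 0.0, Score = 0.0, Angle = 0.0;

      /* Allocate a result buffer for a single occurrence. */
      MpatAllocResult(MilSystem, 1L, &MilResult);

      /* Search for the model at the allowed angles. */
      MpatFindModel(MilTarget, MilModel, MilResult);

      MpatGetNumber(MilResult, &NumFound);
      if (NumFound > 0)
      {
         MpatGetResult(MilResult, M_POSITION_X, &PosX);
         MpatGetResult(MilResult, M_POSITION_Y, &PosY);
         MpatGetResult(MilResult, M_SCORE,      &Score);
         MpatGetResult(MilResult, M_ANGLE,      &Angle);
         /* ... use the occurrence ... */
      }

      MpatFree(MilResult);
   }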

Setting the angle of search

By default, the nominal angle of the search is 0°. However, using MpatSetAngle(), you can change the nominal angle, as well as specify a rotational range of up to 360°. You can also specify the required precision of the resulting angle and the interpolation mode used to rotate the model. These settings can influence the speed of the search significantly. The accuracy of the search can also be influenced during preprocessing.
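For example, a nominal angle other than 0° and a specific interpolation mode might be set as follows; the values chosen here are illustrative only.

   #include <mil.h>

   /* Sketch: search around a nominal angle of 45 degrees, within a
      +/- 10 degree range, using bilinear interpolation to rotate the model. */
   void ConfigureAngularSearch(MIL_ID MilModel)
   {
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_MODE,               M_ENABLE);
      MpatSetAngle(MilModel, M_SEARCH_ANGLE,                    45.0);
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_DELTA_NEG,          10.0);
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_DELTA_POS,          10.0);
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_INTERPOLATION_MODE, M_BILINEAR);
   }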

When an angular range has been specified with MpatSetAngle(), MpatPreprocModel() creates a model for every x degrees within the range, where x is determined by the specified rotational tolerance (M_SEARCH_ANGLE_TOLERANCE). Rotational tolerance defines the full range of degrees within which the pattern in the target image can be rotated from a model at a specific angle and still meet the acceptance level. The rotational tolerance can also be automatically determined according to the angular correlation of the model with versions of itself (at specific angles) when the model is preprocessed; to do so, set M_SEARCH_ANGLE_TOLERANCE to M_AUTO. Note that the model is rotated according to the interpolation mode specified using M_SEARCH_ANGLE_INTERPOLATION_MODE.

After the approximate location is found, MIL fine-tunes the search, according to the specified angle accuracy (M_SEARCH_ANGLE_ACCURACY). It searches within one rotational tolerance before and after the approximate location, at an angle refinement step determined by the specified angle accuracy. The greater the angle refinement step, the faster the search; however, the results will be less accurate. Note that you must set the angle accuracy (angle refinement step) to a value smaller than the rotational tolerance.
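As an illustration, an explicit rotational tolerance and a smaller angle accuracy might be set as in the sketch below; the numeric values and the identifiers are examples only.

   #include <mil.h>

   /* Sketch: explicit rotational tolerance and angle accuracy. The accuracy
      (angle refinement step) is kept smaller than the tolerance, as required. */
   void SetToleranceAndAccuracy(MIL_ID MilModel, MIL_ID MilModelSource)
   {
      /* Models are created within the angular range based on this tolerance. */
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_TOLERANCE, 5.0);

      /* The approximate angle is then refined in 0.25 degree steps. */
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_ACCURACY, 0.25);

      /* Re-preprocess the model so that the new tolerance is applied. */
      MpatPreprocModel(MilModelSource, MilModel);
   }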

If you don't know the best search angle accuracy to set, you can have MIL automatically select the angle refinement step. To do so, set M_SEARCH_ANGLE_ACCURACY to M_DISABLE. Then enable the angle refinement mode and specify the angle refinement score to use for this mode, using MpatSetAngle() with M_ROTATED_MODEL_MINIMUM_SCORE. This mode improves accuracy without unnecessarily sacrificing performance. Similar to automatically determining rotational tolerance, during preprocessing, the angle refinement mode determines the full range of degrees within which a rotated version of the model can be rotated from the model at a specific angle and still match the angle refinement score. The specified angle refinement score must be set to a value higher than the acceptance level. The higher the angle refinement score is set, the smaller the angle refinement step will be.

If you need a minimum accuracy, you can set M_SEARCH_ANGLE_ACCURACY to the required minimum accuracy and then enable the angle refinement mode. The lowest value between the determined angle refinement step and the specified angle accuracy is the angle refinement step used when searching for the pattern in the target image.
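The automatic selection described above might be enabled as in the following sketch; the 90% refinement score is an arbitrary example and must simply be higher than the model's acceptance level.

   #include <mil.h>

   /* Sketch: let MIL determine the rotational tolerance and the angle
      refinement step automatically. */
   void EnableAutomaticAngleRefinement(MIL_ID MilModel)
   {
      /* Determine the rotational tolerance during preprocessing. */
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_TOLERANCE, M_AUTO);

      /* Disable a fixed angle accuracy so that the refinement step is
         selected automatically... */
      MpatSetAngle(MilModel, M_SEARCH_ANGLE_ACCURACY, M_DISABLE);

      /* ...according to the specified angle refinement score. */
      MpatSetAngle(MilModel, M_ROTATED_MODEL_MINIMUM_SCORE, 90.0);
   }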

When searching within a range of angles, use as narrow a range, as large a rotational tolerance, and as large an angle accuracy (angle refinement step) as possible, since the operation can otherwise take a long time to perform.

Determining the rotational tolerance of a model

Every model has its own particular rotational tolerance. This tolerance depends on the individual model characteristics and the surrounding target image features. To determine the rotational tolerance of a model (a sketch of the procedure follows the steps):

  1. Set the search angle of the model to the same angle as the sought-for pattern in a sample target image. However, set the positive and negative delta values to zero, since you want to test by how much a pattern in an image can be rotated and still correlate with a model at a specific angle.

  2. Use the MimRotate() function to rotate the image in very small, positive increments (for example, 0.5°), and perform an MpatFindModel() operation at every angle. Make sure that the image's center of rotation is the same as that of the model; otherwise, the resulting tolerance will not be accurate. Note that, when rotating the image, you should always set the angle from the image's original position to avoid interpolating the image more than once. Check the results for the greatest angle that produces an acceptable score.

  3. Repeat steps 1 and 2, rotating the image in negative increments.

  4. Take the minimum of the absolute values of these two angles. Double this angle and set it as the rotational tolerance for the angular search.
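The following sketch implements the steps above. The identifiers, the image size, the angular limit, the acceptance level, and the assumption that MimRotate()'s default center of rotation matches that of the model are all placeholders to adapt to your setup; calling the function with a positive step and then a negative step gives the two angles used in step 4.

   #include <mil.h>
   #include <math.h>

   /* Sketch: rotate a sample target image in small increments (always from
      its original position) and return the greatest rotation that still
      produces an acceptable score. The model's positive and negative delta
      values are assumed to have been set to zero beforehand. */
   MIL_DOUBLE FindGreatestAcceptableRotation(MIL_ID MilSystem, MIL_ID MilTarget,
                                             MIL_ID MilModel, MIL_DOUBLE Step)
   {
      MIL_ID     MilRotated = M_NULL, MilResult = M_NULL;
      MIL_INT    NumFound   = 0;
      MIL_DOUBLE Score      = 0.0, Angle = 0.0, GreatestAngle = 0.0;

      MbufAlloc2d(MilSystem, 640, 480, 8 + M_UNSIGNED, M_IMAGE + M_PROC,
                  &MilRotated);
      MpatAllocResult(MilSystem, 1L, &MilResult);

      /* Step can be positive or negative (for example, +0.5 or -0.5 degrees). */
      for (Angle = Step; fabs(Angle) <= 10.0; Angle += Step)
      {
         /* Always rotate from the original image to avoid repeated
            interpolation. */
         MimRotate(MilTarget, MilRotated, Angle,
                   M_DEFAULT, M_DEFAULT, M_DEFAULT, M_DEFAULT, M_BILINEAR);

         MpatFindModel(MilRotated, MilModel, MilResult);
         MpatGetNumber(MilResult, &NumFound);
         if (NumFound > 0)
            MpatGetResult(MilResult, M_SCORE, &Score);

         if (NumFound == 0 || Score < 70.0)   /* Example acceptance level. */
            break;

         GreatestAngle = Angle;
      }

      MpatFree(MilResult);
      MbufFree(MilRotated);
      return GreatestAngle;
   }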