A
SEMINAR REPORT
ON
“ANISOTROPIC FILTERING”
SUBMITTED IN THE PARTIAL FULFILLMENT OF THE REQUIREMENT
OF DEGREE OF BACHELOR OF TECHNOLOGY FROM
RAJASTHAN TECHNICAL UNIVERSITY, KOTA
IN COMPUTER SCIENCE ENGINEERING
SESSION-2010-2011
Submitted to: Mr. Sunil Dhankhar (HOD, CE Dept.)
Submitted by: Kangana Agrawal (8th Semester)
RAJASTHAN COLLEGE OF ENGINEERING FOR WOMEN
BHAKROTA, JAIPUR
(RAJASTHAN)
ACKNOWLEDGEMENT
In making this seminar presentation successful I have benefited from the help and
support of my guide, who freely gave his precious time and effort to make this seminar
presentation accurate and clear. For this I would like to thank
Mr. SUNIL DHANKHAR for providing me the opportunity to work on my seminar topic,
“ANISOTROPIC FILTERING”, an emerging technology, and for sharing his thoughts and
opinions to further my knowledge of this topic.
I would also like to take this opportunity to acknowledge Mr. V. S. TAMRA, our Seminar
Coordinator, for all his guidance and co-operation.
Last but not least, I thank my teachers, friends and family members for their constant
encouragement, which gave me the confidence to present my seminar on this
technology.
Kangana Agrawal.
C.S.E.
07ERWCS025
ABSTRACT
Anisotropic filtering is a technique used in 3D computing to enhance the image
quality of textures on rendered surfaces which are sloped relative to the viewer. This is
achieved by eliminating aliasing, which is responsible for the jagged or pixelated quality of
some graphics. In addition to its anti-aliasing qualities, this filtering also reduces blurring of
sloped textures, an improvement over previous types of filtering known as bilinear and
trilinear filtering. An important distinction of anisotropic filtering as compared to other anti-
aliasing methods is that it affects only the textures on a shape, not the shape itself.
Anisotropic filtering works by monitoring a given texture on a pixel-by-pixel basis, and
mapping a pattern based on the projected shape of the texture at each pixel.
An anisotropic filtering technique includes defining pixel elements in two dimensions and
defining at least one object having three dimensional surfaces in a three-dimensional model
space and storing texel elements in two dimensions defining a texture map bearing a
relationship to the three dimensional surfaces of the at least one object. Each pixel element to
be texture mapped is divided into a group of sub-pixel elements and the sub-pixel elements
are separately texture mapped. The resultant textures of the sub-pixel elements are averaged
to obtain a texture for their respective pixel element.
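The sub-pixel averaging described above can be sketched in a few lines of Python. This is an illustrative sketch, not the patented implementation; `sample_texture` and `anisotropic_pixel` are hypothetical names, and the texture is modeled as a simple 2D list of scalar texel values.

```python
def sample_texture(texture, u, v):
    """Nearest-texel lookup; u, v in [0, 1)."""
    h = len(texture)
    w = len(texture[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

def anisotropic_pixel(texture, u0, v0, du, dv, subdivisions=4):
    """Average `subdivisions` sub-pixel samples taken along the
    projected texture footprint from (u0, v0) to (u0+du, v0+dv)."""
    total = 0.0
    for i in range(subdivisions):
        t = (i + 0.5) / subdivisions  # sub-pixel center
        total += sample_texture(texture, u0 + t * du, v0 + t * dv)
    return total / subdivisions
```

Each sub-pixel is texture-mapped separately and the results are averaged, as the abstract describes.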
CONTENTS
1. INTRODUCTION TO ANISOTROPIC FILTERING
i) Basics
ii) Invention
iii) Definition
iv) Improvement on isotropic MIP mapping
v) Degree of anisotropy supported
vi) Implementation
vii) Performance and optimization
viii) Anisotropy
ix) Fields of interest
a) Computer graphics
b) Chemistry
c) Real world imagery
d) Physics
e) Geophysics
f) Medical acoustics
g) Material science and engineering
h) Micro-fabrication
i) Neuroscience
j) Anisotropic Filtering
2. CHALLENGES IN ANISOTROPIC FILTERING
3. PROPERTIES OF ANISOTROPIC FILTERING
i) Advantages
ii) Disadvantages
4. ANISOTROPIC FILTERING EXPLAINED
i) Working
ii) Need
5. CLASSIFICATION OF ANISOTROPIC FILTERING
i) Bilinear
a) Formula
b) Limitations
ii) Tri-linear
iii) Anti-aliasing
6. APPLICATION OF ANISOTROPIC FILTERING
i) Quality and performance AF modes
ii) Image quality and performance comparison
7. ANTIALIASING AND ANISOTROPIC FILTERING
8. CONCLUSION
9. BIBLIOGRAPHY
10. REFERENCES
CHAPTER-1
INTRODUCTION TO ANISOTROPIC FILTERING
BASICS
In 3D computer graphics, anisotropic filtering (abbreviated AF) is a method of enhancing
the image quality of textures on surfaces that are at oblique viewing angles with respect to
the camera where the projection of the texture (not the polygon or other primitive on which it
is rendered) appears to be non-orthogonal (thus the origin of the word: "an" for not, "iso"
for same, and "tropic" from tropism, relating to direction; anisotropic filtering does not filter
the same in every direction). Anisotropic filtering is relatively intensive (primarily memory
bandwidth and to some degree computationally, though the standard space-time
tradeoff rules apply) and only became a standard feature of consumer-level graphics cards in
the late 1990s. Anisotropic filtering is now common in modern graphics hardware and is
enabled either by users through driver settings or by graphics applications and video games
through programming interfaces.
An illustration of texture filtering methods, showing a trilinear MIP-mapped texture on the
left and the same texture enhanced with anisotropic filtering on the right
INVENTION OF ANISOTROPIC FILTERING
ALEXANDRIA, Va., Aug. 22 -- Walter E. Donovan of Saratoga, Calif., has developed an
apparatus and a method for using non-power of two texture maps with anisotropic filtering.
According to the U.S. Patent & Trademark Office: "An anisotropic perturbation is applied to
a texture map coordinate to produce a perturbed texture coordinate. A wrapped texture map
index for various wrap modes is computed using the perturbed texture coordinate and a
level-of-detail width. In addition to the anisotropic perturbation, the perturbed texture
coordinate may also include a tap perturbation."
The inventor was issued U.S. Patent No. 7,091,983 on Aug. 15.
Anisotropic filtering development occurred in the mid to late 1990s. Since then, the majority
of computer graphics cards have incorporated hardware support for the technique that can be
selectively turned on, off, or set to varying degrees of support depending on the user’s
preferences. When set to the maximum threshold, the graphics quality is high, but game
response may be slow depending on the computing resources available to the game.
The performance selections are enabled through the hardware support for the technique
which has matured over the early 2000s.
DEFINITION
In 3D graphics Anisotropic Filtering, abbreviated as AF, is a technique used to improve
image quality in computer video games. AF enhances the textures on surfaces that are far
away and at high angles (sloped relative to the camera view) so that the projection of the
texture appears more like a rectangle or trapezoid than a square. When AF is applied to the
sloped texture, the surface does not appear fuzzy to the viewer. Anisotropic filtering is a
feature found on most 3D video cards today; however, card manufacturers do not necessarily
use the same process to render AF.
AN IMPROVEMENT ON ISOTROPIC MIP MAPPING
By exploring a more approximate anisotropic algorithm, RIP mapping
(rectim in parvo), an extension of MIP mapping, we can understand how
anisotropic filtering gains so much texture mapping quality.
If we need to texture a horizontal plane which is at an oblique angle to the camera,
traditional MIP map minification would give us insufficient horizontal resolution due
to the reduction of image frequency in the vertical axis. This is because in MIP
mapping each MIP level is isotropic, so a 256 × 256 texture is downsized to a 128 ×
128 image, then a 64 × 64 image and so on, so resolution halves on each axis
simultaneously, so a MIP map texture probe to an image will always sample an image
that is of equal frequency in each axis.
Thus, when sampling to avoid aliasing on a high-frequency axis, the other texture
axes will be similarly downsampled and therefore potentially blurred.
With RIP map anisotropic filtering, in addition to downsampling to 128 × 128,
images are also downsampled to 256 × 128 and 32 × 128, etc. These anisotropically
downsampled images can be probed when the texture-mapped image frequency is
different for each texture axis, so one axis need not blur due to the screen frequency
of another axis, and aliasing is still avoided.
Unlike more general anisotropic filtering, the RIP mapping described for illustration
has a limitation in that it only supports anisotropic probes that are axis-aligned in
texture space, so diagonal anisotropy still presents a problem even though real-use
cases of anisotropic texture commonly have such screen space mappings.
In layman's terms, anisotropic filtering retains the "sharpness" of a texture normally
lost by MIP map texture's attempts to avoid aliasing.
Anisotropic filtering can therefore be said to maintain crisp texture detail at all
viewing orientations while providing fast anti-aliased texture filtering.
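The difference between the MIP and RIP pyramids described above can be made concrete by enumerating their level sizes. This is an illustrative Python sketch, not code from the report:

```python
def mip_map_sizes(w, h):
    """Isotropic MIP chain: both axes halve together."""
    sizes = []
    while w >= 1 and h >= 1:
        sizes.append((w, h))
        w //= 2
        h //= 2
    return sizes

def rip_map_sizes(w, h):
    """RIP pyramid: every combination of halvings along each axis."""
    sizes = set()
    x = w
    while x >= 1:
        y = h
        while y >= 1:
            sizes.add((x, y))
            y //= 2
        x //= 2
    return sorted(sizes, reverse=True)
```

For a 256 × 256 base texture this yields 9 MIP levels but 81 RIP levels (including anisotropic ones such as 256 × 128 and 32 × 128); the total storage grows to roughly four times the base texture, versus about one third extra for MIP maps, which is one reason hardware favors the on-the-fly probing described later over RIP maps.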
DEGREE OF ANISOTROPY SUPPORTED
Different degrees or ratios of anisotropic filtering can be applied during rendering and
current hardware rendering implementations set an upper bound on this ratio. This
degree refers to the maximum ratio of anisotropy supported by the filtering process.
For example, 4:1 anisotropic filtering will continue to sharpen more oblique textures
beyond the range sharpened by 2:1. In practice, this means that in highly oblique
texturing situations a 4:1 filter will be twice as sharp as a 2:1 filter (it will display
frequencies double those of the 2:1 filter).
However, most of the scene will not require the 4:1 filter; only the more oblique and
usually more distant pixels will require the sharper filtering. This means that as the
degree of anisotropic filtering continues to double there are diminishing returns in
terms of visible quality with fewer and fewer rendered pixels affected, and the results
become less obvious to the viewer.
When one compares the rendered results of an 8:1 anisotropically filtered scene to a
16:1 filtered scene, only a relatively few highly oblique pixels, mostly on more
distant geometry, will display visibly sharper textures in the scene with the higher
degree of anisotropic filtering, and the frequency information on these few 16:1
filtered pixels will only be double that of the 8:1 filter.
The performance penalty also diminishes because fewer pixels require the data
fetches of greater anisotropy.
In the end it is the additional hardware complexity vs. these diminishing returns,
which causes an upper bound to be set on the anisotropic quality in a hardware
design.
Applications and users are then free to adjust this trade-off through driver and
software settings up to this threshold.
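The clamping of per-pixel anisotropy to the configured maximum degree (2:1 up to 16:1) can be sketched as follows; the function name and the footprint inputs are illustrative assumptions, not a real driver API:

```python
def effective_anisotropy(footprint_major, footprint_minor, max_degree=16):
    """Ratio of the long to the short axis of the projected texture
    footprint, clamped between 1:1 and the supported maximum degree."""
    ratio = footprint_major / max(footprint_minor, 1e-6)  # avoid div by 0
    return min(max(ratio, 1.0), max_degree)
```

A surface needing an 8:1 probe gets exactly 8:1 on a 16x-capable card, but only 4:1 if the user has capped the degree at 4x, matching the trade-off discussed above.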
IMPLEMENTATION
True anisotropic filtering probes the texture anisotropically on the fly on a per-pixel basis for
any orientation of anisotropy. In graphics hardware, typically when the texture is sampled
anisotropically, several probes (texel samples) of the texture around the center point are
taken, but on a sample pattern mapped according to the projected shape of the texture at that
pixel. Each probe is often in itself a filtered MIP map sample, which adds more sampling to
the process. Sixteen trilinear anisotropic samples might require 128 samples from the stored
texture, as trilinear MIP map filtering needs to take four samples times two MIP levels, and
then anisotropic sampling (at 16-tap) needs to take sixteen of these trilinear filtered probes.
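The sample arithmetic in this paragraph reduces to a single multiplication, shown here as a small helper (the names are illustrative):

```python
def texel_fetches(aniso_taps, bilinear_taps=4, mip_levels=2):
    """Texel reads per pixel: each trilinear probe reads 4 texels from
    each of 2 MIP levels, and N-tap anisotropic filtering takes N probes."""
    return aniso_taps * bilinear_taps * mip_levels
```

Sixteen taps thus cost 16 × 4 × 2 = 128 texel reads per pixel, and a single tap reduces to the 8 reads of plain trilinear filtering.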
PERFORMANCE AND OPTIMIZATION
The sample count required can make anisotropic filtering extremely bandwidth-intensive.
Multiple textures are common; each texture sample could be four bytes or more, so each
anisotropic pixel could require 512 bytes from texture memory, although texture
compression is commonly used to reduce this. A display can easily contain over a million
pixels, and the desired frame rate tends to be as high as 30–60 frames per second or more, so
the texture memory bandwidth can get very high (tens to hundreds of gigabytes per second)
very quickly. Fortunately, several factors mitigate in favor of better performance. The probes
themselves share cached texture samples, both inter- and intra-pixel. Even with 16-tap
anisotropic filtering, not all 16 taps are always needed, because only distant highly oblique
pixel fill tends to be highly anisotropic, and such fill tends to cover small regions of the
screen, and finally magnification texture filters require no anisotropic filtering.
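The bandwidth figures above can be reproduced with a back-of-the-envelope calculation. The sketch below assumes the worst case from the preceding sections (128 texel fetches per pixel, 4 bytes per texel) with no caching or compression:

```python
def texture_bandwidth_gb_per_s(pixels, fps, fetches_per_pixel=128,
                               bytes_per_texel=4):
    """Worst-case texture read bandwidth in GB/s: every pixel pays the
    full fetch cost every frame (ignores caching and compression)."""
    return pixels * fps * fetches_per_pixel * bytes_per_texel / 1e9
```

For a 1024 × 768 display at 60 frames per second this comes to roughly 24 GB/s, consistent with the "tens to hundreds of gigabytes per second" noted above for larger displays and higher frame rates.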
ANISOTROPY
Anisotropy is the property of being directionally dependent, as opposed to isotropy which
implies identical properties in all directions. It can be defined as a difference, when measured
along different axes, in a material's physical property (absorbance, refractive index, density,
etc.) An example of anisotropy is the light coming through a polarizer. An example of an
anisotropic material is wood, which is easier to split along its grain than against it.
WMAP image of the (extremely tiny) anisotropies in the cosmic background radiation
FIELDS OF INTEREST
Computer Graphics-
In the field of computer graphics, an anisotropic surface will change in appearance as it is
rotated about its geometric normal, as is the case with velvet.
Anisotropic filtering (AF) is a method of enhancing the image quality of textures on surfaces
that are far away and steeply angled with respect to the point of view. Older techniques, such
as bilinear and trilinear filtering, don't take account of the angle a surface is viewed from,
which can result in aliasing or blurring of textures. By reducing detail in one direction more
than another, these effects can be reduced.
Chemistry-
A chemical anisotropic filter, as used to filter particles, is a filter with increasingly smaller
interstitial spaces in the direction of filtration, so that the proximal regions filter out larger
particles and the distal regions increasingly remove smaller particles, resulting in greater
flow-through and more efficient filtration.
In NMR spectroscopy, the orientation of nuclei with respect to the applied magnetic field
determines their chemical shift. In this context, anisotropic systems refer to the electron
distribution of molecules with abnormally high electron density, like the pi system
of benzene. This abnormal electron density affects the applied magnetic field and causes the
observed chemical shift to change.
In fluorescence spectroscopy, the fluorescence anisotropy, calculated from
the polarization properties of fluorescence from samples excited with plane-polarized light, is
used, e.g., to determine the shape of a macromolecule. Anisotropy measurements reveal the
average angular displacement of the fluorophore that occurs between absorption and
subsequent emission of a photon.
Real-World Imagery-
Images of a gravity-bound or man-made environment are particularly anisotropic in the
orientation domain, with more image structure located at orientations parallel with or
orthogonal to the direction of gravity (vertical and horizontal).
Physics-
A plasma lamp displaying the nature of plasmas, in this case, the phenomenon of
"filamentation"
Cosmologists use the term to describe the uneven temperature distribution of the cosmic
microwave background radiation. There is evidence for a so-called "Axis of Evil" [1] in the
early Universe that is at odds with the currently favored theory of rapid expansion after
the Big Bang. Cosmic anisotropy has also been seen in the alignment of galaxies' rotation
axes and polarization angles of quasars.
Physicists use the term anisotropy to describe direction-dependent properties of
materials. Magnetic anisotropy, for example, may occur in a plasma, so that its magnetic field
is oriented in a preferred direction. Plasmas may also show "filamentation" (such as that seen
in lightning or a plasma globe) that is directional.
An anisotropic liquid has the fluidity of a normal liquid, but its molecules have an average
structural order relative to each other along the molecular axis, unlike water or chloroform,
which contain no structural ordering of their molecules. Liquid crystals are examples of
anisotropic liquids.
Some materials conduct heat in a way that is isotropic, that is independent of spatial
orientation around the heat source. It is more common for heat conduction to be anisotropic,
which implies that detailed geometric modeling of typically diverse materials being
thermally managed is required. The materials used to transfer and reject heat from the heat
source in electronics are often anisotropic.
Many crystals are anisotropic to light ("optical anisotropy"), and exhibit properties such
as birefringence. Crystal optics describes light propagation in these media. An "axis of
anisotropy" is defined as the axis along which isotropy is broken (or an axis of symmetry,
such as normal to crystalline layers). Some materials can have multiple such optical axes.
Geophysics-
Seismic anisotropy is the variation of seismic wave speed with direction. Seismic anisotropy
is an indicator of long range order in a material, where features smaller than the
seismic wavelength (e.g., crystals, cracks, pores, layers or inclusions) have a dominant
alignment. This alignment leads to a directional variation of elasticity wave speed. Measuring
the effects of anisotropy in seismic data can provide important information about processes
and mineralogy in the Earth; indeed, significant seismic anisotropy has been detected in the
Earth's crust, mantle and inner core.
Geological formations with distinct layers of sedimentary material can exhibit electrical
anisotropy; electrical conductivity in one direction (e.g. parallel to a layer), is different from
that in another (e.g. perpendicular to a layer). This property is used in the gas and oil
exploration industry to identify hydrocarbon-bearing sands in sequences of sand and shale.
Hydrocarbon-bearing sands have high resistivity (low conductivity), whereas shales have
lower resistivity. Formation evaluation instruments measure this conductivity/resistivity, and
the results are used to help find oil and gas wells.
The hydraulic conductivity of aquifers is often anisotropic for the same reason. When
calculating groundwater flow to drains or to wells, the difference between horizontal and
vertical permeability must be taken into account; otherwise the results may be subject to error.
Most common rock-forming minerals are anisotropic, including quartz and feldspar.
Anisotropy in minerals is most reliably seen in their optical properties. An example of an
isotropic mineral is garnet.
Medical Acoustics-
Anisotropy is also a well-known property in medical ultrasound imaging describing a
different resulting echogenicity of soft tissues, such as tendons, when the angle of the
transducer is changed. Tendon fibers appear hyperechoic (bright) when the transducer is
perpendicular to the tendon, but can appear hypoechoic (darker) when the transducer is
angled obliquely. This can be a source of interpretation error for inexperienced practitioners.
In diffusion tensor imaging, anisotropy alterations may reflect diffusion changes of water in
the brain, particularly in the white matter.
Material Science and Engineering-
Anisotropy, in Material Science, is a material’s directional dependence of a physical
property. Most materials exhibit anisotropic behavior. An example would be the dependence
of Young's modulus on the direction of load.[4] Anisotropy in polycrystalline materials can
also be due to certain texture patterns which are often produced during manufacturing of the
material. In the case of rolling, "stringers" of texture are produced in the direction of rolling,
which can lead to vastly different properties in the rolling and transverse directions. Some
materials, such as wood and fiber-reinforced composites are very anisotropic, being much
stronger along the grain/fiber than across it. Metals and alloys tend to be more isotropic,
though they can sometimes exhibit significant anisotropic behavior. This is especially
important in processes such as deep-drawing.
Wood is a naturally anisotropic material. Its properties vary widely when measured with the
growth grain or against it. For example, wood's strength and hardness will be different for the
same sample if measured in differing orientation.
Microfabrication-
Anisotropic etching techniques (such as deep reactive ion etching) are used in
microfabrication processes to create well-defined microscopic features with a high aspect
ratio. These features are commonly used in MEMS and microfluidic devices, where the
anisotropy of the features is needed to impart desired optical, electrical, or physical properties
to the device. Anisotropic etching can also refer to certain chemical etchants which etch a
certain material preferentially over certain crystallographic planes (e.g., KOH etching
of silicon [100] produces pyramid-like structures).
Neuroscience-
Diffusion tensor imaging is an MRI technique that involves measuring the fractional
anisotropy of the random motion (Brownian motion) of water molecules in the brain. Water
molecules located in fiber tracts are more likely to be anisotropic, since they are restricted in
their movement (they move more in the dimension parallel to the fiber tract rather than in the
two dimensions orthogonal to it), whereas water molecules dispersed in the rest of the brain
have less restricted movement and therefore display more isotropy. This difference in
fractional anisotropy is exploited to create a map of the fiber tracts in the brain of an
individual.
Anisotropic Filtering-
I've talked about Bilinear vs. Bicubic filtering before in the context of 2D images, but
bilinear filtering is a key ingredient in 3D graphics, too. When a texture is applied to a
polygon, the texture may be scaled up or down to fit, depending on your screen resolution.
This is done via bilinear filtering.
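The bilinear filtering mentioned above can be written out directly: a sample is the weighted average of the four texels surrounding the lookup point. A minimal sketch, assuming a texture stored as a 2D list of scalar texel values:

```python
def bilinear_sample(texture, u, v):
    """Bilinearly interpolate the four texels around (u, v), u, v in [0, 1]."""
    h, w = len(texture), len(texture[0])
    x = u * (w - 1)
    y = v * (h - 1)
    x0, y0 = int(x), int(y)                      # top-left texel
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0                      # fractional weights
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bottom * fy
```

Scaling a texture up or down then amounts to evaluating this function once per screen pixel.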
A full discussion of 3D graphics is way outside the scope of this post -- plus I don't want to
bore you to death with concepts like trilinear filtering and MIP-mapping. But I do want to
highlight one particular peculiarity of bitmap scaling in 3D graphics. As you rotate a
texture-mapped polygon away from the viewer, simple bilinear filtering and MIP-
mapping cause the texture to lose detail as the angle increases:
Now, some detail loss with distance is intentional. That's essentially what MIP-mapping is.
If we didn't MIP-map into the distance, the image would look extremely noisy:
Left: no MIP-mapping. Right: MIP-mapping.
The problem with simple MIP-mapping and bilinear filtering is that they're too simple. Much
more detail should be retained into the distance. And that's what anisotropic filtering does:
Because you're typically viewing most of the polygons in the world at an angle at any given
time, anisotropic filtering has a profound impact on image quality. Here are some
screenshots I took from the PC game FlatOut which illustrate the dramatic difference
between standard filtering and anisotropic filtering:
Left: standard filtering. Right: 16x anisotropic filtering.
These are detail elements cropped from the full-size 1024x768
screenshots: standard, anisotropic.
Proper anisotropic filtering is computationally expensive, even on dedicated 3D
hardware. And the performance penalty increases with resolution.
ATI was the first 3D hardware vendor to introduce some anisotropic filtering
optimizations-- some would say shortcuts-- in their cards which allowed much higher
performance. There is one small caveat, however: at some angles, textures don't get
fully filtered.
ATI effectively optimized for common angles you'd see in 3D level geometry (floor,
walls, ceiling) at the cost of the others.
For better or worse, these optimizations are now relatively standard on video cards. I
think it's a reasonable tradeoff for the increased image quality and performance.
In my opinion, anisotropic filtering is the most important single image quality
setting available on today's 3D hardware.
CHAPTER 2
CHALLENGES IN ANISOTROPIC FILTERING
MAJOR CHALLENGES
One problem with rendering 3D scenes is the interference of the texture pixels (also
called texels) with the screen resolution. Since the textures have to be shrunk and
warped, depending on the distance and angle, this interference will cause heavy
shimmering of the textures if the viewer is moving. To avoid this shimmering, which
is almost unbearable, so-called MIP maps are used.
MIP maps are variations of the same textures at lower resolutions. The farther away
a texture is to be rendered, or the more acute the angle of the textured surface, the
lower the resolution of the MIP map used. This effectively eliminates the texture
shimmering.
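The MIP-level selection just described amounts to taking the log2 of the texel-to-pixel compression and clamping it to the available levels. A minimal sketch (the function and its inputs are illustrative, not a real driver interface):

```python
import math

def mip_level(texels_per_pixel, num_levels):
    """Pick the MIP level whose resolution roughly matches how many
    texels of the base texture land inside one screen pixel."""
    level = max(0.0, math.log2(max(texels_per_pixel, 1.0)))
    return min(int(level), num_levels - 1)
```

A surface whose texels map one-to-one onto pixels uses the full-resolution level 0; one compressed 4:1 drops to level 2, and extreme compression is clamped to the smallest level.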
Usually MIP maps are used together with a trilinear filter to avoid visible lines at the
junctions of two MIP map levels. For a trilinear filter, eight texel samples have to be
taken for each rendered pixel.
The problem with MIP maps is that the result is a washed out look of the textures,
especially if the textured surfaces have an acute angle (like right and left walls in a
hallway or the floor).
Anisotropic Filtering solves that problem by increasing the perceived resolution for
those textures without reintroducing the shimmering. To achieve this, even more
Texels have to be sampled for each pixel. The more samples are used, the better the
quality of the rendered picture is.
The quality of the anisotropic filtering is indicated by the factor of additional
samples. 2x means that twice the samples are taken, resulting in 16 samples
altogether. A factor of 16x, which is the maximum current graphics cards can
manage, means that 128 texel samples have to be taken for each pixel. It's obvious
that this causes a heavy performance hit.
With lower screen resolutions the effect of anisotropic filtering is more pronounced,
but its effect is visible at every resolution.
CHAPTER 3
PROPERTIES OF ANISOTROPIC FILTERING
ADVANTAGES
The primary advantage of anisotropic filtering is a significant improvement in the
realism of 3D shapes when textures are applied. It is most commonly applied to
terrain mappings used in driving games and first-person shooters, creating a greater
sense of immersion in the gaming or simulation environment for the end user. The
technique does not find significant use in 3D applications that do not require an
extended feeling of “depth” by the end-user.
3D graphics uses anisotropic filtering to enhance the image quality of textures on sloped
rendered surfaces relative to the view of the 3D scene. The technique eliminates the
texture aliasing that causes pixelated or jagged-looking textures in 3D graphics.
Additionally, the anisotropic filtering technique reduces blurring of sloped textures
and is the successor to previously used techniques such as bi-linear and tri-linear
filtering. It does not change the shape of the polygon the texture is applied to in order
to achieve these effects, however; it only modifies the manner in which textures are
mapped and subsequently displayed in the 3D scene view.
The RadeonAnisoFilter demo uses EXT_texture_filter_anisotropic to show the benefits of
anisotropic texture filtering. Two quads are drawn from the same point of view; one uses
anisotropic filtering and the other does not. The runway texture map is from the old SGI
Performer demo which discusses texture anisotropy.
(Demo textures: runway and crosshatched)
DISADVANTAGES
One problem with rendering 3D scenes is the interference of the texture pixels (also
called texels) with the screen resolution. Since the textures have to be shrunk and
warped, depending on the distance and angle, this interference will cause heavy
shimmering of the textures if the viewer is moving.
To avoid this shimmering, which is almost unbearable, so-called MIP maps are used.
MIP maps are variations of the same textures at lower resolutions.
The farther away a texture is to be rendered, or the more acute the angle of the
textured surface, the lower the resolution of the MIP map used. This effectively
eliminates the texture shimmering. Usually MIP maps are used together with a
trilinear filter to avoid visible lines at the junctions of two MIP map levels. For a
trilinear filter, eight texel samples have to be taken for each rendered pixel.
The problem with MIP maps is that the result is a washed-out look of the textures,
especially if the textured surfaces are at an acute angle (like the right and left walls in
a hallway, or the floor). Anisotropic filtering solves that problem by increasing the
perceived resolution of those textures without reintroducing the shimmering. To
achieve this, even more texels have to be sampled for each pixel. The more samples
are used, the better the quality of the rendered picture.
The quality of the anisotropic filtering is indicated by the factor of additional
samples. 2x means that twice the samples are taken, resulting in 16 samples
altogether.
A factor of 16x, which is the maximum current graphics cards can manage, means that
128 texel samples have to be taken for each pixel. It's obvious that this causes a heavy
performance hit.
CHAPTER 4
ANISOTROPIC FILTERING EXPLAINED
These new video cards absolutely scream in today's games, and will likely run all new
games at very high frame rates for some time to come. They are meant to run games
with more advanced features like anti-aliasing and anisotropic filtering enabled. We'll
touch on anti-aliasing in a bit, but for now let's discuss anisotropic filtering.
A full explanation of anisotropic filtering would be quite long and complicated. You
can find a bit about it in our earlier 3D Pipeline feature, but the basic gist of it is this:
it uses more texture samples to produce sharper, more accurate texturing.
In essence, it's like selectively using higher-detail MIP maps throughout the scene.
Both the GeForce 6800 and Radeon X800 cards offer 2X, 4X, 8X, and 16X
anisotropic filtering. 1X AF would be just like trilinear, with 8 texture samples per
pixel. So 2X AF is 16 samples per pixel, 4X is 32 samples, and so on up to 128
samples per pixel for 16X AF.
Reading in that much texture data for every single pixel would require massive
amounts of memory bandwidth, so both NVIDIA and ATI instead perform "adaptive"
anisotropic filtering.
If a surface is facing straight at the camera, it won't really benefit from anisotropic
filtering. 16X AF, with all its texture samples, won't really look different than plain
trilinear filtering. Therefore, both graphics card companies use an adaptive algorithm
that analyzes the angle and distance of each surface in a 3D scene and applies more or
less filtering as needed.
The result is that if you set the video card to perform 16X anisotropic filtering, you
get anywhere from 4 (bilinear) to 128 (16X aniso) texture samples per pixel. What
you get depends on
how badly the video card driver thinks it needs more texture samples. It's a trade-off
between visual quality and speed that ultimately benefits the consumer. If they tried
to perform a full 16X AF on every pixel, even today's super-fast $500 video cards
would slow to a crawl. In a way, the trilinear optimizations introduced in newer
video cards are just like this adaptive AF optimization, only less necessary, as the
performance difference is much smaller.
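The adaptive behavior described above can be sketched as picking a power-of-two tap count no larger than either the surface's anisotropy ratio or the user's maximum setting. This is an illustrative model, not either vendor's actual algorithm:

```python
def adaptive_af_taps(anisotropy_ratio, max_af=16):
    """Smallest power-of-two tap count covering the surface's anisotropy,
    clamped to the user-selected maximum AF degree."""
    taps = 1
    while taps < max_af and taps < anisotropy_ratio:
        taps *= 2
    return taps
```

A surface facing the camera (ratio near 1) gets a single tap, i.e. plain trilinear filtering, while only the most oblique surfaces pay for the full 16 taps; total texel reads per pixel thus range from a handful up to 128, as the text describes.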
WORKING OF ANISOTROPIC FILTERING
Anisotropic filtering monitors a texture at the pixel level and maps a sample pattern based on
the projected shape of the texture at each pixel. Depending on the view angle of a given 3D
scene, a single screen pixel can cover more than one pixel's worth of texture information. As
a result, the anisotropic filtering method uses significant amounts of data, dependent on the
texture quality, view angle, and slope of the shape the texture is applied to. Even though
texture caching can reduce the amount of memory required for scenes using anisotropic
filtering, a number of graphics cards optimize this type of filtering for common angles found
in games, such as the sky, floors, and walls.
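One common way (an assumption here, not stated in this report) to derive that projected shape is from the screen-space derivatives of the texture coordinates; the ratio of the footprint's long and short axes then determines how anisotropic the probe must be:

```python
import math

def footprint_axes(dudx, dvdx, dudy, dvdy):
    """Lengths (in texels) of the texture footprint along screen x and y,
    computed from the texture-coordinate derivatives; returns
    (major, minor) axis lengths."""
    ax = math.hypot(dudx, dvdx)  # footprint extent along screen x
    ay = math.hypot(dudy, dvdy)  # footprint extent along screen y
    return max(ax, ay), min(ax, ay)

major, minor = footprint_axes(8.0, 0.0, 0.0, 1.0)
# major / minor == 8.0, so an 8:1 anisotropic probe would be warranted
```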
NEED OF ANISOTROPIC FILTERING
Anisotropic filtering (AF) is used to address a specific kind of texture artifact that
occurs when a 3D surface is sloped relative to the view camera.
Before we drill too deeply, here's a working definition of the word itself. Isotropy
describes when an object's vectors are of equal value along its different axes, like a
square or a cube. For instance, bilinear and trilinear filtering are both isotropic
filtering techniques, since their filtering pattern is square.
Anisotropic filtering occurs when the filtering pattern exhibits different values along
different axes. AF uses a non-square, or an-isotropic filtering pattern, hence the name.
The pattern used by AF is typically rectangular, though it can at times be trapezoidal
or parallelogram-shaped.
A single screen pixel could encompass information from multiple texture elements
(texels) in one direction, such as the y-axis, and fewer in the x-axis, or vice-versa.
This requires a non-square texture filtering pattern in order to maintain proper
perspective and clarity in the screen image.
If more texture samples are not obtained in the direction or axis where an image or
texture surface is sloped into the distance (like a receding perspective view), the
applied texture can appear fuzzy or out of proportion. The problem worsens as the
angle of the surface relative to the view camera approaches 90 degrees, or on-edge.
To correct the problem, as mentioned, AF uses a rectangular, trapezoidal, or
parallelogram-shaped texture-sampling pattern whose length varies in proportion to
the orientation of the stretch effect. With AF, textures applied to the sloped surfaces
will not look as fuzzy to the viewer.
A classic example is a texture with text; recall the text at the beginning of every Star
Wars film that sets up the story. As the text scrolls off into the distance, its resolution
and legibility both tail off. Another example is a billboard in a racing game, where
the text looks fuzzy and/or disproportionate without AF, and much clearer with AF
applied.
Anisotropic filtering is a difficult concept to convey in words, so comparison images
published by ATI and Nvidia are often used to demonstrate the effects of AF. Such
images clearly show differences in render quality with AF enabled, although in-game
testing can tell a somewhat different story.
CHAPTER 5
CLASSIFICATION OF ANISOTROPIC FILTERING
Anisotropic filtering (AF) is a graphics technique for improving the surface texture of an
object. Where anti-aliasing is a method of making the edges of an object smoother,
anisotropic filtering enhances how an object looks inside its edges, across all the space
between them. Every 3D object used to build a game environment is textured. A texture is no
more than a "coat of paint" that covers all those flat polygons to make them look like
skin, wood, metal or bricks in a wall. Anisotropic filtering became a standard effect in
graphics cards in the late 1990s, and is now widely supported in graphics hardware.
Three methods of texture filtering can be distinguished: bi-linear, tri-linear and full
anisotropic. Anisotropic filtering is a very powerful method, but it also uses a lot of
GPU performance. The settings available for anisotropic filtering range from 2x to 16x.
A higher level of anisotropic filtering provides clearer and sharper texture details, at
the cost of more GPU usage. A modern Nvidia graphics card such as the GeForce GTX 285,
or even an older model like the GeForce 8800, handles 16x anisotropic filtering at a
resolution of 1280 x 900 with no problem. For the GTX 285, even 1680 x 1050 is not a
challenge, thanks to its higher-quality AF algorithm. You can see the difference between
anisotropic filtering ON and OFF by comparing it in action; typical examples are
screenshots from 3DMark06, the graphics benchmark tool. Switching between such
screenshots makes the true difference visible.
In short, anisotropic filtering is used where the rendered textures are far from the
viewer. It gives a smoother border between the high-resolution textures close to the
viewer and the lower-resolution textures used farther away from "your eyes". It is very
useful for textures in games with a far horizon, since of course texture resolution
decreases with distance.
1. BILINEAR FILTERING
Bilinear filtering is a texture filtering method used to smooth textures when displayed larger
or smaller than they actually are.
Most of the time, when drawing a textured shape on the screen, the texture is not displayed
exactly as it is stored, without any distortion. Because of this, most pixels will end up
needing to use a point on the texture that's 'between' texels, assuming the texels are points (as
opposed to, say, squares) in the middle (or on the upper left corner, or anywhere else; it
doesn't matter, as long as it's consistent) of their respective 'cells'. Bilinear filtering uses these
points to perform bilinear interpolation between the four texels nearest to the point that the
pixel represents (in the middle or upper left of the pixel, usually).
The formula -

    ya = y0 + (y1 - y0) * (u - u0) / (u1 - u0)
    yb = y2 + (y3 - y2) * (u - u2) / (u3 - u2)
    y  = ya + (yb - ya) * (v - v0) / (v2 - v0)

In these equations, uk and vk are the texture coordinates and yk is the color value at
point k. Values without a subscript refer to the pixel point; values with subscripts 0, 1,
2, and 3 refer to the texel points, starting at the top left, reading right then down, that
immediately surround the pixel point. So y0 is the color of the texel at texture coordinate
(u0, v0). These are linear interpolation equations. We'd start with the bilinear equation,
but since this is a special case with some elegant results, it is easier to start from
linear interpolation.
Assuming that the texture is a square bitmap,

    u1 - u0 = u3 - u2 = v2 - v0 = v3 - v1 = w

are all true, where w is the texel spacing. Further, define

    U = (u - u0) / w    and    V = (v - v0) / w

With these we can simplify the interpolation equations:

    ya = y0 + (y1 - y0) U
    yb = y2 + (y3 - y2) U
    y  = ya + (yb - ya) V

And combine them:

    y = y0 + (y1 - y0) U + (y2 - y0) V + (y3 - y2 - y1 + y0) U V

Or, alternatively:

    y = y0 (1 - U)(1 - V) + y1 U (1 - V) + y2 (1 - U) V + y3 U V

This is rather convenient.
However, if the image is merely scaled (and not rotated, sheared, put into perspective, or any
other manipulation), it can be considerably faster to use the separate equations and store
yb (and sometimes ya, if we are increasing the scale) for use in subsequent rows.
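As a concrete illustration, the interpolation just described can be written as a short function. This is a minimal sketch: the row-major texture layout, texel centers at integer coordinates, and the assumption that (u, v) lies at least one texel inside the border are simplifications made for the example.

```python
import math

def bilinear(tex, u, v):
    """Sample a texture (2-D list, row-major) at continuous texel
    coordinates (u, v) by bilinear interpolation of the 4 nearest texels.
    Assumes 0 <= u < width-1 and 0 <= v < height-1 for simplicity."""
    u0, v0 = int(math.floor(u)), int(math.floor(v))
    U, V = u - u0, v - v0                    # the U, V of the formulas
    y0, y1 = tex[v0][u0], tex[v0][u0 + 1]    # top-left, top-right
    y2, y3 = tex[v0 + 1][u0], tex[v0 + 1][u0 + 1]  # bottom-left, bottom-right
    # Combined form: y = y0(1-U)(1-V) + y1 U(1-V) + y2 (1-U)V + y3 UV
    return (y0 * (1 - U) * (1 - V) + y1 * U * (1 - V)
            + y2 * (1 - U) * V + y3 * U * V)

tex = [[0.0, 1.0],
       [1.0, 2.0]]
print(bilinear(tex, 0.5, 0.5))   # 1.0, the average of the four texels
```

Sampling exactly between four texels returns their average, and sampling on a texel center returns that texel's value, which is what the combined formula predicts.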
Limitations-
Bilinear filtering is rather accurate until the scaling of the texture gets below half or above
double the original size of the texture - that is, if the texture was 256 pixels in each direction,
scaling it to below 128 or above 512 pixels can make the texture look bad, because of
missing pixels or too much smoothness. Often, MIP mapping is used to provide a scaled-
down version of the texture for better performance; however, the transition between two
differently-sized MIP maps on a texture in perspective using bilinear filtering can be very
abrupt. Tri-linear filtering, though somewhat more complex, can make this transition smooth
throughout.
2. TRI-LINEAR FILTERING
Tri-linear filtering is an extension of the bilinear texture filtering method, which also
performs linear interpolation between MIP maps.
Bilinear filtering has several weaknesses that make it an unattractive choice in many cases:
using it on a full-detail texture when scaling to a very small size causes accuracy problems
from missed texels, and compensating for this by using multiple MIP maps throughout the
polygon leads to abrupt changes in blurriness, which is most pronounced in polygons that are
steeply angled relative to the camera.
To solve this problem, tri-linear filtering interpolates between the results of bilinear filtering
on the two MIP maps nearest to the detail required for the polygon at the pixel. If the pixel
would take up 1/100 of the texture in one direction, tri-linear filtering would interpolate
between the result of filtering the 128*128 MIP map as y1 with x1 as 128, and the result of
filtering on the 64*64 MIP map as y2 with x2 as 64, and then interpolate to x = 100.
The first step in this process is of course to determine how big in terms of the texture the
pixel in question is. There are a few ways to do this, and the ones mentioned here are not
necessarily representative of all of them.
- Use the distance along the texture between the current pixel and the pixel to its right
  (or left, or above, or below) as the size of the pixel.
- Use the smallest (or biggest, or average) of the various sizes determined by using the
  above method.
- Determine the uv-values of the corners of the pixel, use those to calculate the area of
  the pixel, and figure out how many pixels of exactly the same size would take up the
  whole texture.
Once this is done the rest becomes easy: perform bilinear filtering on the two mipmaps with
pixel sizes that are immediately larger and smaller than the calculated size of the pixel, and
then interpolate between them as normal.
Since it uses both larger and smaller MIP maps, tri-linear filtering cannot be used in
places where the pixel is smaller than a texel on the original texture, because MIP maps
larger than the original texture are not defined. Fortunately, bilinear filtering still
works, and can be used in these situations without worrying too much about abruptness,
because bilinear and tri-linear filtering provide the same result when the pixel size is
exactly the same as the size of a texel on the appropriate MIP map.
Tri-linear filtering still has weaknesses, because the pixel is still assumed to take up a square
area on the texture. In particular, when a texture is at a steep angle compared to the camera,
detail can be lost because the pixel actually takes up a narrow but long trapezoid: in the
narrow direction, the pixel is getting information from more texels than it actually covers (so
details are smeared), and in the long direction the pixel is getting information from fewer
texels than it actually covers (so details fall between pixels). To alleviate
this, anisotropic ("direction dependent") filtering can be used.
3. ANTI-ALIASING-
In digital signal processing, spatial anti-aliasing is the technique of minimizing the
distortion artifacts known as aliasing when representing a high-resolution image at a
lower resolution. Anti-aliasing is used in digital photography, computer graphics, digital
audio, and many other applications. Anti-aliasing means removing signal components
that have a higher frequency than is able to be properly resolved by the recording (or
sampling) device. This removal is done before (re)sampling at a lower resolution. When
sampling is performed without removing this part of the signal, it causes undesirable
artifacts such as the black-and-white noise near the top of figure 1-a below. In signal
acquisition and audio, anti-aliasing is often done using an analog anti-aliasing filter to
remove the out-of-band component of the input signal prior to sampling with an analog-to-
digital converter. In digital photography, optical anti-aliasing filters are made
of birefringent materials, and smooth the signal in the spatial optical domain. The anti-
aliasing filter essentially blurs the image slightly in order to reduce resolution to below
the limit of the digital sensor (the larger the pixel pitch, the lower the achievable
resolution at the sensor level).
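The core idea, removing high frequencies before resampling, can be demonstrated on a 1-D signal. The box average used here is a deliberately crude low-pass filter chosen for clarity; real systems use better-shaped filters.

```python
def decimate(signal, factor):
    """Naive downsampling: keep every factor-th sample, no filtering."""
    return signal[::factor]

def antialias_decimate(signal, factor):
    """Low-pass first (box average over each block), then downsample."""
    return [sum(signal[i:i + factor]) / factor
            for i in range(0, len(signal) - factor + 1, factor)]

# A signal alternating 0, 1 at the highest representable frequency.
sig = [0, 1] * 8
print(decimate(sig, 2))            # [0, 0, 0, 0, 0, 0, 0, 0]
print(antialias_decimate(sig, 2))  # [0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5]
```

Without the filter, the alternating pattern aliases to a constant 0, an artifact with no relation to the original signal; with the filter, the downsampled signal correctly carries the average level of 0.5.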
CHAPTER 6
APPLICATIONS OF ANISOTROPIC FILTERING
Anisotropic filtering is primarily used in 3D games and simulations that require an
extended immersion in the depth of view by the end-user. In military simulations, it is
used to provide a more realistic view of terrain for flight and combat simulations. In
gaming, it is used to make racing and first person shooter games appear more realistic
and to add to the overall enjoyment of the games.
The use of the Anisotropic Filtering options is much the same as the Antialiasing
options described above. Just as with Antialiasing, there is a performance penalty for
each successively higher level of AF used.
For ATI cards, the only way to turn Off Anisotropic Filtering by default is to tick the
Application Preference box. There is no "0x" setting, so if the box is unticked then
Anisotropic Filtering is operating at the very least at 2x level.
This setting gives the application control over the Anisotropic Filter. If an application
has internal settings for the filtering, it's usually better to use the Application option
and set the preferred filter mode from within the game itself.
Some games slow down considerably if you force an Anisotropic Filter through the
driver, even though they offer an in-game setting for it.
Quality and Performance AF Modes
You may notice there is a choice of Quality or Performance AF modes available just above
the AF slider bar. If you choose Quality over Performance, the quality of the AF is
supposedly superior to that under the Performance mode, but at the cost of relatively reduced
performance. A close-up comparison of the two modes is provided below using a screenshot
from Knights of the Old Republic (see below):
As you can see, the performance and visual quality difference between the two modes is
indistinguishable (at least to me), but it's worth noting that the performance difference is
some 4fps, which is around 10%. Generally speaking, I have examined "Performance" vs.
"Quality" AF modes closely in various games and I cannot tell the difference between the
two. Note that throughout the rest of this guide I use "Quality" mode for all AF comparisons.
Finally, regardless of the brand of graphics card you can enable both Antialiasing and
Anisotropic Filtering at the same time, and at different levels (e.g. 0xAA and 4xAF, or 6xAA
and 2xAF) - neither setting will conflict with the other, although clearly the visual and
performance results will differ based on the combination you choose. These impacts are
examined in detail in the next section.
Image Quality & Performance Comparisons
There is no hard and fast rule for precisely how much of a performance impact the various
levels of AF and/or AA will bring with them, nor how much of a visual quality improvement
you will see. It all depends on the particular game's resolution, your graphics card, and the
complexity of settings and details in the game you are using.
To provide you with an indication of the type of image quality and performance impacts you
can expect with different levels of AA and AF, I have compiled comprehensive screenshots
of two recent games (one OpenGL and one Direct3D) running at progressively higher levels
of AA and AF, and combinations thereof. Note that your performance will vary depending on
how much more/less powerful your system is compared to the test system used.
CHAPTER 7
ANTIALIASING AND ANISOTROPIC FILTERING
Anisotropic filtering improves the clarity and crispness of textured objects in games.
Textures are images containing various types of data such as color, transparency,
reflectivity, and bumps (normals) that are mapped to an object and processed by the
GPU in order to give it a realistic appearance on your screen.
At its native dimensions, however, a typical texture is far too computationally
expensive to unconditionally reuse in a scene because the relative distance between
the texel (a pixel of a texture) of the object and the camera affects the observable
level of detail, which could easily translate to wasted processing time spent on
obtaining multiple texture samples that are applied to a disproportionately small
surface in the 3D scene.
To simultaneously preserve performance and image quality, MIP maps are used; MIP
maps are duplicates of a master texture that have been pre-rendered at lower
resolutions which a graphics engine can call when a corresponding surface is a
specific distance from the camera. With proper filtering, the use of multiple MIP map
levels in a scene can have no discernable impact on its appearance while greatly
optimizing performance.
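MIP map generation itself is straightforward. A minimal sketch, assuming a square power-of-two texture stored as a 2-D list of gray values:

```python
def next_mip(tex):
    """Halve a square texture by averaging each 2x2 block of texels."""
    n = len(tex) // 2
    return [[(tex[2*r][2*c] + tex[2*r][2*c + 1]
              + tex[2*r + 1][2*c] + tex[2*r + 1][2*c + 1]) / 4.0
             for c in range(n)]
            for r in range(n)]

def build_mip_chain(tex):
    """Full chain of pre-rendered lower resolutions, down to 1x1."""
    chain = [tex]
    while len(chain[-1]) > 1:
        chain.append(next_mip(chain[-1]))
    return chain

chain = build_mip_chain([[0.0, 2.0],
                         [4.0, 6.0]])
print(len(chain), chain[-1])   # 2 [[3.0]]
```

Each level is a quarter the data of the previous one, so the whole chain costs only about a third more memory than the base texture while letting the engine pick a level matched to the surface's distance.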
Due to the dimensions of MIP maps conventionally being a power of two or smaller
than the original texture, there exist points where multiple MIP maps may be sampled
for a single texel which must be compensated for by the filtering method in use to
avoid blurring and other visual artifacts.
Bilinear filtering serves as the default, being the simplest and computationally
cheapest form of texture filtering available, due to its simple approach to calculating
a texel's final color: four texel samples are taken from the mipmap defined by the
graphics engine at the approximate point where the target texel exists on-screen, and
the result displayed is a combination of those samples' color data.
While this does account for distortions in texture angles, bilinear filtering takes
samples exclusively from the mipmap identified by the graphics engine, meaning that
any perspective-distorted texture lying at a point where two different mipmap sizes
are called results in the displayed texture containing pronounced shifts in clarity.
Trilinear filtering, the visual successor to bilinear filtering, offers smooth
transitions between mipmaps by continuously sampling and interpolating (averaging)
texel data from the two closest mipmap sizes for the target texel. However, this
approach, like bilinear filtering, assumes that the texture is displayed square to
the camera, and thus suffers from quality loss when a texture is viewed at a steep
angle.
This is due to the pixel's footprint covering a depth longer than, and a width
narrower than, the samples extracted from the mipmaps, resulting in blurriness from
under- and over-sampling respectively.
Anisotropic filtering exists to provide superior image quality in virtually all cases,
at a slight expense of performance. By definition, anisotropy is the property of being
directionally dependent, which applies to any texture not displayed absolutely
perpendicular to the camera.
As previously mentioned, bilinear and trilinear filtering suffer from resultant quality
loss when the sampled textures are oblique with the camera due to both methods
obtaining texel samples from mipmaps assuming that the mapped texel is perfectly
square in the rendered space, which is rarely true.
This quality loss is also related to the fact that mipmap sampling is isotropic, with
identical proportions along both axes, so when a pixel's footprint is trapezoidal there
is insufficient sampling in one direction and excessive sampling in the other. To solve
this, anisotropic filtering scales either the height or width of a mipmap by a ratio
relative to the perspective distortion of the texture; the ratio depends on the maximum
sampling value specified, and the appropriate samples are then taken.
AF can function with anisotropy levels between 1 (no scaling) and 16, defining the
maximum degree which a mipmap can be scaled by, but AF is commonly offered to
the user in powers of two: 2x, 4x, 8x, and 16x.
The difference between these settings is the maximum angle that AF will filter the
texture by. For example: 4x will filter textures at angles twice as steep as 2x, but will
still apply standard 2x filtering to textures within the 2x range to optimize
performance. There are subjective diminishing returns with the use of higher AF
settings because the angles at which they are applied become exponentially rarer.
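A simplified model of this sampling process: take several point samples spread along the major axis of the pixel's footprint and average them. The function name and the way the footprint is passed in (texture-coordinate derivatives) are assumptions for the sketch, and real hardware takes bilinear or trilinear probes rather than the point samples used here.

```python
import math

def anisotropic_sample(sample_fn, u, v, dudx, dvdx, dudy, dvdy, max_aniso=16):
    """Average n samples along the major axis of the pixel's texel-space
    footprint; sample_fn(u, v) stands in for a bilinear/trilinear lookup."""
    len_x = math.hypot(dudx, dvdx)
    len_y = math.hypot(dudy, dvdy)
    if len_x >= len_y:
        major, du, dv = len_x, dudx, dvdx
        minor = max(len_y, 1e-9)
    else:
        major, du, dv = len_y, dudy, dvdy
        minor = max(len_x, 1e-9)
    # Number of samples grows with the footprint's aspect ratio,
    # clamped by the user-selected maximum anisotropy.
    n = min(max_aniso, max(1, round(major / minor)))
    # n samples spread symmetrically along the major axis, centred on (u, v).
    return sum(sample_fn(u + du * ((i + 0.5) / n - 0.5),
                         v + dv * ((i + 0.5) / n - 0.5))
               for i in range(n)) / n

# With an isotropic (square) footprint only one sample is taken.
print(anisotropic_sample(lambda u, v: u + v, 1.0, 2.0, 1, 0, 0, 1))   # 3.0
```

This also shows why higher AF settings have diminishing returns: the sample count only rises when the footprint is actually stretched, so the 16x budget is spent only on the rare, steeply angled surfaces.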
Anisotropic filtering can be controlled through the NVIDIA Control Panel within the
3D Settings section; however, for the best performance and compatibility, NVIDIA
recommends that users set this to be controlled by the application.
CONCLUSION
We derived the decomposition of the anisotropic Gaussian into a 1-D Gauss filter in the
x-direction followed by a 1-D filter in a non-orthogonal direction. The decomposition
is shown to be extremely efficient from a computing perspective. An implementation
scheme for normal convolution and for recursive filtering is proposed. Directed
derivative filters are also demonstrated.
We proposed a scheme for both anisotropic convolution filtering and anisotropic
recursive filtering. Convolution filtering is advantageous when considering locally
steered filtering, as is the case in tracking applications.
Recursive filtering is more attractive when smoothing or differentiating the whole
image array, for example in feature detection.
Error due to interpolation is negligible compared to the error made by the recursive
approximation of the Gaussian filter, and compared to the truncation error for
convolution filters. The use of fast recursive filters results in a calculation time of
40 ms for a 512 x 512 input image on a current state-of-the-art PC.
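As an illustration of separable anisotropic Gaussian filtering, here is the simplest, axis-aligned case: one 1-D pass along x followed by one along y, with different sigmas. The actual decomposition discussed above uses a non-orthogonal second direction; this sketch only shows the separable structure that makes the approach fast.

```python
import math

def gauss_kernel(sigma):
    """Truncated, normalized 1-D Gaussian kernel."""
    r = max(1, int(3 * sigma))
    k = [math.exp(-x * x / (2 * sigma * sigma)) for x in range(-r, r + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve_rows(img, k):
    """Convolve each row with kernel k, replicating border pixels."""
    r = len(k) // 2
    return [[sum(k[j + r] * row[min(max(c + j, 0), len(row) - 1)]
                 for j in range(-r, r + 1))
             for c in range(len(row))]
            for row in img]

def aniso_gauss(img, sigma_x, sigma_y):
    """Axis-aligned anisotropic Gaussian: 1-D pass along x, then along y."""
    blurred = convolve_rows(img, gauss_kernel(sigma_x))
    transposed = [list(r) for r in zip(*blurred)]          # swap axes
    blurred = convolve_rows(transposed, gauss_kernel(sigma_y))
    return [list(r) for r in zip(*blurred)]                # swap back
```

Two 1-D passes cost O(kernel width) per pixel instead of the O(width squared) of a full 2-D convolution, which is the computational gain the separable scheme exploits.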
Differentiation opposite to or along the filter direction is achieved by convolution
with rotated sample-difference filters. For the practical applicability of orientation
scale-space analysis, we believe the exact approximation of Gaussian derivatives is of
less importance than the ability to compute results in limited time.
Although the decomposition of (1) is possible in higher dimensions, the method is
less beneficial for 3-D filtering applications. Only one of the axes can be chosen to be
aligned with the organization of the pixels in memory.
For the other directions, traversing in arbitrary directions through the pixel data is
required. Hence, computational gain is only marginal for higher dimensional
smoothing.
The proposed anisotropic Gaussian filtering method allows fast calculation of edge
and ridge maps, with high spatial and angular accuracy, improving computation speed
typically by a factor of 3.
The anisotropic filters can be applied in cases where edge and ridge data is distorted.
Invariant feature extraction from a 2-D affine projection of a 3-D scene can be
achieved by tuning the anisotropic Gaussian filter, an important achievement for
computer vision.
When structures are inherently interrupted, as is the case for dashed line detection,
the anisotropic Gaussian filter may accumulate evidence along the line while
maintaining spatial acuity perpendicular to the line.
Orientation scale-space analysis can best be based on anisotropic Gaussian filters
[16]. The proposed filtering method enables the practical applicability of orientation
scale-space analysis.
BIBLIOGRAPHY
www.google.com
www.wikipedia.com
www.w3school.com
www.wisegeek.com/what-is-anisotropic-filtering.htm
www.shinvision.com/57
www.webopedia.com/TERM/A/Anisotropic_Filtering.html
REFERENCES
1. F. J. Canny, “A computational approach to edge detection,” IEEE Trans.
Pattern Anal. Machine Intell., vol. 8, no. 6, pp. 679–698, 1986.
2. C. Steger, “An unbiased detector of curvilinear structures,” IEEE Trans.
Pattern Anal. Machine Intell., vol. 20, pp. 113–125, 1998.
3. J. J. Koenderink, “The structure of images,” Biol. Cybern., pp. 363–370,
1984.
4. T. Lindeberg, Scale-Space Theory in Computer Vision. Norwell, MA:
Kluwer, 1994.
5. J. Bigün, G. H. Granlund, and J.Wiklund, “Multidimensional orientation
estimation with applications to texture analysis and optic flow,” IEEE
Trans. Pattern Anal. Machine Intell., vol. 13, pp. 775–790, 1991.
6. S. Kalitzin, B. ter Haar Romeny, and M. Viergever, “Invertible orientation
bundles on 2d scalar images,” in Scale-Space Theories in Computer
Vision: Springer-Verlag, 1997, pp. 77–88.
7. P. Perona, “Steerable-scalable kernels for edge detection and junction
analysis,” Image Vis. Comput., vol. 10, pp. 663–672, 1992.
8. J. J. Koenderink and A. J. van Doorn, “Receptive field families,” Biol.
Cybern., vol. 63, pp. 291–297, 1990.
9. W. T. Freeman and E. H. Adelson, “The design and use of steerable
filters,” IEEE Trans. Pattern Anal. Machine Intell., vol. 13, pp. 891–906,
1991.
10. R. Deriche, “Separable recursive filtering for efficient multi-scale edge
detection,” in Proc. Int. Workshop on Machine Vision and Machine Intelligence,
1987, pp. 18–23.
11. R. Deriche, “Fast algorithms for low-level vision,” IEEE Trans. Pattern Anal.
Machine Intell., vol. 12, pp. 78–87, 1990.
12. L. J. van Vliet, I. T. Young, and P. W. Verbeek, “Recursive Gaussian
derivative filters,” in Proc. ICPR ’98, 1998, pp. 509–514.
13. I.T. Young and L. J. van Vliet, “Recursive implementation of the
Gaussian filter,” Signal Process., vol. 44, pp. 139–151, 1995.
14. E. P. Simoncelli, “Distributed representation and analysis of visual motion,”
Ph.D. dissertation, Dept. Elect. Eng. Comput. Sci., Mass. Inst.
Technol., Cambridge, MA, 1993.
15. E. P. Simoncelli, E. H. Adelson, and D. J. Heeger, “Probability distributions
of optical flow,” in Proc. IEEE Int. Conf. Computer Vision and
Pattern Recognition, 1991, pp. 310–315.
16. M. van Ginkel, P. W. Verbeek, and L. J. van Vliet, “Improved orientation
selectivity for orientation estimation,” in Proc. 10th Scandinavian Conf.