

COMPUTER ANIMATION AND VIRTUAL WORLDS
Comp. Anim. Virtual Worlds 2012; 23:435–445

Published online 7 June 2012 in Wiley Online Library (wileyonlinelibrary.com). DOI: 10.1002/cav.1460

SPECIAL ISSUE PAPER

Feature-based probabilistic texture blending with feature variations for terrains†

John Ferraris*, Feng Tian and Christos Gatzidis

Bournemouth University, Poole, UK

ABSTRACT

The use of linear interpolation to blend different terrain types with distinct features produces translucency artefacts that can detract from the realism of the scene. The approach presented in this paper addresses the feature agnosticism of linear blending and makes the distinction between features (bricks, cobble stone, etc.) and non-features (cement, mortar, etc.). Using the blend weights from Bloom's texture splatting, intermittent texture transitions are generated on the fly without the need for artistic intervention. Furthermore, feature shapes are modified dynamically to give the illusion of wear and tear, thus further reducing repetition and adding authenticity to the scene. The memory footprint is constant regardless of texture complexity and uses nearly eight times less texture memory when compared to tile-based texture mapping. The scalability and diversity of our approach can be tailored to a wide range of hardware, and it can utilize textures of any size and shape compared to the grid layout and memory limitations of tile-based texture mapping. Copyright © 2012 John Wiley & Sons, Ltd.

KEYWORDS

terrains; texturing; splatting; blending

*Correspondence

John Ferraris, Bournemouth University, Poole, UK. E-mail: [email protected]

1. INTRODUCTION

The rendering of large outdoor environments has been the subject of much research over the decades. The ability to generate and texture terrains rapidly and without excessive artistic input is important when modelling landscapes for character animation. Contemporary rendering approaches have allowed for expansive terrains of even greater detail than before. This in turn has placed a burden on artists to generate large quantities of terrain assets for use in applications such as computer games.

Procedural generation of such assets has been an area of interest because of its seemingly endless supply of varied content with little need for artistic input [1–3]. In this paper, we propose a novel approach to terrain texturing, which we call feature-based probabilistic texture blending (FBPTB). The approach addresses the feature agnosticism of linear blending and makes the distinction between features and non-features. In fact, FBPTB can generate a

† Re-use of this article is permitted in accordance with the Terms and Conditions set out at http://wileyonlinelibrary.com/onlineopen#OnlineOpen_Terms

near-endless number of transitional variations in real time without any artistic intervention.

For the rest of the paper, we first review the related work on terrain texturing in Section 2. The overview of FBPTB is given in Section 3. We then describe the details of FBPTB in Section 4. Results are showcased in Section 5, followed by a comparison with existing techniques in Section 6. Finally, we conclude our paper and propose a direction for future work in Section 7.

2. RELATED WORK

Bloom [4] popularized an approach for texturing terrain meshes by linearly blending a small set of terrain textures according to blend weights. The linear blend, however, produces artefacts where distinct features appear with varying degrees of translucency when the blend weight is less than 100%. Hardy and Mc Roberts [5] directly address these shortcomings with blend maps, and, although the technique is efficient, the translucency artefacts of distinct features are only postponed; they are still exhibited at transitions with low blend weights. The work of Zhang et al. [6] introduces the illusion of intermittent trail-off of features by generating a feature mask for the terrain, although close-up viewing exhibits the same artefacts of linear blending. Grundland et al. [7] propose a new blending algorithm that can preserve the colour, contrast or saliency, with the latter reducing the translucency artefacts in a similar manner to Hardy and Mc Roberts, although the translucency artefacts are not removed entirely. FBPTB removes the translucency artefacts entirely by using a novel blending equation to ensure that distinct features are drawn with full opacity or full translucency.

The use of Bloom blend weights contrasts with recent approaches that stream a single gigantic texture that spans the world in real time [8,9]. Although such approaches provide the ultimate artistic control over the textured environment, they still require the texture content to be generated beforehand. For procedural or dynamically generated environments where artistic input is limited, FBPTB can produce convincing terrain transitions on the fly without the need for artistic intervention. We further break up the repetition of tiled textures by synthesizing the effects of wear and tear in real time, intermittently giving features unique shapes and details.

Lai et al. [10] present an offline approach to generating intermediate terrain textures to bridge the transitions between different terrain types. Although the technique produces aesthetically pleasing combinations from minimal user input, the amount of memory needed increases with texture resolution and the number of terrain types. The memory requirement of FBPTB is considerably lower, as the only assets required are the texture itself and an accompanying meta-texture that describes how the texture is composed.

Lefebvre and Neyret [11] propose a method for creating high-resolution, varied procedural textures by specifying a virtual texture that contains references to specific texture patterns that are used to generate the final texture on the fly. Neyret and Cani [12] built on this work by producing a wide variety of terrain transitions from a handful of specific triangular textures. However, the technique is limited to abrupt transitions, which restricts the flexibility of the approach. FBPTB breaks up the abruptness of hard transitions by introducing stochasticity above and below the transition.

Using Wang tiles for tile-based texturing [13] has potential for generating a near-infinite number of texture transitions, with Wei [14] evolving the idea further to take advantage of graphics processing units (GPUs). This approach, however, uses lookup tables that increase drastically with texture complexity and terrain size, to the point of being impractical for large terrain assets with numerous, feature-rich textures. FBPTB does not rely on lookup tables and achieves a memory consumption of only one-eighth of the GPU-based Wang tiles whilst at the same time maintaining the performance of lookup table approaches.

3. OVERVIEW OF FEATURE-BASED PROBABILISTIC TEXTURE BLENDING

We define terrain features as the areas of a texture that protrude through underlying terrain types with full opacity rather than being linearly blended according to Bloom texture weightings. Features include (but are not limited to) bricks, cobble stones and other salient details, whereas non-features are the areas of a texture that are not part of a feature, such as (but not limited to) mortar and cement. Non-features lend themselves to linear blending because they neither protrude from the surface nor contain distinct visual details. Figure 1 shows a sample texture (a) along with the isolated features (b).

FBPTB builds on the general concept introduced by Ferraris et al. [15], which proposed a means of identifying distinct texture detail to be drawn with full opacity or translucency on a probabilistic basis. Figure 2 illustrates an overview of the FBPTB process; more details are given in the next section. The key to our approach is to ensure that all texels of a given feature receive the same blend weight and thus are drawn (full opacity) or discarded (full translucency) together.

To deduce whether a feature is drawn, the blend weights used in Bloom texture splatting are used as the probability of a given feature appearing. For each feature, a random number is generated. If that number lies under or equals the probability of the feature appearing, the feature will be drawn with full opacity. Otherwise, the feature will be discarded, exposing underlying texture detail. If the texture sample is that of a non-feature, we perform a standard linear blend using the Bloom blend weights.

Figure 1. An example terrain texture (a) along with isolated features (b).

Figure 2. Outline of FBPTB.
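The per-feature decision above can be sketched as follows. This is a minimal scalar illustration rather than the paper's shader code, and the function name and signature are our own:

```python
def blend_weight(p, r, is_feature):
    """Per-texel weight: features are all-or-nothing, non-features blend linearly.

    p -- Bloom blend weight in [0, 1], shared by all texels of a feature
    r -- random number in [0, 1], also shared by all texels of a feature
    is_feature -- True if the sampled texel belongs to a feature
    """
    if is_feature:
        # Probability test: the whole feature is drawn opaque or discarded.
        return 1.0 if r <= p else 0.0
    # Non-feature texels fall back to ordinary linear blending.
    return p
```

Because p and r are shared by every child texel of a feature, all of its texels pass or fail the test together, which is what prevents partial-translucency artefacts.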

4. PROBABILISTIC BLENDING

4.1. Meta-Texture

Each texture to be probabilistically blended has an accompanying meta-texture that describes the parent/child relationship of every texel in the texture and the feature (if any) it belongs to. Each texel that lies within a feature is considered a child texel of that feature. To ensure that all child texel samples of a given feature receive the same weight and seed, a single texel of each feature is nominated to be the parent texel, whose weight and seed will be shared between all other child texels of that feature. Any child texel may be nominated as the parent texel. We used the centroid texel as the parent, considering it holds the average weight of the feature. The exception to this is features that are split along the texture boundary (such as the features along the perimeter of Figure 1). For these features, a parent texel is assigned to each split feature part and located on the boundary common to all parts (opposing boundaries are considered to be the same boundary).

Figure 3 illustrates the meta-texture generation process. The feature list contains the colour-coded list of features, where each feature receives a unique colour; thus, all child texels of a given feature are of the same colour. Split features are considered separate features and thus receive their own colour. Non-feature texels are coloured black. The centroid list is a black-and-white image where the centroid texel of each feature is coloured white whereas all other texels are coloured black.

The centroid position coordinates in the meta-texture are stored in base 256, where the red and green channels hold the digits of the 0th column for the U and V coordinates, respectively (called the minor coordinates), whereas the blue channel stores two nibbles (packed into a byte, U being the high nibble and V being the low nibble), with each nibble denoting the digit of the first column (called the major coordinates). This allows the centroid positions to be stored in three colour channels rather than four, freeing up the alpha channel for the feature mask.
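Assuming the packing described above, the encode/decode round trip can be sketched as below. The function names are our own; with a 4-bit major digit the scheme covers texture dimensions up to 16 × 256 = 4096 texels per axis:

```python
def pack_centroid(u, v):
    """Pack integer texel coordinates (u, v), each in [0, 4095],
    into three 8-bit colour channels (r, g, b)."""
    assert 0 <= u < 4096 and 0 <= v < 4096
    r = u % 256                        # minor U digit (base-256 column 0)
    g = v % 256                        # minor V digit
    b = ((u // 256) << 4) | (v // 256) # major digits as high/low nibbles
    return r, g, b

def unpack_centroid(r, g, b):
    """Inverse of pack_centroid: recover (u, v) from the colour channels."""
    u = ((b >> 4) << 8) + r            # major nibble * 256 + minor digit
    v = ((b & 0x0F) << 8) + g
    return u, v
```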

Figure 3. Overview of the meta-texture generation (the major U and V coordinates image has been exaggerated for clarity).



To construct the meta-texture, the centroid list is parsed to gather the centroid positions of all features in the colour-coded feature list. The feature list is then parsed to populate the meta-texture. For black texels, the position of the texel being read is encoded in the corresponding texel of the meta-texture. For non-black feature texels, the centroid position of the associated feature is instead encoded.

4.2. Centroid Position

The meta-texture is sampled using the texture coordinates uv to obtain the centroid position of the sampled texel, giving the minor coordinate vector min and the major coordinate vector maj, the latter unpacked from the nibbles into the range [0, 255]. The centroid vector c is obtained by adding the vectors min and maj and expressing the sum as a decimal fraction of the meta-texture dimensions, in the range [0, 1].

4.3. Weight/Seed Texture Lookup

The blend weights and seeds are stored in a texture of the same dimensions as the terrain mesh. The coordinates ws used to perform the weight/seed texture lookup are calculated by truncating the vector uv to obtain the integer components and adding them to the centroid position. The range of ws is then reduced to [0, 1] in proportion to the weight/seed texture dimensions. Sampling the weight/seed texture with the newly transformed ws coordinates yields the weight value p (which is also the probability) and the seed value s from the red and blue channels, respectively.
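The coordinate transformation described above can be sketched as follows. The names and the tuple-based interface are our own; the paper performs this per fragment on the GPU:

```python
import math

def weight_seed_coords(uv, centroid, ws_dims):
    """Coordinates for the weight/seed texture lookup.

    uv       -- mesh texture coordinates (floats; exceed 1.0 when tiled)
    centroid -- centroid position as a fraction of the meta-texture, in [0, 1]
    ws_dims  -- dimensions of the weight/seed texture

    The integer (tile) part of uv is added to the centroid offset, and the
    result is rescaled to [0, 1] against the weight/seed texture dimensions.
    """
    return tuple((math.trunc(t) + c) / d
                 for t, c, d in zip(uv, centroid, ws_dims))
```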

4.4. Weighting Coefficients

A set of weighting coefficients can be introduced to the weighting equation to further shape and control the blending of feature and non-feature texture samples when p < 1.0. These coefficients are stored in the vector C_mfn and refer to the feature map, feature texel and non-feature texel coefficients, respectively. Coefficients greater than 1.0 sustain a given parameter, whereas coefficients in the range [0, 1) dampen a given parameter (a value of 1.0 leaves the parameter untouched).

The feature map is a blurred version of the meta-texture's feature mask, stored in the alpha channel of the texture itself, and is used within the blending equation to taper the perimeter of features from full opacity to slight translucency, causing the edges of features to be smoothed rather than sharp. The sampled feature map coefficient C_m is used to smooth the edges of features when viewed up close.

The feature coefficient C_f and non-feature coefficient C_n are used to sustain or dampen the features and non-features. These optional coefficients are stored in the weight/seed texture at each vertex to offer finer control over how little or how much features and non-features appear on the terrain mesh.

4.5. Weighting Equation

The weighting equation below yields the blend weight w and uses the variables f, d, r and p, all within the range [0, 1]. f is the optional feature map value as sampled from the texture's alpha channel, d is the feature mask as sampled from the meta-texture's alpha channel and r is the random value obtained by sampling the noise texture using the seed value s as input. Areas of the terrain with a 0% blend weight should always fail the probability test; thus, the noise texture should contain values in the range (0, 1].

w = sgn(⌊(1/r) · d · p · C_f⌋) · C_m + p · (1.0 − d) · C_n        (1)

where the first term applies to feature texels and the second term to non-feature texels.

The first part of the weighting equation accommodates feature texels and returns a value of either 0.0 or 1.0. If the probability or the feature mask value is zero, or if the random value is greater than the probability, the signum function sgn returns 0.0, causing the first part of the equation to null and allowing the second part to generate a non-feature weight.

The second part of the equation accommodates non-feature texels. If the feature mask value d is 1.0 (a feature texel), this side of the equation nulls, allowing the left-hand side to generate a feature weight. For non-feature texels, the Bloom weight p is used to perform a linear blend in the range [0, 1] with the underlying texture.
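On our reading of Equation (1), d is binary and the floor/signum construction evaluates to 1 exactly when the texel is a feature texel and the scaled probability d·p·C_f is non-zero and at least the random value r, so the equation reduces to a comparison. A scalar sketch (not the paper's shader code; the function name is our own):

```python
def weighting(p, r, d, cf=1.0, cm=1.0, cn=1.0):
    """Blend weight w for one texel, following Equation (1).

    p  -- Bloom blend weight / probability in [0, 1]
    r  -- random value in (0, 1] sampled from the noise texture
    d  -- feature mask value (1.0 for feature texels, 0.0 otherwise)
    cf, cm, cn -- feature, feature-map and non-feature coefficients
    """
    # Feature term: sgn(floor(d*p*cf / r)) * cm, i.e. cm exactly when the
    # texel is a feature texel and the random value does not exceed the
    # scaled probability; otherwise 0.
    feature = cm if (d * p * cf > 0.0 and d * p * cf >= r) else 0.0
    # Non-feature term: standard linear blend, nulled on feature texels.
    non_feature = p * (1.0 - d) * cn
    return feature + non_feature
```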

4.6. Correcting Minification and Anisotropic Distortion

As features are drawn or discarded dynamically, minification and anisotropic distortion artefacts can be exhibited when sections of the terrain are viewed at a distance or at oblique angles. The intensity of the aliasing depends on the type of texture and mesh; thus, the implementation of this step should be assessed on a case-by-case basis. For example, terrain meshes featuring large, flat planes will need to address the oblique artefacts, whereas textures with high frequencies of features and large draw distances will need to address the distance artefacts.

To solve these problems, we interpolate between the results of FBPTB and a standard linear blend. For oblique anti-aliasing, the y component of the view vector is multiplied by a coefficient, whereupon it is used as the interpolation value between the FBPTB and linear blends. We found that a value of 0.8 for the coefficient yielded satisfactory results. For minification anti-aliasing, the interpolation value is derived from the distance between the fragment position and the view-space origin.
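The oblique case can be sketched as a simple linear interpolation. This assumes a normalised view vector, and the mapping of its y component to the interpolation factor (including the clamping) is our reading of the text; the function name is our own:

```python
def antialiased_blend(fbptb_w, linear_w, view_y, k=0.8):
    """Fade from the FBPTB result towards a plain linear blend as the
    view angle becomes problematic.

    view_y -- y component of the normalised view vector
    k      -- tuning coefficient (0.8 gave satisfactory results in the text)
    """
    t = max(0.0, min(1.0, view_y * k))       # interpolation value, clamped
    return fbptb_w + (linear_w - fbptb_w) * t
```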


4.7. Blending Equation

The previous work of Ferraris and Gatzidis [16] and Ferraris et al. [17] blended multiple textures in order of precedence, such that lower precedence textures were masked by higher precedence textures of a higher blend weight. With FBPTB, feature texels of lower precedence textures always take priority over higher precedence non-feature texels. This ensures that features will always be visible even for lower precedence textures. Features of higher precedence textures are blended in priority over features of lower precedence textures.

Equation (2) keeps track of whether any features have been drawn for precedence levels below the level being blended. A flag F is initialized to 0.0 and then set to 1.0 whenever a feature texel is blended. As blending Equation (3) takes place from precedence level 2 upwards (when mixing from level 1, level n − 1 is non-existent), the feature flag is initially set to d_1 (the feature mask value of the lowest precedence texture). Subsequent precedence levels use the following equation to keep track of whether any features reside in preceding levels, where d_n is the feature mask value of the texture at precedence level n.

F = sgn(F + d_n)        (2)

Once the feature flag has been calculated, blending Equation (3) is executed for each precedence level to dictate how much of that precedence level's texture is blended with the previous. The final blend weight b_n for the texture at precedence level n is calculated as follows, where w_n is the weight for the texture at precedence level n (as calculated by the weighting equation) and H is the Heaviside step function.

b_n = H(d_n − F) · w_n        (3)
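Equations (2) and (3) can be sketched together as a loop over precedence levels. This is list-based Python rather than shader code, and the interface is our own; it assumes H(0) = 1 so that a feature at level n is not masked by the flag it raises itself:

```python
def precedence_blend_weights(layers):
    """Per-layer blend weights b_n from Equations (2) and (3).

    layers -- list of (d, w) pairs ordered from lowest to highest
              precedence, where d is the feature mask value and w the
              weight from the weighting equation.
    """
    heaviside = lambda x: 1.0 if x >= 0 else 0.0
    weights = []
    flag = layers[0][0]  # F starts as the lowest level's feature mask
    for d, w in layers:
        # Equation (3): mask this layer if a feature was drawn below it
        # and this texel is not itself a feature texel at this level.
        weights.append(heaviside(d - flag) * w)
        # Equation (2): remember whether any feature has appeared so far.
        flag = 1.0 if flag + d > 0 else 0.0
    return weights
```

With a feature at the lowest level and a non-feature above it, the lower feature keeps priority, which matches the behaviour described in the text.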

4.8. Feature Variations

Feature variations are used to dynamically introduce unique wear and tear to texture features when the probability of appearing lies below 100%. Executed prior to the weighting equation, they are achieved by using the seed and an additional random number at each vertex to sample the grayscale variation map (detailing various cracks, divots and holes) at a random point. This random sample is then used to modulate a given texture to darken the colour. Furthermore, by modulating the texture's normal map with the variation normal map (generated from the variation map) and nullifying d where the variation map lies under a certain threshold, holes can be created and chunks removed, exposing any underlying texture information. Figure 4d illustrates such feature variations.


Figure 4. Blending comparison of (a) linear blending, (b) blend maps, (c) tile-based texturing and (d) FBPTB.

5. RESULTS

In this section, we will compare FBPTB with linear blending (used by Bloom [4]), blend maps [5] and tile-based texture mapping [14,18] (where applicable).

Figure 4 is an example of a tile terrain texture blending from right (100%) to left (0%). With both FBPTB and tile-based blending, the tiles trail off intermittently rather than uniformly, whereas FBPTB introduces deterioration in the form of cracks, chipped tiles and scratches, breaking up the uniformity of the transition. The blend map delivers a more convincing result than linear blending, with the dirt appearing in the gaps between the tiles, although compared with FBPTB the uniform nature of the blend would result in obvious texture repetition when blended over a large area of the terrain mesh. Note that the tile-based blends are restricted to such grid-like textures, whereas FBPTB can be used with any feature layout and also produces random variations that increase in frequency and prominence as the blend weight reduces.

Figure 5. Blending comparison of (a) linear blending, (b) blend maps and (c) FBPTB.

The lookup table for the tile-based texturing was generated manually using the results from the FBPTB blend in order to determine which features were drawn and discarded. We found that the limitations of the grid layout and the labour intensity of manually populating the lookup tables significantly impacted the workflow and flexibility of the results. FBPTB suffers from none of these limitations, as the intermittent pattern of features is generated automatically and our approach can be used with any textures with salient detail.

As tile-based texture mapping obtains the lookup table address implicitly from the fragment texture coordinates, the technique is restricted to features that are laid out uniformly on a grid, as shown in Figure 4d. In the following examples, we will not compare it with our approach.

Figure 5 illustrates a particular terrain transition that cannot be reproduced properly using existing blending approaches. Both the tile and mosaic textures blend from 100% (right) to 0% (left). Whereas FBPTB can achieve this complex blend in a convincing manner with no artefacts, the blend map and linear blend simply cannot represent such a transition, as the two textures cannot be distinguished from one another, giving FBPTB a significant advantage.

Figure 6 shows close-up shots of a top-to-bottom blend with parallax mapping enabled. For this, we used a cobble stone texture blended with a grass underlay. We chose this particular texture because it represented a hypothetical 'worst case' insofar as having features of unique shapes and sizes laid out in an irregular manner. For linear blending, the mixture of texture and underlay at the mid-point of the blend transition produces an artificial result with heavy translucency artefacts, as the cobble and grass cannot be distinguished from each other. The illusion of relief the parallax effect should be producing is undermined by these heavy blending artefacts. Blend maps still suffer from the translucency artefacts (albeit to a lesser degree) as the two textures can be distinguished but, like linear blending, the parallax effect is still lost where the artefacts are at their most prominent as the blend weight approaches 0%.

For FBPTB, none of these translucency artefacts are displayed, resulting in a far more convincing blend. Here, even at the mid-point of the transition, the stochastic nature of the probability blend breaks up the banding artefacts that are exhibited by linear blending and blend maps as they blend from top to bottom. Furthermore, the features where chunks have been removed from the corners and sides illustrate how convincing the feature variation process is when combined with parallax mapping, especially considering that all of the variations were generated dynamically with no artistic input.

Figure 7 depicts a terrain with three patches of cobble blended with a 50/50% mix of grass and cobble using linear blending, blend maps and FBPTB. For linear blending, the 50% mix of grass gives the cobbles an artificially dull appearance, and the translucency artefacts become more pronounced as the blend weight drops off from the centre towards the perimeter of the patch. The blend maps do not suffer from the dull appearance, although the shape of the circular brush used to paint the patch is revealed at the perimeter. This could be fixed by having the artist manually touch up the periphery of the patch to introduce irregularity, but in practice this will be limited by time and mesh resolution. FBPTB breaks up the uniform shape of the brush automatically and can be further fine-tuned using the weighting coefficients (either globally or per vertex).

6. PERFORMANCE ANALYSIS

We fabricated a 'worst case' scenario by blending two textures using our approach and the existing alternatives, along with a base texture, across an entire mesh that was rendered with no optimizations. For tile-based texture mapping, we extended the approach described by Wei [14] to draw or discard feature texels together in the same manner as FBPTB. The weighting algorithm was a simplified version of Equation (1), as illustrated below, where m is the binary result of sampling the lookup table:

w = sgn(m · d) · max(p, f) + p · (1.0 − d)        (4)

Figure 6. Close-up parallax shots of (a and b) linear blending, (c and d) blend maps and (e and f) FBPTB.

The terrain mesh consisted of 512 × 512 quads (513 × 513 vertices) with a texture scale of 1.0. The texture sets used were the lowest common denominator that the tested approaches could support (as discussed in Section 5). The two textures blended algorithmically measured 8 × 8 and 10 × 10 in features, respectively, whereas the base texture was mixed in with a standard linear blend. Three configurations were tested: a straight blend (no lighting or parallax effects), a normal mapped blend and a parallax blend. The viewport was filled entirely with blended fragments and the hardware used was a Radeon Mobility 4600 Series GPU in a dual-core 1.5 GHz CPU laptop with 3 GB of RAM.

Table I details the results of the performance tests. The relative performance of the approaches is in sync with their relative complexity. Once normal and parallax mapping were enabled, the extra overhead these techniques introduced reduced the relative difference in performance between the approaches. The four key benefits FBPTB offers over tile-based texture mapping are as follows: (i) the considerably lower video memory overhead (the texture usage of tile-based texturing is nearly eight times more); (ii) the complete automation (compared with manually populating lookup tables with feature data); (iii) the scalability and diversity; and (iv) more importantly, the fact that FBPTB can utilize textures of any size and shape compared to the grid layout and memory limitations of tile-based texture mapping.

Figure 7. A terrain shot with examples of a 50/50 mix of cobble and grass using (left) linear blending, (middle) blend maps and (right) FBPTB.

Table I. Performance comparison of different blending approaches.

                           Straight blend***           Normal mapping              Normal + parallax mapping
Algorithm                  FPS    ms*    Memory**      FPS    ms*    Memory**      FPS    ms*    Memory**
No texturing               385    0.156  n/a           n/a    n/a    n/a           n/a    n/a    n/a
Base texture only          313    0.192  1,024         n/a    n/a    n/a           n/a    n/a    n/a
Linear blending            264    0.227  3,586         215    0.279  5,634         203    0.296  5,634
Blend maps                 263    0.228  3,586         214    0.280  5,634         199    0.302  5,634
Tile-based texturing       235    0.255  45,570        137    0.438  47,618        135    0.444  47,618
FBPTB                      226    0.265  5,891         135    0.444  7,939         134    0.447  7,939
FBPTB with variations      219    0.274  7,172         127    0.472  9,220         125    0.480  9,220

* Frame time (in milliseconds).
** Total texture memory usage (in kB).
*** Straight blend between two terrain textures and a base texture with no other effects.

7. CONCLUSION AND FUTURE WORK

We have proposed in this paper a novel approach that introduces intermittency and irregularity at transitions between terrain types that have distinct features. Our approach completely removes the translucency artefacts present in traditional Bloom texture mapping and can generate a near-endless number of transitional variations in real time without any artistic intervention. Compared to tile-based texture mapping, FBPTB uses considerably less memory to store texture data. Furthermore, our approach can handle any number of features at a constant overhead in terms of memory usage and algorithmic operations.
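The difference between the two blending strategies can be sketched in a few lines. The following is an illustrative CPU-side toy only, not the paper's GPU shader formulation: the per-feature hash and the feature-mask interface are assumptions made for the sake of a self-contained example.

```python
# Toy contrast between linear blending and a probabilistic per-feature
# blend. Illustrative sketch only; the hashing scheme and feature-mask
# interface are assumptions, not the paper's actual implementation.

def linear_blend(feature_texel, base_texel, weight):
    """Plain linear interpolation: at weight < 1 a distinct feature
    (e.g. a brick) is rendered semi-transparent -- the artefact."""
    return tuple(weight * f + (1.0 - weight) * b
                 for f, b in zip(feature_texel, base_texel))

def feature_visible(feature_id, weight, seed=0):
    """Deterministic pseudo-random test per feature: the whole feature
    is either drawn fully opaque or omitted, with the blend weight
    acting as the probability of visibility."""
    h = hash((feature_id, seed)) & 0xFFFFFFFF
    return h / 0x100000000 < weight

def probabilistic_feature_blend(feature_texel, base_texel, weight,
                                feature_id):
    """Texels of a visible feature keep full opacity; texels of hidden
    features, and non-feature texels (feature_id None), fall back to a
    linear blend of the non-feature material."""
    if feature_id is not None and feature_visible(feature_id, weight):
        return feature_texel
    return linear_blend(feature_texel, base_texel, weight)
```

At a 50/50 transition, linear blending renders every brick half-transparent, whereas the probabilistic rule renders roughly half of the bricks at full opacity and suppresses the rest, producing an intermittent transition rather than a translucent one.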

Currently, FBPTB only works with textures that contain salient details. Future work will involve expanding the technique to work with textures that do not contain distinct feature information, such as grass, mud and sand. Instead of using static feature masks, an elaboration of the feature variations aspect of our approach will be explored to generate unique shapes in real time in order to deliver splatters, clumps and pockets of non-feature textures at terrain transitions.

REFERENCES

1. Gain J, Marais P, Strasser W. Terrain sketching. In Proceedings of the 2009 Symposium on Interactive 3D Graphics and Games, I3D '09. ACM, New York, NY, USA, 2009; 31–38.

2. Teoh ST. Riverland: an efficient procedural modeling system for creating realistic-looking terrains. In Proceedings of the 5th International Symposium on Advances in Visual Computing: Part I, ISVC '09. Springer-Verlag, Berlin, Heidelberg, 2009; 468–479.

3. Smelik R, Tutenel T, de Kraker KJ, Bidarra R. Integrating procedural generation and manual editing of virtual worlds. In Proceedings of the 2010 Workshop on Procedural Content Generation in Games, PCGames '10. ACM, New York, NY, USA, 2010; 2:1–2:8.

4. Bloom C. Terrain texture compositing by blending in the frame-buffer, November 2000. Available from: http://goo.gl/xDALP.

5. Hardy A, Mc Roberts DAK. Blend maps: enhanced terrain texturing. In Proceedings of the 2006 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries, SAICSIT '06. South African Institute for Computer Scientists and Information Technologists, Republic of South Africa, 2006; 61–70.

6. Zhang H, Ouyang D, Lin H, Guan W. Texture synthesis based on terrain feature recognition. In Proceedings of the 2008 International Conference on Computer Science and Software Engineering, Volume 02. IEEE Computer Society, Washington, DC, USA, 2008; 1158–1161.

7. Grundland M, Vohra R, Williams GP, Dodgson NA. Cross dissolve without cross fade: preserving contrast, color and salience in image compositing. Computer Graphics Forum 2006; 25(3): 577–586.

8. Mittring M, Crytek GmbH. Advanced virtual texture topics. In ACM SIGGRAPH 2008 Classes, SIGGRAPH '08. ACM, New York, NY, USA, 2008; 23–51.

9. van Waveren J. id Tech 5 challenges: from texture virtualization to massive parallelization. In ACM Annual SIGGRAPH Conference 2009: Beyond Programmable Shading, SIGGRAPH '09. ACM, New York, NY, USA, 2009.

10. Lai Y-Y, Tai W-K, Chang C-C, Liu C-D. Synthesizing transition textures on succession patterns. In Proceedings of the 3rd International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia, GRAPHITE '05. ACM, New York, NY, USA, 2005; 273–276.

11. Lefebvre S, Neyret F. Pattern based procedural textures. In Proceedings of the 2003 Symposium on Interactive 3D Graphics, I3D '03. ACM, New York, NY, USA, 2003; 203–212.

12. Neyret F, Cani M-P. Pattern-based texturing revisited. In Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH '99. ACM Press/Addison-Wesley Publishing Co., New York, NY, USA, 1999; 235–242.

13. Moore C, Rapaport I, Rémila E. Tiling groups for Wang tiles. In Proceedings of the Thirteenth Annual ACM-SIAM Symposium on Discrete Algorithms, SODA '02. Society for Industrial and Applied Mathematics, Philadelphia, PA, USA, 2002; 402–411.

14. Wei L-Y. Tile-based texture mapping on graphics hardware. In ACM SIGGRAPH 2004 Sketches, SIGGRAPH '04. ACM, New York, NY, USA, 2004; 55–63.

15. Ferraris J, Tian F, Gatzidis C. Feature-based probability blending. In ACM SIGGRAPH Asia 2010 Posters, SA '10. ACM, New York, NY, USA, 2010; 51:1–51:1.

16. Ferraris J, Gatzidis C. A rule-based approach to 3D terrain generation via texture splatting. In Proceedings of the International Conference on Advances in Computer Entertainment Technology, ACE '09. ACM, New York, NY, USA, 2009; 407–408.

17. Ferraris J, Gatzidis C, Tian F. Automating terrain texturing in real-time using a rule-based approach. The International Journal of Virtual Worlds 2010; 9(4): 21–28.

18. Cohen MF, Shade J, Hiller S, Deussen O. Wang tiles for image and texture generation. In ACM SIGGRAPH 2003 Papers, SIGGRAPH '03. ACM, New York, NY, USA, 2003; 287–294.

AUTHORS’ BIOGRAPHIES

John Ferraris is a PhD researcher at the School of Design, Engineering & Computing in Bournemouth University (BU), UK. He received his BSc (Hons) in Computing at Bournemouth University in 2009. His research interests include real-time 3D graphics, terrains, lighting and texturing.

Feng Tian is an Associate Professor in the School of Design, Engineering and Computing (DEC) at Bournemouth University, UK. His research focuses on Computer Graphics, Computer Animation, NPR, etc. He has published over 50 papers in peer-reviewed international journals and conferences. Prior to joining Bournemouth University, he was an Assistant Professor in the School of Computer Engineering, Nanyang Technological University (NTU), Singapore.



Christos Gatzidis is a Senior Lecturer in Creative Technology at Bournemouth University, UK. Additionally, he is a Visiting Research Fellow at the School of Informatics, Department of Information Science, City University London, where he completed his PhD, titled 'Evaluating non-photorealistic rendering for 3D urban models in the context of mobile navigation'. Furthermore, he has a Masters in Arts in Computer Animation from Teesside University and a BSc in Computer Studies (Visualisation) from the University of Derby. He has contributed to several refereed conference, book and journal publications and is also a member of the advisory boards of three journals plus various international conference program committees.

