Research Article

Applicability of a Single Depth Sensor in Real-Time 3D Clothes Simulation: Augmented Reality Virtual Dressing Room Using Kinect Sensor

Sasadara B. Adikari,1 Naleen C. Ganegoda,2 Ravinda G. N. Meegama,3 and Indika L. Wanniarachchi1

1 Department of Physics, University of Sri Jayewardenepura, Nugegoda, Sri Lanka
2 Department of Mathematics, University of Sri Jayewardenepura, Nugegoda, Sri Lanka
3 Department of Computer Science, University of Sri Jayewardenepura, Nugegoda, Sri Lanka

Correspondence should be addressed to Indika L. Wanniarachchi; [email protected]

Received 14 October 2019; Revised 21 March 2020; Accepted 29 April 2020; Published 18 May 2020

Academic Editor: Alessandra Agostini

Copyright © 2020 Sasadara B. Adikari et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

A busy lifestyle has led people to buy readymade clothes from retail stores, with or without fitting on, expecting a perfect match. Existing online cloth shopping systems are capable of providing only 2D images of the clothes, which does not lead to a perfect match for the individual user. To overcome this problem, the apparel industry conducts many studies to reduce the time gap between cloth selection and final purchase by introducing "virtual dressing rooms." This paper discusses the design and implementation of an augmented reality "virtual dressing room" for real-time simulation of 3D clothes. The system is developed using a single Microsoft Kinect V2 sensor as the depth sensor to obtain user body parameter measurements, including 3D measurements such as the circumferences of the chest, waist, hip, thigh, and knee, to develop a unique model for each user. The size category of the clothes is chosen based on the measurements of each customer.
The Unity3D game engine was incorporated for overlaying 3D clothes virtually on the user in real time. The system is also equipped with gender identification and gesture controls to select the clothes. The developed application successfully augmented the selected dress model with physics motions according to the physical movements made by the user, which provides a realistic fitting experience. The performance evaluation reveals that a single depth sensor can be applied to the real-time simulation of 3D clothes with less than 10% average measurement error.

1. Introduction

At present, physically trying on clothes is difficult because it is a time-consuming process. Even with a store assistant, it is difficult for a customer to find the best-matching clothes. On the vendor's side, it is difficult and time consuming to repack all the misplaced clothes taken out by customers. In online shopping systems, it is virtually impossible to find a suitable design just by looking at a few 2D images of a cloth. That is where virtual dressing rooms come into play, enabling customers to try on different clothes virtually in front of a large screen [1, 2]. This solution enables customers to choose the best-matched clothes within a shorter time, with a stimulating new experience. A research work on increasing online and offline exploratory behavior, patronage, and purchase intentions, published in 2018 [3], shows that the presence of a virtual fitting room (VFR) on a website increases customers' curiosity about the product and their intention towards purchasing it. This research concluded that having a virtual dressing room supported customers in buying clothes with enhanced satisfaction.

Today, virtual reality and mixed reality play a huge role in overcoming many existing problems in day-to-day life. In such applications, depth information plays a vital role.
Hindawi, Advances in Human-Computer Interaction, Volume 2020, Article ID 1314598, 10 pages. https://doi.org/10.1155/2020/1314598



Hence, RGB-D sensors are widely used in such developments. Robot navigation [4, 5], home automation [6], and virtual laboratories for education [7, 8] are some recently published research studies in which RGB-D sensors were used for the development.

In this proposed work, our main goal is to implement a practical solution that can provide greater satisfaction. This can be achieved by implementing an augmented reality experience in the most realistic way, including physics motions applied to the dress according to the movements made by the customer, using a single depth sensor. The proposed system is developed with a focus on reducing cost and hardware components while supporting 3D models of local cloth designs to enhance scalability. For the system implementation, we used a Kinect V2 sensor equipped with several sensors: it can detect depth using an IR sensor, capture RGB images (video) using a camera sensor, and capture audio using an inbuilt microphone array. The Kinect sensor has an internal processor to process and manipulate data before sending it to the software development kit (SDK) [9], reducing the workload on the PC side. The developed system consists of two applications. The Windows Presentation Foundation (WPF) application captures user body measurements, and the Unity3D application overlays the cloth model on the user in front of the device. The necessary body parameters for clothing, such as height, shoulder length, arm length, inseam, and neck-to-hip length, are calculated incorporating the Kinect skeleton joints. The complex body parameters, such as the perimeters at the chest, waist, hip, thigh, and knee, are calculated using information obtained from the depth sensor of the Kinect. The applicability of a single depth sensor to measure human body parameters was also investigated. At this stage, selecting the size category of the cloth depends on the correct measurements of the body parameters mentioned above. Once the measurements are taken, the system identifies the gender of the user through the developed Unity application with a cognitive service. Subsequently, the filtered 3D cloth models are populated on a sliding bar, where the user can try on the available models virtually using gesture commands. The selected model is overlaid on top of the user on the video stream in real time. Results indicate that a single depth sensor can be used successfully for the proposed augmented reality virtual dressing room.

In the next section, we give a detailed literature survey on virtual dressing rooms implemented using different techniques. Next, we present our work in this research in two steps: noncontact human body parameter measurement and real-time simulation of 3D clothes. The experimental results are discussed in the subsequent section, and finally, the conclusions and further directions are provided.

2. Related Work

In recent literature, several technologies have been proposed for implementing virtual dressing rooms using webcams, camera arrays, and depth sensors. In 2006, Ehara and Saito [10] used a web camera to virtually overlay user-selected textures on the surfaces of T-shirts. The proposed method was implemented with image-based systems incorporating prelearned algorithms and matching them with the captured image of a user standing in front of a blue screen. Protopsaltou et al. [11] presented an Internet-based virtual fitting room that makes use of standard body measurements to create virtual 3D bodies and markers for motion capturing. The major drawback of the above techniques is the inability to do real-time simulations. A virtual fitting room based on Android mobile devices was proposed by Garcia Martin and Oruklu in 2012 [12]. A mobile phone camera was used as the main sensor, and no other equipment was needed for the proposed work. Since a mobile device is equipped with limited storage and processing power, processing real-time 3D simulations is not yet possible at a satisfactory level. Hauswiesner et al. implemented a virtual dressing room using several cameras [13], where a user is not required to be at a specific viewpoint or in a specific pose to interact with the system; this was not possible with a single-camera implementation. Efficient Graphics Processing Unit- (GPU-) based algorithms were used to develop this system, allowing real-time 3D processing.

Research work related to virtual dressing rooms accelerated with the high usage of depth sensors at the consumer level. In this regard, a virtual try-on system using a Microsoft Kinect V1 and an HD camera was introduced by Giovanni et al. in 2012 [14]. The HD camera was required due to the insufficient resolution of the Kinect V1's RGB camera. They modeled 115 3D clothing items for the proposed system. In their review on "Augmented Reality Platforms for Virtual Fitting Rooms," KinectShop [15], Bodymetrics [16], "Imagine That" [17], and Fitnect [6] were indicated as the successful applications in the virtual fitting room industry [15].

Furthermore, a Kinect sensor was used in the KinectShop and Fitnect applications, where a user can try on a 3D cloth model in real time. Eight PrimeSense depth sensors were used in the Bodymetrics application for 3D cloth simulation according to the movement of the user. In 2014, Umut Gultepe and Gudukbay proposed a real-time 3D cloth simulation on a virtual avatar, in which the necessary heights and widths for scaling the avatar and the cloth were obtained using the depth map and skeleton joints [18] obtained from depth sensors. Miri Kim and Kim Cheeyong proposed a "Magic Mirror" system, which makes use of a depth sensor to obtain real-time user body parameters and compose 3D outfits accordingly. Apart from cloth simulation, this system facilitates hair and make-up functions [19].

Although the implementation of a virtual dressing room using a single depth sensor has already been attempted [6, 18-20], these studies did not include complex 3D body parameters such as the perimeters at the chest, waist, hip, thigh, and knee. These are essential for selecting the size category for a user. In our proposed work, we incorporated the complex body parameters mentioned above for filtering the cloth sizes, with automated gender identification based on a captured frame of the user. Hence, the developed system filters suitable apparel designs for a user more effectively and efficiently.


3. Implementation

The developed augmented reality-based virtual fitting room has two main processing stages: noncontact user body parameter measurement and overlaying three-dimensional (3D) clothes on the user in real time. A Microsoft Kinect V2 depth sensor was used to gather the necessary user body parameters, and the Unity3D game engine was incorporated for overlaying the 3D cloth models on the user. The overview of the proposed system is shown in Figure 1. The gender identification feature is included in the proposed work to filter the available 3D cloth models to those that suit a particular customer's body parameters. The gesture recognition capability built into the system enables virtual try-on of different clothes.

3.1. Body Parameter Measuring Application. In order to capture the user body parameters using the depth sensor, we implemented a C# Windows Presentation Foundation (WPF) application. This application gathers the necessary body parameters of the user who stands in front of the Kinect sensor device. Once the application is started, the Kinect sensor begins to capture the environment data through its RGB camera and IR depth sensor. Subsequently, the sensor middleware removes background noise and identifies the moving objects based on image processing techniques [21]. The application guides the user to maintain the T-pose in order to initialize the body parameter measurement process (Figure 2). The necessary body parameters for clothing, such as height, shoulder length, arm length, inseam, and neck-to-hip length, and the appropriate skeleton joint distances (Figure 3) were calculated using direct Kinect application programming interfaces (APIs). The Kinect V2 sensor captures up to 25 skeleton joints (Figure 4(a)) for human body mapping, utilizing both RGB color space and IR depth space information [22]. The required skeleton joints were selected among these 25 skeleton joints for calculating the necessary body parameters mentioned above.

The camera space skeleton joint coordinates were used to calculate the length of a bone segment as given in equation (1). As an example, to calculate the shoulder length, the lengths of the "right-spine shoulder" bone and the "left-spine shoulder" bone were added, as indicated by the red bone segments in Figure 2. In [15], the authors discussed the process of calculating these body parameter lengths in detail:

length of a bone segment = \sqrt{(x_{j_n} - x_{j_{n+1}})^2 + (y_{j_n} - y_{j_{n+1}})^2 + (z_{j_n} - z_{j_{n+1}})^2},   (1)

where x, y, and z are the camera space coordinates in meters given by the Kinect API, j denotes the jth bone segment, and n denotes the nth skeleton point.

Figure 4(b) shows the camera space coordinate system of the Kinect. The origin of the camera space system is located at the center of the IR sensor (emitter and receiver) [23]. Accordingly, the bone segment length can be calculated in meters using equation (1).
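Equation (1) is the Euclidean distance between the camera-space coordinates of two adjacent skeleton joints. The sketch below illustrates the calculation in Python (the actual system uses C# and the Kinect APIs; the joint coordinates here are hypothetical values in meters):

```python
import math

def bone_length(joint_a, joint_b):
    """Equation (1): Euclidean distance between two camera-space joints,
    each given as an (x, y, z) tuple in meters."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(joint_a, joint_b)))

# Hypothetical camera-space joint coordinates (meters).
spine_shoulder = (0.00, 0.45, 2.10)
shoulder_left = (-0.18, 0.43, 2.12)
shoulder_right = (0.18, 0.43, 2.11)

# Shoulder length: sum of the left and right bone segments, as described
# for the red segments in Figure 2.
shoulder_length = (bone_length(spine_shoulder, shoulder_left)
                   + bone_length(spine_shoulder, shoulder_right))
```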

Other than the abovementioned body parameters, 3D parameters such as the chest, waist, hip, thigh, and knee circumferences are required for garmenting. In practice, the chest circumference and the waist of a male are measured around 1-2 inches below the armpit and around one inch below the navel, respectively. On the other hand, when measuring the chest circumference of a female, the measurement is taken around the bust at the apex. The female waist measurement is taken around the smallest natural point at the waist. The perimeter of the hip is measured around the largest point at the hip for both genders [24]. In this proposed work, the depth sensor information of the Kinect sensor was used to obtain the body parameters at the chest, waist, hip, thigh, and knee. We obtained the body index image by removing the background around the user; Figure 5 shows the body index image. Here, the relevant depth pixels of the user give the depth of each pixel with respect to the sensor origin. In order to obtain the perimeter at any given y coordinate, depth information for the front and rear views of the user is required. Therefore, the developed system scans the front and rear of the user in two steps. At first, the user has to maintain a T-pose facing the sensor for 5 seconds. In the next 5 seconds, the user has to maintain a T-pose facing away from the sensor. At each step, depth information along the y coordinates of the chest, waist, hip, thigh, and knee was recorded. The y coordinates of the chest, waist, hip, thigh, and knee points were calculated as follows by incorporating the skeleton joints with actual body point observations:

Y_{chest} = Y_{SpineShoulder} + (Y_{SpineMid} - Y_{SpineShoulder}) \times 6/8,   (2)

Y_{waist} = Y_{SpineMid} + (Y_{SpineBase} - Y_{SpineMid}) \times 3/8,   (3)

Y_{hip} = Y_{SpineMid} + (Y_{SpineBase} - Y_{SpineMid}),   (4)

Y_{thigh} = Y_{SpineBase} + (Y_{HipRight} - Y_{KneeRight}) \times 1/2,   (5)

Y_{knee} = Y_{KneeRight}.   (6)

Here, the skeleton joints were obtained in physical-world (camera space) coordinates; hence, equations (2) to (6) give the y position in meters. The coefficients of each of the above equations were derived empirically by comparing manual and application measurements. After converting the camera coordinates into color space coordinates (using the Kinect SDK's CoordinateMapper API), the corresponding y positions in color space were calculated to visualize the chest, waist, hip, thigh, and knee levels, shown as horizontal red arrows in the body index image (Figure 5). To calculate the perimeter at any given y coordinate, the x and z coordinates along the defined line must be obtained in meters. The body index image shown in Figure 5 is used to obtain the relevant x pixel coordinates (color space) along the red arrow lines at each level, corresponding to the user's body.
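Equations (2)-(6) locate each measurement level as a fixed fractional offset between two skeleton joints. A minimal sketch of this mapping (Python for illustration; the joint y values are hypothetical camera-space heights in meters):

```python
def body_levels(y):
    """Compute the chest, waist, hip, thigh, and knee y levels
    (camera space, meters) per equations (2)-(6)."""
    return {
        "chest": y["SpineShoulder"] + (y["SpineMid"] - y["SpineShoulder"]) * 6 / 8,
        "waist": y["SpineMid"] + (y["SpineBase"] - y["SpineMid"]) * 3 / 8,
        "hip": y["SpineMid"] + (y["SpineBase"] - y["SpineMid"]),
        "thigh": y["SpineBase"] + (y["HipRight"] - y["KneeRight"]) * 1 / 2,
        "knee": y["KneeRight"],
    }

# Hypothetical skeleton joint heights (meters, camera space).
joints_y = {"SpineShoulder": 0.45, "SpineMid": 0.15, "SpineBase": -0.15,
            "HipRight": -0.17, "KneeRight": -0.55}
levels = body_levels(joints_y)
```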

The background of the depth image was removed by considering the body index image (Figure 5) while matching them to the correct coordinates with the help of the CoordinateMapper API. Using the virtual horizontal lines (as marked in Figure 6) bound to the user's body, the x and z coordinates were obtained for the chest, waist, hip, thigh, and knee (see Figure 6). Accordingly, the relevant (x, y, z) camera space coordinates were obtained in meters. Then equation (7) was used to calculate the perimeter at each level, incorporating the front and rear sides of the user. The accuracy of the calculated perimeter values is further improved by using the average of two nearby horizontal lines at each position mentioned above. Figure 6 shows these lines with curved arrows at each level for the front view of the user's body:

perimeter of a curve = \sum_{n=1}^{m-1} \sqrt{(y_{p_n} - y_{p_{n+1}})^2 + (x_{p_n} - x_{p_{n+1}})^2 + (z_{p_n} - z_{p_{n+1}})^2},   (7)

Figure 1: Augmented reality-based virtual fitting room overview. The WPF body parameter measurement application collects depth and user information, gathers the user body metrics, and saves the user body parameters. The Unity3D 3D cloth overlaying application performs gender identification, loads the user body parameters, filters the available 3D models, collects depth and user information, and simulates real-time 3D overlaying.

Figure 2: The T-pose.

Figure 3: Lengths: (1) arm, (2) shoulder, (3) neck-to-hip, (4) inseam.


where m is the maximum number of pixels for a given horizontal/vertical line in color space obtained from the body index image, p_n is the nth pixel, and p_{n+1} is the (n+1)th pixel. Precise measurements were obtained by taking the median value of each measurement for the front and rear sides. The median is used to remove outliers, which may occur due to unwanted pixels from the Kinect. Finally, the perimeter at each position was obtained by summing the relevant front and rear perimeters. Figure 7 shows the combined front and rear perimeters at the chest, waist, hip, thigh, and knee for a tested body (front view) after carrying out the necessary adjustments for aligning to a single vertical axis.
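Equation (7), together with the median filtering described above, can be sketched as follows (Python for illustration; the real pixel-to-camera-space mapping comes from the Kinect's CoordinateMapper, so the sample points here are hypothetical):

```python
import math
from statistics import median

def curve_perimeter(points):
    """Equation (7): sum of 3D distances between consecutive points
    sampled along one horizontal body line (camera space, meters)."""
    return sum(math.dist(points[n], points[n + 1])
               for n in range(len(points) - 1))

def robust_perimeter(front_scans, rear_scans):
    """Take the median perimeter over repeated front and rear scans to
    suppress outlier pixels, then sum the two half-perimeters."""
    return (median(curve_perimeter(scan) for scan in front_scans)
            + median(curve_perimeter(scan) for scan in rear_scans))
```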

The values obtained for the body parameters are used for model categorization. The developed WPF application for noncontact human body parameter measurement finally returns the calculated size category for tops. These measured parameters were injected into the Unity3D application to filter the 3D cloth models. The size category for tops was selected as per the size categories in Table 1. At this stage, we considered the largest perimeter value among the three perimeters relevant to the chest, waist, and hip to make a decision on the size category that best suits the customer.
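The size-selection logic can be sketched from Table 1 (Python for illustration; the first-fit lookup and the clamping of out-of-range values are assumptions, since the paper states only that the largest of the three perimeters governs the chosen category):

```python
# Upper bounds of the Table 1 ranges (inches) for each size category.
SIZES = ["XXS", "XS", "S", "M", "L", "XL", "XXL"]
CHEST_MAX = [34, 36, 38, 41, 44, 47, 50]
WAIST_MAX = [25, 28, 31, 34, 39, 40, 43]
HIP_MAX = [33, 36, 39, 42, 45, 48, 51]

def first_fit(value, bounds):
    """Smallest size index whose upper bound covers the measurement;
    measurements beyond XXL are clamped to the largest size."""
    for i, upper in enumerate(bounds):
        if value <= upper:
            return i
    return len(bounds) - 1

def size_category(chest, waist, hip):
    """Choose the category driven by the most demanding of the three
    perimeter measurements (all in inches)."""
    return SIZES[max(first_fit(chest, CHEST_MAX),
                     first_fit(waist, WAIST_MAX),
                     first_fit(hip, HIP_MAX))]
```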

3.2. Unity3D Application: 3D Cloth Overlaying Application. The recorded body parameters are injected into the 3D cloth overlaying application developed using the Unity3D game engine. Initially, according to the body parameter measurements, the suitable apparel category for the customer is identified as per Table 1. To identify the gender of the user, the Unity3D application uses a cognitive service [25] developed by Microsoft and hosted in Azure Cognitive Services. This machine learning, vision-based service can return a confidence score for a person in an image using an internal database. Using the service, it can identify the gender along with a few other parameters, such as emotion, pose, smile, and facial hair, together with 27 landmarks for each face in the image. In the WPF application, once the Kinect SDK completes user identification, an image frame containing the body is submitted to the cognitive service to identify the


Figure 4: (a) Kinect V2 skeleton joints [22]; (b) the camera space coordinate system [23].

Figure 6: Complex 3D garment measurements: front perimeters at chest, waist, hip, thigh, knee, and wrist.

Figure 5: Background-removed body index image with front horizontal curves in depth space.


gender of the person. Then, the 3D cloth models were filtered according to the gender and the apparel size category. The system contains an internal 3D garment model database. This database contains all the 3D cloth models with tags that indicate the gender and the apparel size category. Once the gender was recognized by the service, the 3D garment models were filtered based on the gender and the size category, according to the availability of real clothes in the retail shop.
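The gender- and size-based filtering step amounts to a tag lookup over the garment database; a minimal sketch (Python for illustration; the model records and their field names are hypothetical):

```python
# Hypothetical stand-in for the internal 3D garment model database; each
# model is tagged with gender and apparel size category, as described above.
MODELS = [
    {"name": "polo_01", "gender": "male", "size": "M", "in_stock": True},
    {"name": "dress_02", "gender": "female", "size": "M", "in_stock": True},
    {"name": "tee_03", "gender": "male", "size": "M", "in_stock": False},
]

def filter_models(models, gender, size):
    """Keep only models matching the detected gender and derived size
    category that correspond to real clothes available in the shop."""
    return [m for m in models
            if m["gender"] == gender and m["size"] == size and m["in_stock"]]
```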

Here, the 3D design software "3ds Max" was used for rigging and creating/editing the 3D cloth models [26]. Once rigged, the 3D model was imported into the Unity3D project. Then, using the configuration menu for the humanoid model, it was matched with the joints of the human model [27]. The resulting model can be converted into a controllable 3D humanoid model.

To create an animated or controllable model, the selected model needs to be rigged. This helps in animating the cloth model according to the skeleton movements. Several methods have been used in the literature to achieve this, such as automated rigging methods [28, 29]. Instead of using automated methods, we used a manual process to verify and build the skeleton on top of the cloth model. This is mainly due to the requirement of a humanoid avatar that matches the skeleton; here, we use only a cloth model instead of an avatar model. A rigged system can be created in two different ways with 3ds Max: the prebuilt "Biped" system and a customized "bone IK chain." We used the bone IK chain (Figure 8). This requires manual work, such as aligning the cloth model and the skeleton as if the cloth were dressed on a real human body.

Then the necessary materials can be applied to the model as its material design (Figure 9). Subsequently, the cloth model is bound as the skin of the skeleton and saved with the *.fbx [30] extension before being imported into Unity3D as a humanoid cloth model. Finally, the rigged 3D cloth was bound to the skeleton provided by the Kinect device using Unity3D, where the Kinect device and the Kinect SDK for Unity3D were used to identify the person and retrieve depth information. The Kinect API was used for further manipulation and real-time overlaying. These clothes are populated on a slide bar at the side of the screen. Using a hand gesture (swiping left to right or vice versa), the user can select a different cloth model. The developed system also contains an auto-exit (10 seconds) mechanism for when the user is no longer present in front of the Kinect device.
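A left/right swipe can be classified from the hand's horizontal displacement over a short window of tracked positions; the sketch below is a simplified, hypothetical version (the real system uses the Kinect SDK's gesture facilities within Unity3D, and the 0.25 m threshold is an assumption):

```python
def detect_swipe(hand_x_history, threshold=0.25):
    """Classify a swipe from recent hand x positions (camera space, meters).
    Returns 'right', 'left', or None if the motion is too small."""
    if len(hand_x_history) < 2:
        return None
    displacement = hand_x_history[-1] - hand_x_history[0]
    if displacement > threshold:
        return "right"
    if displacement < -threshold:
        return "left"
    return None
```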

4. Results and Discussion

The WPF application was developed to obtain noncontact body parameter measurements. This application gathers the necessary body parameters of the user who stands in front of the Kinect sensor device. In this proposed work, complex 3D parameters such as the perimeters of the chest, waist, hip, thigh, and knee were considered for selecting the apparel size category and rigging the 3D model onto the user's body in real time. Hence, the application guides the user to maintain a T-pose facing towards and away from the sensor for 5 seconds each in order to capture the necessary depth information. Figure 10 shows the developed WPF application

Figure 7: Complex 3D garment measurements: perimeters at chest, waist, hip, thigh, and knee (units in cm).

Table 1: Unisex size category for tops* [18].

Size    XXS     XS      S       M       L       XL      XXL
Chest   32-34   34-36   36-38   39-41   42-44   45-47   48-50
Waist   23-25   26-28   29-31   32-34   35-39   38-40   41-43
Hip     28-33   31-36   34-39   37-42   40-45   43-48   46-51

*All units in inches.

Figure 8: Aligned skeleton and the cloth model.


user interface. After capturing the necessary 3D information, the necessary body parameters were calculated using equations (1)-(7). Then the apparel size category was derived. Finally, the overlaying application is launched with all measured parameters to filter the 3D cloth models based on gender.

The output of the developed system is presented in Figure 11. Here, the user can select and virtually try on the apparel available in the retail shop, filtered according to his or her body parameters. The items in the populated list can be selected using the gesture identification features integrated into the solution.

Table 2 presents a sample of human body parameter measurements obtained by the developed system and manually for eleven males. Manual measurements were taken by a single person using a measuring tape. The error for each case was calculated as per the following equation:

error (%) = \frac{manual measurement - application measurement}{manual measurement} \times 100.   (8)

Considering all outliers and errors, the average error of each measurement lies below 10%.
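Equation (8) and the per-column averages of Table 2 can be reproduced directly (Python for illustration; the absolute value reflects how Table 2 reports error magnitudes):

```python
def relative_error(manual, application):
    """Equation (8): percentage error of the application measurement
    relative to the manual tape measurement."""
    return abs(manual - application) / manual * 100

def average_error(pairs):
    """Mean relative error over (manual, application) measurement pairs."""
    return sum(relative_error(m, a) for m, a in pairs) / len(pairs)

# Example from Table 2, user M1: shoulder 17 in (manual) vs 16.88 in (measured).
err_shoulder = relative_error(17.0, 16.88)  # about 0.71%
```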

Figure 9: Adding material to the cloth-skinned rig.

Figure 10: WPF application: noncontact body parameter measurement. The interface displays the calculated body parameters (height, neck-to-hip distance, inseam distance, arm length, shoulder length, and chest, waist, and hip circumferences, all in inches), the T-pose status, and a Record button.


If we take 10% as the boundary value, according to Table 2, the user M6 has several extreme values, while the other users have smaller error percentages. Several reasons produce a high error percentage in some measurements obtained through the developed system. Users wearing baggy/large-type tops or trousers when taking the measurements is one of the main reasons for the results deviating from the manual measurements. As an example, Figure 12 shows a body index image of a person wearing a baggy top and a pair of trousers, where the boundary of the user deviates from the actual user's body due to the dress. In addition, the dress may have wrinkles. The errors introduced by these factors during the x and z coordinate measurements lead to deviations from the manual measurements.

Figure 11: Virtual dressing room for 3D clothes simulation.

Table 2: Manual measurements and noncontact body parameter measurements (units in inches). Shoulder, neck-to-hip, arm, and inseam are length measurements; chest, hip, waist, thigh, and knee are perimeter measurements.

User                        Shoulder  Neck-to-hip  Arm     Inseam  Chest   Hip     Waist   Thigh   Knee
M1   Manual                 17.00     30.50        25.50   29.75   32.25   37.50   31.00   22.00   14.50
     Application value      16.88     28.41        24.97   28.93   35.15   35.48   31.24   19.56   13.93
     Relative error (%)     0.71      6.85         2.08    2.76    8.99    5.39    0.77    11.09   3.93
M2   Manual                 16.75     28.25        24.50   29.50   33.50   36.25   32.50   20.00   13.50
     Application value      16.69     28.50        25.02   28.99   34.51   36.55   30.86   21.78   14.59
     Relative error (%)     0.36      0.88         2.12    1.73    3.01    0.83    5.05    8.90    8.07
M3   Manual                 17.00     26.50        22.50   31.50   32.00   33.25   27.50   18.00   13.50
     Application value      16.98     28.61        24.65   30.03   33.34   33.52   27.63   20.13   13.28
     Relative error (%)     0.12      7.96         9.56    4.67    4.19    0.81    0.47    11.83   1.63
M4   Manual                 17.00     29.25        25.00   30.25   33.50   36.00   30.50   21.00   14.50
     Application value      16.58     29.85        25.18   30.89   31.68   36.19   30.00   19.93   14.02
     Relative error (%)     2.47      2.05         0.72    2.12    5.43    0.53    1.64    5.10    3.31
M5   Manual                 16.00     29.25        26.50   30.75   32.50   32.50   27.00   18.25   13.25
     Application value      18.00     29.17        26.68   32.44   35.42   38.45   33.96   20.21   14.09
     Relative error (%)     12.50     0.27         0.68    5.50    8.98    18.31   25.78   10.74   6.34
M6   Manual                 17.00     39.75        25.25   31.00   33.50   35.50   30.25   20.50   13.50
     Application value      17.06     35.67        26.21   31.94   32.38   35.48   27.41   21.87   18.10
     Relative error (%)     0.35      10.26        3.80    3.03    3.34    0.06    9.39    6.68    34.07
M7   Manual                 19.50     30.50        26.50   29.50   40.75   43.25   40.25   24.00   19.25
     Application value      18.93     30.33        25.50   33.08   42.15   43.85   39.57   24.41   15.61
     Relative error (%)     2.92      0.56         3.77    12.14   3.44    1.39    1.69    1.71    18.91
M8   Manual                 18.25     28.00        24.00   30.25   33.75   33.75   29.50   19.50   13.25
     Application value      17.09     29.45        25.40   29.81   33.81   34.94   28.25   19.19   13.25
     Relative error (%)     6.36      5.18         5.83    1.45    0.18    3.53    4.24    1.59    0.00
M9   Manual                 17.50     27.25        24.00   28.75   34.50   38.75   30.25   22.00   14.75
     Application value      17.33     30.54        24.06   28.98   35.24   37.00   29.68   22.03   14.49
     Relative error (%)     0.97      12.07        0.25    0.80    2.14    4.52    1.88    0.14    1.76
M10  Manual                 16.75     27.50        23.50   28.50   36.00   36.50   34.00   21.25   14.00
     Application value      17.27     27.78        23.66   28.90   34.34   38.84   29.01   19.54   13.66
     Relative error (%)     3.10      1.02         0.68    1.40    4.61    6.41    14.68   8.05    2.43
M11  Manual                 17.75     29.75        26.00   30.75   32.00   36.00   30.00   19.50   13.00
     Application value      17.46     29.40        24.94   32.09   34.57   37.68   31.65   20.17   14.29
     Relative error (%)     1.63      1.18         4.08    4.36    8.03    4.67    5.50    3.44    9.92
     Average error (%)      2.86      4.39         3.05    3.63    4.76    4.22    6.46    6.30    8.22

8 Advances in Human-Computer Interaction

developed system considers the vertical blue lines as shownin Figure 12 as the boundaries at the chest level although theactual boundary at the chest level is indicated in yellow linesis error occurs due to the bagginess of the userrsquos shirtresulting in the application measurements being deviatedfrom the manual measurements

Further if the user maintains improper T-pose as shownin Figure 13 the corresponding x coordinates of the chestlevel may take a wider range due to the boundary of the bodyindex image consisting of hands e vertical blue lines inFigure 13 are the boundary at the chest level e horizontalblue line gives the x coordinates corresponding to the chestlevel which is wider than the actual chest measurement

In addition to the above practical errors the Kinectcannot identify black shiny items using an inbuilt IR sensorarray As shown in Figure 14 the user wore a black beltwherein the body index image is identified as the back-ground Due to this reason some measurements can beincorrect even when the user is wearing the required clothsfor the measuring application

5 Conclusions

We used a single RGB-D Kinect sensor to obtain user bodyparameter measurements including 3D measurements suchas perimeters of chest waist hip thigh and knee to developan augmented reality virtual dressing room e developedapplication successfully appends the physics animation tothe dress according to the physical movements made by theuser providing a realistic fitting experience e perfor-mance evaluation reveals that a single depth sensor is ap-plicable in real-time 3D cloth simulation with less than 10of average measurement error

Data Availability

e data that support the findings of this study are openlyavailable at httpsgitlabcomsasadaravdr-documents

Conflicts of Interest

e authors declare that they have no conflicts of interest

Acknowledgments

e authors thank the Department of Physics Faculty ofApplied Sciences University of Sri Jayewardenepura SriLanka for providing facilities for this study and providingfunding under the university grant ASP01RESCI201965Also the authors would like to thank Mr Duleeka Guna-tilake Zone 24x7 Colombo Sri Lanka for fruitfuldiscussion

References

[1] Fitnect Interactive Kit, "3D virtual fitting dressing room/mirror," 2019, https://www.fitnect.hu/.

[2] Styku, "Virtual fitting room and body scanning," 2019, https://www.styku.com/.

[3] M. Beck and D. Crie, "I virtually try it ... I want it! Virtual Fitting Room: a tool to increase on-line and off-line exploratory behavior, patronage and purchase intentions," Journal of Retailing and Consumer Services, vol. 40, pp. 279–286, 2018.

Figure 12: The person wearing a too baggy shirt and trousers.

Figure 13: Incorrect T-pose.

Figure 14: The person wearing a pure black color belt.

[4] S. I. A. P. Diddeniya, W. K. I. L. Wanniarachchi, P. R. S. De Silva, and N. C. Ganegoda, "Efficient office assistant robot system: autonomous navigation and controlling based on ROS," International Journal of Multidisciplinary Studies (IJMS), vol. 6, no. 1, 2019.

[5] S. I. A. P. Diddeniya, A. M. S. B. Adikari, H. N. Gunasinghe, P. R. S. De Silva, N. C. Ganegoda, and W. K. I. L. Wanniarachchi, "Vision based office assistant robot system for indoor office environment," in Proceedings of the 3rd International Conference on Information Technology Research, Moratuwa, Sri Lanka, December 2019.

[6] K. A. S. V. Rathnayake, W. K. I. L. Wanniarachchi, and W. H. K. P. Nanavakkara, "Human computer interaction system for impaired people by using Kinect motion sensor: voice and gesture integrated smart home," in Proceedings of the Second International Conference on Inventive Communication and Computational Technologies, Coimbatore, India, April 2018.

[7] Z. Zhang, M. Zhang, Y. Chang, E. S. Aziz, S. K. Esche, and C. Chassapis, "Collaborative virtual laboratory environments with hardware in the loop," in Cyber-Physical Laboratories in Engineering and Science Education, pp. 363–402, Springer, Cham, Switzerland, 2018.

[8] Y. M. Chang, E. S. S. Aziz, Z. M. Zhang, M. M. Zhang, and S. K. Esche, "Usability evaluation of a virtual educational laboratory platform," The ASEE Computers in Education (CoED) Journal, vol. 7, no. 1, p. 24, 2016.

[9] K. Khoshelham and S. O. Elberink, "Accuracy and resolution of Kinect depth data for indoor mapping applications," Sensors, vol. 12, no. 2, pp. 1437–1454, 2012.

[10] J. Ehara and H. Saito, "Texture overlay for virtual clothing based on PCA of silhouettes," in Proceedings of the 5th International Symposium on Mixed and Augmented Reality, IEEE Computer Society, Washington, DC, USA, pp. 139–142, October 2006.

[11] D. Protopsaltou, C. Luible, M. Arevalo-Poizat, and N. Magnenat-Thalmann, "A body and garment creation method for an internet-based virtual fitting room," in Proceedings of Computer Graphics International (CGI '02), Springer, Berlin, Germany, pp. 105–122, July 2002.

[12] C. Garcia Martin and E. Oruklu, "Human friendly interface design for virtual fitting room applications on Android based mobile devices," Journal of Signal and Information Processing, vol. 3, no. 4, pp. 481–490, 2012.

[13] S. Hauswiesner, M. Straka, and G. Reitmayr, "Virtual try-on through image-based rendering," IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 9, pp. 1552–1565, 2013.

[14] S. Giovanni, Y. C. Choi, and J. Huang, "Virtual try-on using Kinect and HD camera," in Motion in Games, Springer, Heidelberg, Germany, 2012.

[15] I. Pachoulakis and K. Kapetanakis, "Augmented reality platforms for virtual fitting rooms," The International Journal of Multimedia & Its Applications, vol. 4, no. 4, 2012.

[16] Bodymetrics, 2019, http://bodymetrics.com.

[17] Imagine That Technologies, 2011, http://www.imaginethattechnologies.com, August 2019.

[18] U. Gultepe and U. Gudukbay, "Real-time virtual fitting with body measurement and motion smoothing," Computers & Graphics, vol. 43, pp. 31–43, 2014.

[19] M. Kim and C. Kim, "Augmented reality fashion apparel simulation using a magic mirror," International Journal of Smart Home, vol. 9, no. 2, pp. 169–178, 2015.

[20] K. Yousef, B. Mohd, and M. AL-Omari, "Kinect-based virtual try-on system: a case study," in Proceedings of the IEEE Jordan International Joint Conference on Electrical Engineering and Information Technology, pp. 91–96, Amman, Jordan, April 2019.

[21] A. M. S. B. Adikari, N. G. C. Ganegoda, and W. K. I. L. Wanniarachchi, "Non-contact human body parameter measurement based on Kinect sensor," IOSR Journal of Computer Engineering, vol. 19, no. 3, pp. 80–85, 2017.

[22] JointType enumeration, https://docs.microsoft.com/en-us/previous-versions/windows/kinect/dn758662(v=ieb.10), August 2019.

[23] Coordinate mapping, https://docs.microsoft.com/en-us/previous-versions/windows/kinect/dn785530(v=ieb.10), August 2019.

[24] R. Synergy, "Sizing chart," 2019, http://www.rawsynergy.com/sizing-chart.

[25] M. Azure, "Face API: facial recognition software | Microsoft Azure," 2019, https://azure.microsoft.com/en-us/services/cognitive-services/face.

[26] Z. Bhati, A. Waqas, M. Karbasi, and A. W. Mahesar, "A wire parameter and reaction manager based biped character setup and rigging technique in 3Ds Max for animation," International Journal of Computer Graphics & Animation, vol. 5, no. 2, pp. 21–36, 2015.

[27] Unity Technologies, "Unity manual: using humanoid characters," 2019, https://docs.unity3d.com/Manual/UsingHumanoidChars.html.

[28] A. Feng, D. Casas, and A. Shapiro, "Avatar reshaping and automatic rigging using a deformable model," in Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games (MIG '15), pp. 57–64, New York, NY, USA, 2015.

[29] I. Baran and J. Popovic, "Automatic rigging and animation of 3D characters," ACM Transactions on Graphics, vol. 26, no. 3, p. 72, 2007.

[30] S. Adikari, "Batik Sarong 3D model, TurboSquid 1189246," 2019, https://www.turbosquid.com/FullPreview/Index.cfm/ID/1189246.


Hence, RGB-D sensors are widely used in such developments. Robot navigation [4, 5], home automation [6], and virtual laboratories for education [7, 8] are some recently published research studies in which RGB-D sensors were used.

In this proposed work, our main goal is to implement a practical solution that can provide greater satisfaction. This can be achieved by implementing an augmented reality experience in the most realistic way, including physics motions of the dress according to the movements made by the customer, using a single depth sensor. The proposed system is developed with a focus on reducing cost and hardware components while supporting 3D models of local cloth designs to enhance scalability. For the system implementation, we used a Kinect V2 sensor, which is equipped with several sensors: it can detect depth using an IR sensor, capture RGB images (video) using a camera sensor, and capture audio using an inbuilt microphone array. The Kinect sensor has an internal processor to process and manipulate data before sending them to the software development kit (SDK) [9], reducing the workload on the PC side. The developed system consists of two applications: a Windows Presentation Foundation (WPF) application that captures the user body measurements and a Unity3D application that overlays the cloth model on the user in front of the device. The necessary body parameters for clothing, such as height, shoulder length, arm length, inseam, and neck-to-hip length, are calculated incorporating the Kinect skeleton joints. The complex body parameters, such as the perimeters at the chest, waist, hip, thigh, and knee, are calculated using information obtained from the depth sensor of the Kinect. The applicability of a single depth sensor to measure human body parameters was also investigated. At this stage, selecting the size category of cloth depends on the correct measurements of the body parameters mentioned above. Once the measurements are taken, the system identifies the gender of the user through the developed Unity application with a cognitive service. Subsequently, the filtered 3D cloth models are populated on a sliding bar, where the user can try on the available models virtually using gesture commands. The selected model is overlaid on top of the user on the video stream in real time. Results indicate that a single depth sensor can be used successfully for the proposed augmented reality virtual dressing room.

In the next section, we give a detailed literature survey on virtual dressing rooms implemented using different techniques. Next, we present the work proposed in this research in two steps: noncontact human body parameter measurements and real-time simulation of 3D clothes. The experimental results are discussed in the subsequent section, and finally, the conclusions and further directions are provided.

2. Related Work

In the recent literature, several technologies have been proposed for implementing virtual dressing rooms using webcams, camera arrays, and depth sensors. In 2006, Ehara and Saito [10] used a web camera to virtually overlay user-selected textures on the surfaces of T-shirts. The proposed method was implemented with image-based systems, incorporating prelearned algorithms and matching them with the captured image of a user standing in front of a blue screen. Protopsaltou et al. [11] presented an Internet-based virtual fitting room that makes use of standard body measurements to create virtual 3D bodies and markers for motion capturing. The major drawback of the above techniques is the inability to do real-time simulations. A virtual fitting room based on Android mobile devices was proposed by Garcia Martin and Oruklu in 2012 [12]. A mobile phone camera was used as the main sensor, and no other equipment was needed for the proposed work. Since a mobile device is equipped with limited storage and processing power, real-time 3D simulation is not yet possible at a satisfactory level. Hauswiesner et al. implemented a virtual dressing room using several cameras [13], where a user is not required to be in a specific viewpoint or pose to interact with the system; this was not possible with a single-camera implementation. Efficient Graphical Processing Unit- (GPU-) based algorithms were used to develop this system, allowing real-time 3D processing.

Research work related to virtual dressing rooms accelerated with the high usage of depth sensors at the consumer level. In this context, a virtual try-on system using a Microsoft Kinect V1 and an HD camera was introduced by Giovanni et al. in 2012 [14]. The HD camera was required due to the insufficient resolution of the RGB camera of the Kinect V1. They modeled 115 3D clothing items for the proposed system. In their review on "Augmented Reality Platforms for Virtual Fitting Rooms" [15], KinectShop, Bodymetrics [16], "Imagine That" [17], and Fitnect [6] were indicated as the successful applications in the virtual fitting room industry.

Furthermore, a Kinect sensor was used in the KinectShop and Fitnect applications, where a user can try on a 3D cloth model in real time. Eight PrimeSense depth sensors were used in the Bodymetrics application for 3D cloth simulation according to the movement of the user. In 2014, Gultepe and Gudukbay proposed a real-time 3D cloth simulation on a virtual avatar, in which the necessary heights and widths for scaling the avatar and the cloth were obtained using the depth map and skeleton joints [18] provided by depth sensors. Kim and Kim proposed a "Magic Mirror" system, which makes use of a depth sensor to obtain real-time user body parameters and compose the 3D outfits accordingly. Apart from cloth simulation, this system facilitates hair and make-up functions [19].

Although the implementation of a virtual dressing room using a single depth sensor has already been attempted [6, 18–20], these studies did not consider complex 3D body parameters such as the perimeters at the chest, waist, hip, thigh, and knee. These are essential for the selection of the size category for a user. In our proposed work, we incorporated the complex body parameters mentioned above for filtering the cloth sizes, together with automated gender identification based on a captured frame of the user. Hence, the developed system filters suitable apparel designs for a user more effectively and efficiently.

2 Advances in Human-Computer Interaction

3. Implementation

The developed augmented reality-based virtual fitting room has two main processing stages: noncontact user body parameter measurements and overlaying three-dimensional (3D) clothes on the user in real time. A Microsoft Kinect V2 depth sensor was used to gather the necessary user body parameters, and the Unity3D game engine was incorporated for overlaying the 3D cloth models on the user. The overview of the proposed system is shown in Figure 1. A gender identification feature is included in the proposed work to filter the 3D cloth models available for a particular customer which suit the body parameters. The gesture recognition capability built into the system enables virtual try-on of different clothes.

3.1. Body Parameter Measuring Application

In order to capture the user body parameters using the depth sensor, we implemented a C# Windows Presentation Foundation (WPF) application. This application gathers the necessary body parameters of the user who stands in front of the Kinect sensor device. Once the application is started, the Kinect sensor begins to capture the environment data through its RGB camera and IR depth sensor. Subsequently, the sensor middleware removes background noise and identifies the moving objects based on image processing techniques [21]. The application guides the user to maintain the T-pose in order to initialize the body parameter measurement process (Figure 2). The necessary body parameters for clothing, such as height, shoulder length, arm length, inseam, and neck-to-hip length, and the appropriate skeleton joint distances (Figure 3) were calculated using direct Kinect application programming interfaces (APIs). The Kinect V2 sensor captures up to 25 skeleton joints (Figure 4(a)) for human body mapping, where it utilizes both RGB color space and IR depth space information [22]. The required skeleton joints were selected among these 25 skeleton joints for calculating the necessary body parameters mentioned above.
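Before measuring, the application waits for a valid T-pose. A minimal readiness check of this kind can be sketched as follows; this is an illustrative Python sketch (the paper's implementation is C#/WPF), and the joint names follow the Kinect JointType naming while the tolerance value is an assumption:

```python
# Minimal T-pose check: in a T-pose, the elbows and wrists sit at roughly
# shoulder height. Joints are (x, y, z) camera-space coordinates in meters.

def is_t_pose(joints, y_tol=0.10):
    """Return True if both arms are horizontal within y_tol meters."""
    for side in ('Left', 'Right'):
        shoulder_y = joints[f'Shoulder{side}'][1]
        for joint in ('Elbow', 'Wrist'):
            if abs(joints[f'{joint}{side}'][1] - shoulder_y) > y_tol:
                return False
    return True

# Illustrative joint positions for a user holding a T-pose.
pose = {
    'ShoulderLeft': (-0.20, 1.40, 2.5), 'ElbowLeft': (-0.48, 1.41, 2.5),
    'WristLeft': (-0.75, 1.38, 2.5),
    'ShoulderRight': (0.20, 1.40, 2.5), 'ElbowRight': (0.48, 1.42, 2.5),
    'WristRight': (0.75, 1.39, 2.5),
}
print(is_t_pose(pose))  # arms level with the shoulders -> True
```

In the real application, a check like this would gate the 5-second scanning countdown described below.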

The camera space skeleton joint coordinates were used to calculate the length of a bone segment as given in equation (1). As an example, to calculate the shoulder length, the distances of the "right-spine shoulder" bone and the "left-spine shoulder" bone were added, as indicated by the red bone segments in Figure 2. In [15], the authors discussed the process of calculating these body parameter lengths in detail.

\[
\text{length of a bone segment} = \sqrt{\left(x_{jn} - x_{j(n+1)}\right)^{2} + \left(y_{jn} - y_{j(n+1)}\right)^{2} + \left(z_{jn} - z_{j(n+1)}\right)^{2}}, \tag{1}
\]

where x, y, and z are the camera space coordinates in meters given by the Kinect API, j denotes the j-th bone segment, and n denotes the n-th skeleton point. Figure 4(b) shows the camera space coordinate system of the Kinect. The origin of the camera space system is located at the center of the IR sensor (emitter and receiver) [23]. Accordingly, the bone segment length can be calculated in meters using equation (1).
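Equation (1) is a plain Euclidean distance between consecutive joints, and the shoulder length is the sum of the two segments named in the text. A short Python sketch (the joint coordinates here are made-up illustrative values, not Kinect output):

```python
import math

def segment_length(a, b):
    """Euclidean length of a bone segment between two camera-space
    joints, as in equation (1); coordinates are in meters."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

# Shoulder length as described in the text: the right-spine-shoulder and
# left-spine-shoulder segments added together (illustrative coordinates).
spine_shoulder = (0.00, 1.40, 2.50)
shoulder_left = (-0.20, 1.38, 2.50)
shoulder_right = (0.21, 1.39, 2.50)

shoulder_length = (segment_length(spine_shoulder, shoulder_left)
                   + segment_length(spine_shoulder, shoulder_right))
print(round(shoulder_length, 3))  # 0.411 m for these sample joints
```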

Other than the abovementioned body parameters, 3D parameters such as the chest, waist, hip, thigh, and knee circumferences are required for garmenting. In practice, the chest circumference and the waist of a male are measured around 1-2 inches below the armpit and around one inch below the navel, respectively. On the other hand, when measuring the chest circumference of a female, the measurement is taken around the bust at the apex, and the female waist measurement is taken around the smallest natural point at the waist. The perimeter of the hip is measured around the largest point at the hip for both genders [24]. In this proposed work, the depth sensor information of the Kinect sensor was used to obtain body parameters at the chest, waist, hip, thigh, and knee. We obtained the body index image by removing the background of the user; Figure 5 shows the body index image. Here, the relevant depth pixels of the user give the depth of each pixel with respect to the sensor origin. In order to obtain the perimeter at any given y coordinate, depth information of the front and rear view of the user is required. Therefore, the developed system scans the front and rear of the user in two steps. At first, the user has to maintain a T-pose facing the sensor for 5 seconds. In the next 5 seconds, the user has to maintain a T-pose facing away from the sensor. At each step, depth information along the y coordinates of the chest, waist, hip, thigh, and knee was recorded. The y coordinates of the chest, waist, hip, thigh, and knee points were calculated as follows by incorporating the skeleton joints with actual body point observations:

\[ Y_{\text{chest}} = Y_{\text{SpineShoulder}} + \left(Y_{\text{SpineMid}} - Y_{\text{SpineShoulder}}\right) \times \frac{6}{8}, \tag{2} \]

\[ Y_{\text{waist}} = Y_{\text{SpineMid}} + \left(Y_{\text{SpineBase}} - Y_{\text{SpineMid}}\right) \times \frac{3}{8}, \tag{3} \]

\[ Y_{\text{hip}} = Y_{\text{SpineMid}} + \left(Y_{\text{SpineBase}} - Y_{\text{SpineMid}}\right), \tag{4} \]

\[ Y_{\text{thigh}} = Y_{\text{SpineBase}} + \left(Y_{\text{HipRight}} - Y_{\text{KneeRight}}\right) \times \frac{1}{2}, \tag{5} \]

\[ Y_{\text{knee}} = Y_{\text{KneeRight}}. \tag{6} \]

Here, we obtained the skeleton joints in physical world (camera space) coordinates; hence, equations (2) to (6) give the y positions in meters. The coefficients of the above equations were derived empirically from manual and application measurements. After converting the camera coordinates into color space coordinates (using the Kinect SDK's CoordinateMapper API), the corresponding y positions in color space were calculated to visualize the chest, waist, hip, thigh, and knee levels, which are shown as horizontal red arrows in the body index image (Figure 5). To calculate the perimeter at any given y coordinate, the x and z coordinates along the defined line must be obtained in meters. The body index image shown in Figure 5 is used to obtain the relevant x pixel coordinates (color space) along the red arrow lines at each level, corresponding to the user's body.
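The interpolation in equations (2)–(6) can be written directly from the joint y coordinates. A Python sketch (the paper's code is C#; the joint values below are illustrative camera-space numbers, not measured data):

```python
# Sketch of equations (2)-(6): each clothing level is interpolated between
# skeleton joint y coordinates (camera space, meters).

def body_levels(y):
    """y coordinates of the chest, waist, hip, thigh, and knee levels."""
    return {
        'chest': y['SpineShoulder'] + (y['SpineMid'] - y['SpineShoulder']) * 6 / 8,
        'waist': y['SpineMid'] + (y['SpineBase'] - y['SpineMid']) * 3 / 8,
        'hip':   y['SpineMid'] + (y['SpineBase'] - y['SpineMid']),
        'thigh': y['SpineBase'] + (y['HipRight'] - y['KneeRight']) * 1 / 2,
        'knee':  y['KneeRight'],
    }

joints_y = {'SpineShoulder': 1.40, 'SpineMid': 1.10, 'SpineBase': 0.95,
            'HipRight': 0.93, 'KneeRight': 0.50}
levels = body_levels(joints_y)
print({k: round(v, 3) for k, v in levels.items()})
```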

The background of the depth image was removed by considering the body index image (Figure 5) while matching the images to the correct coordinates with the help of the CoordinateMapper API. Using the virtual horizontal lines (as marked in Figure 6), which were bound to the user's body, the x and z coordinates were obtained for the chest, waist, hip, thigh, and knee (see Figure 6). Accordingly, the relevant (x, y, z) camera space coordinates were obtained in meters. Then, equation (7) was used to calculate the perimeter at each level, incorporating the front and rear sides of the user. The accuracy of the calculated perimeter values was further improved by using the average of two nearby horizontal lines at each position mentioned above. Figure 6 shows these lines with curved arrows at each level for the front view of the user's body:

\[
\text{perimeter of a curve} = \sum_{n=1}^{m-1} \sqrt{\left(y_{pn} - y_{p(n+1)}\right)^{2} + \left(x_{pn} - x_{p(n+1)}\right)^{2} + \left(z_{pn} - z_{p(n+1)}\right)^{2}}, \tag{7}
\]
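Equation (7) is a polyline length over the sampled surface points, and the full girth is the front scan's curve plus the rear scan's curve. A Python sketch, using a synthetic circular cross-section instead of real depth data so that the result can be checked against the circle circumference:

```python
import math

def curve_length(points):
    """Sum of Euclidean distances between consecutive sampled surface
    points, as in equation (7); points are (x, y, z) in meters."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

# Synthetic front and rear scans: each is a half-circle of radius 0.15 m
# at a fixed y level, sampled with 100 segments.
r, n = 0.15, 100
front = [(r * math.cos(t * math.pi / n), 1.2, r * math.sin(t * math.pi / n))
         for t in range(n + 1)]
rear = [(r * math.cos(t * math.pi / n), 1.2, -r * math.sin(t * math.pi / n))
        for t in range(n + 1)]

perimeter = curve_length(front) + curve_length(rear)
print(round(perimeter, 3))  # close to 2*pi*r = 0.942 m
```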

Figure 1: Augmented reality-based virtual fitting room overview. The WPF body parameter measurement application collects depth and user information, gathers the user body metrics, and saves the user body parameters. The Unity3D 3D cloth overlaying application loads the user body parameters, identifies the gender, filters the available 3D models, collects depth and user information, and simulates the real-time 3D overlaying.

Figure 2: The T-pose.

Figure 3: Lengths: (1) arm, (2) shoulder, (3) neck-to-hip, (4) inseam.


where m is the maximum number of pixels for a given horizontal/vertical line in color space obtained from the body index image, pn is the n-th pixel, and p(n+1) is the (n+1)-th pixel. Precise measurements were obtained by taking the median value of each measurement for the front and rear sides. The median is used to remove the outliers which may occur due to unwanted pixels from the Kinect. Finally, the perimeter at each position was obtained by summing the relevant front and rear perimeters. Figure 7 shows the combined front and rear perimeters at the chest, waist, hip, thigh, and knee for a tested body (front view), after carrying out the necessary adjustment for aligning to a single vertical axis.
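The choice of the median over the mean matters here: a single glitched depth frame can inflate a perimeter sample badly, and the median simply ignores it. A small illustration with made-up perimeter samples (meters):

```python
import statistics

# Repeated front-perimeter samples for one level; one frame contains a
# depth glitch (0.893). The median discards it, while the mean does not.
samples = [0.471, 0.468, 0.474, 0.470, 0.893]
print(statistics.median(samples))           # 0.471, glitch suppressed
print(round(statistics.mean(samples), 3))   # 0.555, glitch leaks in
```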

The values obtained for the body parameters are used for model categorization. The developed WPF application for noncontact human body parameter measurement finally returns the calculated size category for tops. These measured parameters were passed to the Unity3D application to filter the 3D cloth models. The size category for tops was selected as per the size categories in Table 1. At this stage, we considered the largest perimeter value among the three perimeters relevant to the chest, waist, and hip to make a decision on the size category that best suits the customer.
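The text states only that the largest of the three girths drives the decision; one plausible reading, sketched below in Python with the ranges from Table 1, is that each girth picks a size from its own chart row and the largest (loosest) of the three wins. This interpretation is an assumption, not the paper's confirmed rule:

```python
# Size chart from Table 1 (unisex tops, inches) and a hedged sketch of the
# selection rule: each girth selects a size from its own row, and the
# largest resulting size is returned so the garment fits everywhere.
NAMES = ['XXS', 'XS', 'S', 'M', 'L', 'XL', 'XXL']
CHART = {
    'chest': [(32, 34), (34, 36), (36, 38), (39, 41), (42, 44), (45, 47), (48, 50)],
    'waist': [(23, 25), (26, 28), (29, 31), (32, 34), (35, 39), (38, 40), (41, 43)],
    'hip':   [(28, 33), (31, 36), (34, 39), (37, 42), (40, 45), (43, 48), (46, 51)],
}

def size_index(value, ranges):
    """Index of the first size whose upper bound covers the value."""
    for i, (_, hi) in enumerate(ranges):
        if value <= hi:
            return i
    return len(ranges) - 1  # larger than the chart: clamp to XXL

def size_category(chest, waist, hip):
    return NAMES[max(size_index(chest, CHART['chest']),
                     size_index(waist, CHART['waist']),
                     size_index(hip, CHART['hip']))]

# User M1's application-measured girths from Table 2.
print(size_category(35.15, 31.24, 35.48))  # 'M', driven by the waist row
```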

3.2. Unity3D Application: 3D Cloth Overlaying Application

The recorded body parameters are passed to the 3D cloth overlaying application developed using the Unity3D game engine. Initially, according to the body parameter measurements, the suitable category of apparel for the customer was identified as per Table 1. To identify the gender of the user, the Unity3D application uses a cognitive service [25] developed by Microsoft and hosted in Azure Cognitive Services. This machine-learning, vision-based service can return a confidence score for a person in an image using an internal database. Using the service, it can identify the gender along with a few other parameters, such as emotion, pose, smile, and facial hair, together with 27 landmarks for each face in the image. In the WPF application, once the Kinect SDK completes user identification, a frame of an image with the body is submitted to the cognitive service to identify the


Figure 4: (a) Kinect V2 skeleton joints [22]; (b) the camera space coordinate system [23].

Figure 6: Complex 3D garment measurements: front perimeters at chest, waist, hip, thigh, knee, and wrist.

Figure 5: Background-removed body index image with front horizontal curve in depth space.


gender of the person. Then, the 3D cloth models were filtered according to the gender and apparel size category. The system contains an internal 3D garment model database. This database contains all the 3D cloth models with tags which indicate the gender and the apparel size category. Once the gender was recognized by the service, the 3D garment models were filtered based on the gender and the size category, according to the availability of real clothes in the retail shop.
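The tag-based filtering step is straightforward to sketch. The catalog entries below are made up for illustration ('batik_sarong' nods to the model in ref. [30]); the paper does not specify the database schema:

```python
# Sketch of the model-filtering step: each 3D garment model carries gender
# and size tags, and the slide-bar list is narrowed to in-stock models
# matching the detected gender and the computed size category.
catalog = [
    {'model': 'batik_sarong', 'gender': 'male', 'size': 'M', 'in_stock': True},
    {'model': 'polo_top', 'gender': 'male', 'size': 'L', 'in_stock': True},
    {'model': 'summer_dress', 'gender': 'female', 'size': 'M', 'in_stock': True},
    {'model': 'linen_shirt', 'gender': 'male', 'size': 'M', 'in_stock': False},
]

def filter_models(catalog, gender, size):
    """Names of the try-on candidates for the given gender and size."""
    return [m['model'] for m in catalog
            if m['gender'] == gender and m['size'] == size and m['in_stock']]

print(filter_models(catalog, 'male', 'M'))  # ['batik_sarong']
```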

Here, the 3D design software "3ds Max" was used for rigging and creating/editing the 3D cloth models [26]. Once rigged, the 3D model was imported into the Unity3D project. Then, using the configuration menu for the humanoid model, it was matched with the joints of the human model [27]. The resulting model can be converted into a controllable 3D humanoid model.

To create an animated or controllable model, the selected model needs to be rigged. This helps in animating the cloth model according to the skeleton movements. Several methods have been used in the literature to achieve this, such as automated rigging methods [28, 29]. Instead of using automated methods, we used a manual process to verify and build the skeleton on top of the cloth model. This is mainly due to the requirement of a humanoid avatar which matches the skeleton; here, we use only a cloth model instead of an avatar model. A rigged system can be created in two different ways with 3ds Max: the prebaked "system biped" and the customized "bone IK chain". We used the "Bone IK Chain" (Figure 8). This requires manual work, such as aligning the cloth model and the skeleton as if the cloth had been dressed on a real human body.

Then, the necessary materials can be applied to the model as its material design (Figure 9). Subsequently, the cloth model is bound as the skin of the skeleton, which can be saved with the *.fbx [30] extension before it is imported into Unity3D as a humanoid cloth model. Finally, the rigged 3D cloth was bound to the skeleton provided by the Kinect device using Unity3D, where the Kinect device and the Kinect SDK for Unity3D were used to identify the person and retrieve depth information. The Kinect API was used for further manipulation and real-time overlaying. These clothes are populated on a slide-bar at the side of the screen. Using a hand gesture (swiping left to right or vice versa), the user can select a different cloth model. The developed system also contains an auto-exit (10 seconds) mechanism if the user is not present in front of the Kinect device.
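The slide-bar selection and the auto-exit timer can be sketched as a small state machine. This is a language-neutral illustration (Python, hypothetical class and model names), not the Unity3D implementation:

```python
# Sketch of the slide-bar: swipe gestures step through the filtered cloth
# models, wrapping at the ends; ten seconds without a tracked user in front
# of the sensor triggers the auto-exit described in the text.
class ClothCarousel:
    EXIT_AFTER = 10.0  # seconds without a tracked user

    def __init__(self, models):
        self.models = models
        self.index = 0
        self.absent_for = 0.0

    def swipe(self, direction):
        """direction: +1 for a left-to-right swipe, -1 for the reverse."""
        self.index = (self.index + direction) % len(self.models)
        return self.models[self.index]

    def tick(self, dt, user_present):
        """Advance the idle timer; returns True when the app should exit."""
        self.absent_for = 0.0 if user_present else self.absent_for + dt
        return self.absent_for >= self.EXIT_AFTER

carousel = ClothCarousel(['top_s', 'top_m', 'top_l'])
print(carousel.swipe(+1))                       # 'top_m'
print(carousel.swipe(-1))                       # back to 'top_s'
print(carousel.tick(10.0, user_present=False))  # True -> auto-exit
```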

4. Results and Discussion

The WPF application was developed to obtain noncontact body parameter measurements. This application gathers the necessary body parameters of the user who stands in front of the Kinect sensor device. In this proposed work, the complex 3D parameters, such as the perimeters of the chest, waist, hip, thigh, and knee, were considered for selecting the apparel size category and rigging the 3D model onto the user's body in real time. Hence, the application guides the user to maintain a T-pose facing towards and away from the sensor for 5 seconds each, in order to capture the necessary depth information. Figure 10 shows the developed WPF application

Figure 7: Complex 3D garment measurements: perimeters at chest, waist, hip, thigh, and knee (units in cm).

Table 1: Unisex size category for tops* [18].

        XXS     XS      S       M       L       XL      XXL
Chest   32–34   34–36   36–38   39–41   42–44   45–47   48–50
Waist   23–25   26–28   29–31   32–34   35–39   38–40   41–43
Hip     28–33   31–36   34–39   37–42   40–45   43–48   46–51

*All units in inches.

Figure 8: Aligned skeleton and the cloth model.


user interface. After capturing the necessary 3D information, the body parameters were calculated using equations (1)–(7). Then, the apparel size category was derived. Finally, the overlaying application is launched with all measured parameters to filter the 3D cloth models based on gender.

The output of the developed system is presented in Figure 11. Here, the user can select and virtually try on the apparel available in the retail shop, filtered according to his or her body parameters. The items on the populated list can be selected using the gesture identification features integrated into the solution.

Table 2 presents a sample of the human body parameter measurements obtained by the developed system and manually for eleven males. The manual measurements were taken by a single person using a measuring tape. The error for each case was calculated as per the following equation:

\[
\text{error}\ (\%) = \frac{\text{manual measurement} - \text{application measurement}}{\text{manual measurement}} \times 100. \tag{8}
\]

Considering all outliers and errors, the average error of each measurement lies below 10%.
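Equation (8) can be checked directly against Table 2; the errors in the table are reported as magnitudes, so the sketch below takes the absolute value. The sample values are user M1's shoulder measurements from Table 2:

```python
def relative_error(manual, app):
    """Percentage error as in equation (8), reported as a magnitude."""
    return abs(manual - app) / manual * 100

# User M1's shoulder measurement from Table 2 (inches).
err = relative_error(17.00, 16.88)
print(round(err, 2))  # 0.71, matching the table entry
```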

Figure 9: Adding material to the cloth-skinned rig.

Figure 10: WPF application for noncontact body parameter measurement, showing the calculated body parameters (height, neck-to-hip distance, inseam distance, arm length, shoulder length, and chest, waist, and hip circumferences, in inches), the T-pose status indicator, and the record control.


If we take 10% as the boundary value, according to Table 2 we can see that user M6 has several extreme values, while the other users have lower error percentages. Several reasons produce a high error percentage in some measurements obtained through the developed system. Users wearing baggy/large tops or trousers when taking the measurements is one of the main reasons for the results deviating from the manual measurements. As an example, Figure 12 shows a body index image of a person wearing a baggy top and a pair of trousers, where the boundary of the user deviates from the actual user's body due to the dress. In addition, the dress may have wrinkles. The errors introduced by these factors during the x and z coordinate measurements lead to deviations from the manual measurements. The

Figure 11: Virtual dressing room for 3D clothes simulation.

Table 2: Manual measurements and noncontact body parameter measurements (units in inches). Shoulder, neck to hip, arm, and inseam are length measurements; chest, hip, waist, thigh, and knee are perimeter measurements.

User  Value               Shoulder  Neck to hip  Arm    Inseam  Chest  Hip    Waist  Thigh  Knee
M1    Manual                 17.00        30.50  25.50   29.75  32.25  37.50  31.00  22.00  14.50
      Application            16.88        28.41  24.97   28.93  35.15  35.48  31.24  19.56  13.93
      Relative error (%)      0.71         6.85   2.08    2.76   8.99   5.39   0.77  11.09   3.93
M2    Manual                 16.75        28.25  24.50   29.50  33.50  36.25  32.50  20.00  13.50
      Application            16.69        28.50  25.02   28.99  34.51  36.55  30.86  21.78  14.59
      Relative error (%)      0.36         0.88   2.12    1.73   3.01   0.83   5.05   8.90   8.07
M3    Manual                 17.00        26.50  22.50   31.50  32.00  33.25  27.50  18.00  13.50
      Application            16.98        28.61  24.65   30.03  33.34  33.52  27.63  20.13  13.28
      Relative error (%)      0.12         7.96   9.56    4.67   4.19   0.81   0.47  11.83   1.63
M4    Manual                 17.00        29.25  25.00   30.25  33.50  36.00  30.50  21.00  14.50
      Application            16.58        29.85  25.18   30.89  31.68  36.19  30.00  19.93  14.02
      Relative error (%)      2.47         2.05   0.72    2.12   5.43   0.53   1.64   5.10   3.31
M5    Manual                 16.00        29.25  26.50   30.75  32.50  32.50  27.00  18.25  13.25
      Application            18.00        29.17  26.68   32.44  35.42  38.45  33.96  20.21  14.09
      Relative error (%)     12.50         0.27   0.68    5.50   8.98  18.31  25.78  10.74   6.34
M6    Manual                 17.00        39.75  25.25   31.00  33.50  35.50  30.25  20.50  13.50
      Application            17.06        35.67  26.21   31.94  32.38  35.48  27.41  21.87  18.10
      Relative error (%)      0.35        10.26   3.80    3.03   3.34   0.06   9.39   6.68  34.07
M7    Manual                 19.50        30.50  26.50   29.50  40.75  43.25  40.25  24.00  19.25
      Application            18.93        30.33  25.50   33.08  42.15  43.85  39.57  24.41  15.61
      Relative error (%)      2.92         0.56   3.77   12.14   3.44   1.39   1.69   1.71  18.91
M8    Manual                 18.25        28.00  24.00   30.25  33.75  33.75  29.50  19.50  13.25
      Application            17.09        29.45  25.40   29.81  33.81  34.94  28.25  19.19  13.25
      Relative error (%)      6.36         5.18   5.83    1.45   0.18   3.53   4.24   1.59   0.00
M9    Manual                 17.50        27.25  24.00   28.75  34.50  38.75  30.25  22.00  14.75
      Application            17.33        30.54  24.06   28.98  35.24  37.00  29.68  22.03  14.49
      Relative error (%)      0.97        12.07   0.25    0.80   2.14   4.52   1.88   0.14   1.76
M10   Manual                 16.75        27.50  23.50   28.50  36.00  36.50  34.00  21.25  14.00
      Application            17.27        27.78  23.66   28.90  34.34  38.84  29.01  19.54  13.66
      Relative error (%)      3.10         1.02   0.68    1.40   4.61   6.41  14.68   8.05   2.43
M11   Manual                 17.75        29.75  26.00   30.75  32.00  36.00  30.00  19.50  13.00
      Application            17.46        29.40  24.94   32.09  34.57  37.68  31.65  20.17  14.29
      Relative error (%)      1.63         1.18   4.08    4.36   8.03   4.67   5.50   3.44   9.92
      Average error (%)       2.86         4.39   3.05    3.63   4.76   4.22   6.46   6.30   8.22


developed system considers the vertical blue lines as shownin Figure 12 as the boundaries at the chest level although theactual boundary at the chest level is indicated in yellow linesis error occurs due to the bagginess of the userrsquos shirtresulting in the application measurements being deviatedfrom the manual measurements

Further if the user maintains improper T-pose as shownin Figure 13 the corresponding x coordinates of the chestlevel may take a wider range due to the boundary of the bodyindex image consisting of hands e vertical blue lines inFigure 13 are the boundary at the chest level e horizontalblue line gives the x coordinates corresponding to the chestlevel which is wider than the actual chest measurement

In addition to the above practical errors the Kinectcannot identify black shiny items using an inbuilt IR sensorarray As shown in Figure 14 the user wore a black beltwherein the body index image is identified as the back-ground Due to this reason some measurements can beincorrect even when the user is wearing the required clothsfor the measuring application

5 Conclusions

We used a single RGB-D Kinect sensor to obtain user bodyparameter measurements including 3D measurements suchas perimeters of chest waist hip thigh and knee to developan augmented reality virtual dressing room e developedapplication successfully appends the physics animation tothe dress according to the physical movements made by theuser providing a realistic fitting experience e perfor-mance evaluation reveals that a single depth sensor is ap-plicable in real-time 3D cloth simulation with less than 10of average measurement error

Data Availability

e data that support the findings of this study are openlyavailable at httpsgitlabcomsasadaravdr-documents

Conflicts of Interest

e authors declare that they have no conflicts of interest

Acknowledgments

e authors thank the Department of Physics Faculty ofApplied Sciences University of Sri Jayewardenepura SriLanka for providing facilities for this study and providingfunding under the university grant ASP01RESCI201965Also the authors would like to thank Mr Duleeka Guna-tilake Zone 24x7 Colombo Sri Lanka for fruitfuldiscussion

References

[1] Fitnect Interactive Kit ldquo3D Virtual fitting dressing roommirrorrdquo 2019 httpswwwfitnecthu

[2] I Styku ldquoVirtual fitting room and body scanningrdquo 2019httpswwwstykucom

[3] M Beck and D Crie ldquoI virtually try it I want it VirtualFitting Room a tool to increase on-line and off-line ex-ploratory behavior patronage and purchase intentionsrdquo

Figure 12 e person wearing too baggy shirt and trousers

Figure 13 Incorrect T-pose

Figure 14 e person wearing a pure black color belt

Advances in Human-Computer Interaction 9

Journal of Retailing and Consumer Services vol 40pp 279ndash286 2018

[4] S I A P Diddeniya W K I L WanniarachchiP R S De Silva and N C Ganegoda ldquoEfficient office assistantrobot system autonomous navigation and controlling basedon ROSrdquo International Journal of Multidisciplinary Studies(IJMS) vol 6 no 1 2019

[5] S I A P Diddeniya A M S B Adikari H N GunasingheP R S De Silva N C Ganegoda andW K I L Wanniarachchi ldquoVision based office assistant robotsystem for indoor office environmentrdquo in Proceedings of the3rd International Conference on Information TechnologyResearch Moratuwa Sri Lanka December 2019

[6] K A S V Rathnayake W K I L Wanniarachchi andW H K P Nanavakkara ldquoHuman computer interactionsystem for impaired people by using Kinect motion sensorvoice and gesture integrated smart homerdquo in Proceedings ofthe Second International Conference on Inventive Commu-nication and Computational Technologies Coimbatore IndiaApril 2018

[7] Z Zhang M Zhang Y Chang E S Aziz S K Esche andC Chassapis ldquoCollaborative virtual laboratory environmentswith hardware in the looprdquo in Cyber-Physical Laboratories inEngineering and Science Education pp 363ndash402 SpringerCham Switzerland 2018

[8] Y M Chang E S S Aziz Z M Zhang M M Zhang andS K Esche ldquoUsability evaluation of a virtual educationallaboratory platformrdquo 1e ASEE Computers in Education(CoED) Journal vol 7 no 1 p 24 2016

[9] K Khoshelham and S O Elberink ldquoAccuracy and resolutionof Kinect depth data for indoor mapping applicationsrdquoSensors vol 12 no 2 pp 1437ndash1454 2012

[10] J Ehara and H Saito ldquoTexture overlay for virtual clothingbased on pca of silhouettesrdquo in Proceedings of the 5th In-ternational Symposium on Mixed and Augmented RealityIEEE Computer Society Washington DC USA pp 139ndash142October 2006

[11] D Protopsaltou C Luible M Arevalo-Poizat andN Magnenat-almann ldquoA body and garment creationmethod for an internet-based virtual fitting roomrdquo in Pro-ceedings of Computer Graphics International (CGIrsquo02)Springer Berlin Germany pp 105ndash122 July 2002

[12] C Garcia Martin and E Oruklu ldquoHuman friendly interfacedesign for virtual fitting room Applications on android basedmobile devicesrdquo Journal of Signal and Information Processingvol 3 no 4 pp 481ndash490 2012

[13] S Hauswiesner M Straka and G Reitmayr ldquoVirtual try-onthrough image-based renderingrdquo IEEE Transactions on Vi-sualization and Computer Graphics vol 19 no 9 pp 1552ndash1565 2013

[14] S Giovanni Y C Choi and J Huang ldquoVirtual try-on usingkinect and HD camerardquo Motion in Games Springer Hei-delberg Germany 2012

[15] I Pachoulakis and K Kapetanakis ldquoAugmented realityPlatforms for virtual fitting roomsrdquo1e International Journalof Multimedia amp Its Applications vol 4 no 4 2012

[16] Bodymetrics 2019 httpbodymetricscom[17] Imagine that technologies 2011 httpwww

imaginethattechnologiescom August 2019[18] U Gultepe and U Gudukbay ldquoReal-time virtual fitting with

body measurement and motion smoothingrdquo Computers ampGraphics vol 43 pp 31ndash43 2014

[19] M Kim and C Kim ldquoAugmented reality fashion apparelsimulation using a magic mirrorrdquo International Journal ofSmart Home vol 9 no 2 pp 169ndash178 2015

[20] K Yousef B Mohd and M AL-Omari ldquoKinect-based virtualtry-on system a case studyrdquo in Proceedings of the IEEE JordanInternational Joint Conference on Electrical Engineering andInformation Technology pp 91ndash96 Amman Jordan April2019

[21] A M S B Adikari N G C Ganegoda andW K I L Wanniarachchi ldquoNon-contact human body pa-rameter measurement based on Kinect sensorrdquo IOSR Journalof Computer Engineering vol 19 no 3 pp 80ndash85 2017

[22] JointType Enumeration httpsdocsmicrosoftcomen-usprevious-versionswindowskinectdn758662(v3dieb10 August 2019

[23] Coordinate mapping httpsdocsmicrosoftcomen-usprevious-versionswindowskinectdn785530 (vieb10) August 2019

[24] R Synergy ldquoSizing chartrdquo 2019 httpwwwrawsynergycomsizing-chart

[25] M Azure ldquoFace API-facial recognition software | Microsoftazurerdquo 2019 httpsazuremicrosoftcomen-usservicescognitive-servicesface

[26] Z Bhati A Waqas M Karbasi and A W Mahesar ldquoA wireparameter and reaction manager basedBiped character setupand rigging technique in 3Ds Max for animationrdquo Interna-tional Journal of Computer Graphics amp Animation vol 5no 2 pp 21ndash36 2015

[27] Unity Technologies ldquoUnity-manual using humanoidRCharactersrdquo 2019 httpsdocsunity3dcomManualUsingHumanoidCharshtml

[28] A Feng D Casas and A Shapiro ldquoAvatar reshaping andautomatic rigging using a deformable modelrdquo in Proceedingsof the 8th ACM SIGGRAPH Conference on Motion in Games(MIG rsquo15) pp 57ndash64 New York NY USA 2015

[29] I Baran and J Popovic ldquoAutomatic rigging and animation of3D charactersrdquo ACM Transactions on Graphics vol 26 no 3p 72 2007

[30] S Adikari ldquoBatik Sarong 3D model-TurboSquid 1189246rdquo2019 httpswwwturbosquidcomFullPreviewIndexcfmID1189246

10 Advances in Human-Computer Interaction

Page 3: ApplicabilityofaSingleDepthSensorinReal-Time3DClothes Simulation:AugmentedReality…downloads.hindawi.com/journals/ahci/2020/1314598.pdf · 2019/10/14  · 3.Implementation edevelopedaugmentedreality-basedvirtualfittingroom

3. Implementation

The developed augmented reality-based virtual fitting room has two main processing stages: noncontact user body parameter measurements and overlaying three-dimensional (3D) clothes on the user in real time. A Microsoft Kinect V2 depth sensor was used to gather the necessary user body parameters, and the Unity3D game engine was incorporated for overlaying the 3D cloth models on the user. The overview of the proposed system is shown in Figure 1. A gender identification feature is included in the proposed work to filter the 3D cloth models available for a particular customer down to those which suit the body parameters. The gesture recognition capability built into the system enables virtual try-on of different clothes.

3.1. Body Parameter Measuring Application. In order to capture the user body parameters using the depth sensor, we implemented a C# Windows Presentation Foundation (WPF) application. This application gathers the necessary body parameters of the user who stands in front of the Kinect sensor device. Once the application is started, the Kinect sensor begins to capture the environment data through its RGB camera and IR depth sensor. Subsequently, the sensor middleware removes background noise and identifies moving objects based on image processing techniques [21]. The application guides the user to maintain the T-pose in order to initialize the body parameter measurement process (Figure 2). The necessary body parameters for clothing, such as height, shoulder length, arm length, inseam, and neck-to-hip length, and the appropriate skeleton joints' distances (Figure 3) were calculated using direct Kinect application programming interfaces (APIs). The Kinect V2 sensor captures up to 25 skeleton joints (Figure 4(a)) for human body mapping, utilizing both RGB color space and IR depth space information [22]. The required skeleton joints were selected among these 25 for calculating the necessary body parameters mentioned above.

The camera space skeleton joint coordinates were used to calculate the length of a bone segment as given in equation (1). As an example, to calculate the shoulder length, the distances of the "right-spine shoulder" bone and the "left-spine shoulder" bone were added, as indicated by the red bone segments in Figure 2. In [15], the authors discuss the process of calculating these body parameter lengths in detail.

length of a bone segment = √[(x_jn − x_j(n+1))² + (y_jn − y_j(n+1))² + (z_jn − z_j(n+1))²]  (1)

where x, y, and z are the camera space coordinates in meters given by the Kinect API, j denotes the jth bone segment, and n denotes the nth skeleton point.

Figure 4(b) shows the camera space coordinate system of the Kinect. The origin of the camera space system is located at the center of the IR sensor (emitter and receiver) [23]. Accordingly, the bone segment length can be calculated in meters using equation (1).
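Equation (1) is simply the Euclidean distance between two consecutive camera-space joints, and composite measures such as shoulder length are sums of such bone lengths. A minimal sketch of this step (the joint coordinates below are illustrative values, not Kinect API output):

```python
import math

def bone_length(j1, j2):
    # Euclidean distance between two camera-space joints (x, y, z), in meters.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(j1, j2)))

# Illustrative camera-space joint positions (meters), not real sensor output.
spine_shoulder = (0.00, 0.45, 2.10)
shoulder_left  = (-0.18, 0.42, 2.12)
shoulder_right = (0.18, 0.42, 2.12)

# Shoulder length: the left and right spine-shoulder bone lengths added.
shoulder_length = (bone_length(shoulder_left, spine_shoulder)
                   + bone_length(spine_shoulder, shoulder_right))
```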

Other than the abovementioned body parameters, 3D parameters such as the chest, waist, hip, thigh, and knee circumferences are required for garmenting. In practice, the chest circumference and the waist of a male are measured around 1-2 inches below the armpit and around one inch below the navel, respectively. On the other hand, when measuring the chest circumference of a female, the measurement is taken around the bust at the apex, and the female waist measurement is taken around the smallest natural point at the waist. The perimeter of the hip is measured around the largest point at the hip for both genders [24]. In this proposed work, the depth sensor information of the Kinect sensor was used to obtain body parameters at the chest, waist, hip, thigh, and knee. We obtained the body index image by removing the background of the user; Figure 5 shows the body index image. Here, the relevant depth pixels of the user give the depth of each pixel with respect to the sensor origin. In order to obtain the perimeter at any given y coordinate, depth information of the front and rear view of the user is required. Therefore, the developed system scans the front and rear of the user in two steps. At first, the user has to maintain a T-pose facing the sensor for 5 seconds. In the next 5 seconds, the user has to maintain a T-pose facing away from the sensor. At each step, depth information along the y coordinates of the chest, waist, hip, thigh, and knee was recorded. The y coordinates of the chest, waist, hip, thigh, and knee points were calculated as follows, by incorporating the skeleton joints with actual body point observations:

Y_chest = Y_SpineShoulder + (Y_SpineMid − Y_SpineShoulder) × 6/8  (2)

Y_waist = Y_SpineMid + (Y_SpineBase − Y_SpineMid) × 3/8  (3)

Y_hip = Y_SpineMid + (Y_SpineBase − Y_SpineMid)  (4)

Y_thigh = Y_SpineBase + (Y_HipRight − Y_KneeRight) × 1/2  (5)

Y_knee = Y_KneeRight  (6)

Here, we obtained the skeleton joints in physical world (camera space) coordinates; hence, equations (2) to (6) give the y position in meters. Each of the above equations' coefficients was derived empirically from manual and application measurements. After converting camera coordinates into color space coordinates (using the Kinect SDK's CoordinateMapper API), the corresponding y positions in color space were calculated to visualize the chest, waist, hip, thigh, and knee levels, which are shown as horizontal red arrows in the body index image (Figure 5). To calculate the perimeter at any given y coordinate, the x and z coordinates along the defined line must be obtained in meters. The body index image shown in Figure 5 is used to obtain the relevant x pixel coordinates (color space) along the red arrow lines at each level corresponding to the user's body.
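Equations (2) to (6) interpolate between skeleton-joint y positions. As a sketch (the function and argument names are ours, not the Kinect SDK's, and the sample heights are invented):

```python
def body_levels(y_spine_shoulder, y_spine_mid, y_spine_base,
                y_hip_right, y_knee_right):
    # Empirical y levels in meters (camera space), per equations (2)-(6).
    return {
        "chest": y_spine_shoulder + (y_spine_mid - y_spine_shoulder) * 6 / 8,
        "waist": y_spine_mid + (y_spine_base - y_spine_mid) * 3 / 8,
        "hip":   y_spine_mid + (y_spine_base - y_spine_mid),
        "thigh": y_spine_base + (y_hip_right - y_knee_right) * 1 / 2,
        "knee":  y_knee_right,
    }

# Illustrative joint heights (meters): shoulder above mid-spine above base.
levels = body_levels(0.50, 0.20, -0.10, -0.12, -0.60)
```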

The background of the depth image was removed by considering the body index image (Figure 5) while matching them to the correct coordinates with the help of the CoordinateMapper API. Using the virtual horizontal lines (as marked in Figure 6), which were bound to the user's body, the x and z coordinates were obtained for the chest, waist, hip, thigh, and knee (see Figure 6). Accordingly, the relevant (x, y, z) camera space coordinates were obtained in meters. Then, equation (7) was used to calculate the perimeter at each level, incorporating the front and rear sides of the user. The accuracy of the calculated perimeter values is further improved by using the average of two nearby horizontal lines at each of the positions mentioned above. Figure 6 shows these lines with curved arrows at each level for the front view of the user's body.

perimeter of a curve = Σ_{n=1}^{m−1} √[(y_pn − y_p(n+1))² + (x_pn − x_p(n+1))² + (z_pn − z_p(n+1))²]  (7)

Figure 1: Augmented reality-based virtual fitting room overview. The WPF body parameter measurement application collects depth and user information, gathers the user body metrics, and saves the user body parameters; the Unity3D 3D cloth overlaying application performs gender identification, loads the user body parameters, filters the available 3D models, collects depth and user information, and simulates real-time 3D overlaying.

Figure 2: The T-pose.

Figure 3: Lengths: (1) arm, (2) shoulder, (3) neck-to-hip, (4) inseam.


where m is the maximum number of pixels for a given horizontal/vertical line in color space obtained from the body index image, pn is the nth pixel, and p(n+1) is the (n+1)th pixel. The precise measurements were obtained by taking the median value of each measurement for the front and rear sides; the median removes outliers that may occur due to unwanted pixels from the Kinect. Finally, the perimeter at each position was obtained by summing the relevant front and rear perimeters. Figure 7 shows the combined front and rear perimeters at the chest, waist, hip, thigh, and knee for a tested body (front view), after carrying out the necessary adjustment for aligning them to a single vertical axis.
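The perimeter computation of equation (7), with the median-based outlier suppression described above, can be condensed as follows. This is a sketch that assumes the scan lines have already been mapped to camera-space (x, y, z) points; the function names are ours:

```python
import math
from statistics import median

def curve_length(points):
    # Equation (7): sum of Euclidean distances between consecutive
    # camera-space points along one horizontal scan line.
    return sum(math.dist(points[n], points[n + 1])
               for n in range(len(points) - 1))

def perimeter(front_scans, rear_scans):
    # The median over repeated scans suppresses outliers caused by stray
    # depth pixels; the front and rear half-perimeters are then summed.
    return (median(curve_length(p) for p in front_scans)
            + median(curve_length(p) for p in rear_scans))
```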

The values obtained for the body parameters are used for model categorization. The developed WPF application for noncontact human body parameter measurement finally returns the calculated size category for tops. These measured parameters were injected into the Unity3D application to filter the 3D cloth models. The size category for tops was selected as per the size categories in Table 1. At this stage, we considered the largest perimeter value among the three perimeters relevant to the chest, waist, and hip to make a decision on the size category that best suits the customer.
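One plausible reading of this decision rule, using the Table 1 ranges: compute a per-dimension size and keep the loosest one. This is a sketch only; the paper does not spell out tie-breaking or out-of-range handling:

```python
# Upper bounds (inches) of the Table 1 ranges for each size.
SIZE_ORDER = ["XXS", "XS", "S", "M", "L", "XL", "XXL"]
CHEST_UPPER = {"XXS": 34, "XS": 36, "S": 38, "M": 41, "L": 44, "XL": 47, "XXL": 50}
WAIST_UPPER = {"XXS": 25, "XS": 28, "S": 31, "M": 34, "L": 39, "XL": 40, "XXL": 43}
HIP_UPPER   = {"XXS": 33, "XS": 36, "S": 39, "M": 42, "L": 45, "XL": 48, "XXL": 51}

def size_for(value, upper):
    # Smallest size whose upper bound covers the measurement.
    for size in SIZE_ORDER:
        if value <= upper[size]:
            return size
    return "XXL"

def tops_category(chest, waist, hip):
    # Pick the largest (loosest) of the three per-dimension categories.
    sizes = [size_for(chest, CHEST_UPPER),
             size_for(waist, WAIST_UPPER),
             size_for(hip, HIP_UPPER)]
    return max(sizes, key=SIZE_ORDER.index)
```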

3.2. Unity3D Application: 3D Cloth Overlaying Application. The recorded body parameters are injected into the 3D cloth overlaying application developed using the Unity3D game engine. Initially, according to the body parameter measurements, the suitable category of apparel for the customer was identified as per Table 1. To identify the gender of the user, the Unity3D application uses the cognitive service [25] developed by Microsoft and hosted in Azure Cognitive Services. This machine-learning, vision-based service can return a confidence score for a person in an image using an internal database. Using the service, it can identify the gender along with a few other parameters, such as emotion, pose, smile, and facial hair, together with 27 landmarks for each face in the image. In the WPF application, once the Kinect SDK completes user identification, a frame of an image with the body is submitted to the cognitive service to identify the


Figure 4: (a) Kinect V2 skeleton joints [22]. (b) The camera space coordinate system [23].

Figure 6: Complex 3D garment measurements: front perimeters at chest, waist, hip, thigh, knee, and wrist.

Figure 5: Background-removed body index image with front horizontal curve in depth space.


gender of the person. Then, the 3D cloth models were filtered according to the gender and apparel size category. The system contains an internal 3D garment model database, which holds all the 3D cloth models with tags indicating the gender and the apparel size category. Once the gender was recognized by the service, the 3D garment models were filtered based on gender and size category according to the availability of real clothes in the retail shop.
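The face service returns face attributes as JSON, and the gender tag then drives catalog filtering. A sketch with an illustrative (not authoritative) response shape and a hypothetical tagged model catalog:

```python
import json

# Illustrative detection result only; consult the Face API reference for
# the exact response schema.
response = json.loads('[{"faceId": "c5c24a82", '
                      '"faceAttributes": {"gender": "male", "smile": 0.1}}]')

def detect_gender(faces):
    # Gender attribute of the first detected face, or None if no face found.
    return faces[0]["faceAttributes"]["gender"] if faces else None

def filter_models(models, gender, size):
    # Keep garment models whose tags match the detected gender and size.
    return [m for m in models if m["gender"] == gender and m["size"] == size]

catalog = [  # Hypothetical tagged 3D garment database.
    {"name": "batik_sarong", "gender": "male", "size": "M"},
    {"name": "summer_dress", "gender": "female", "size": "M"},
    {"name": "polo_shirt",   "gender": "male", "size": "L"},
]
matches = filter_models(catalog, detect_gender(response), "M")
```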

Here, the 3D design software "3ds Max" was used for rigging and creating/editing the 3D cloth models [26]. Once rigged, the 3D model was imported into the Unity3D project. Then, using the configuration menu for the humanoid model, it was matched with the joints of the human model [27]. The resulting model can be converted into a controllable 3D humanoid model.

To create an animated or controllable model, the selected model needs to be rigged. This helps in animating the cloth model according to the skeleton movements. Several methods have been used in the literature to achieve this, such as automated rigging methods [28, 29]. Instead of using automated methods, we used a manual process to verify and build the skeleton on top of the cloth model. This is mainly due to the requirement of a humanoid avatar which matches the skeleton; here, we use only a cloth model instead of an avatar model. A rigged system can be created in two different ways with 3ds Max: the prebaked "system biped" and a customized "bone IK chain". We used the "bone IK chain" (Figure 8). This requires manual work, such as aligning the cloth model and the skeleton as if it had been dressed on a real human body.

Then, the necessary materials can be applied to the model as its material design (Figure 9). Subsequently, the cloth model is bound as the skin of the skeleton, which can be saved with the *.fbx [30] extension before being imported into Unity3D as a humanoid cloth model. Finally, the rigged 3D cloth was bound to the skeleton provided by the Kinect device using Unity3D, where the Kinect device and the Kinect SDK for Unity3D were used to identify the person and retrieve depth information. The Kinect API was used for further manipulation and real-time overlaying. The clothes are populated on a slide bar at the side of the screen; using a hand gesture (swiping left to right or vice versa), the user can select a different cloth model. The developed system also contains an auto-exiting (10 seconds) mechanism for when the user is not present in front of the Kinect device.
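The swipe selection can be approximated by tracking the hand joint's x coordinate over a short window. This is a simplified sketch; the paper does not give its actual gesture thresholds or window length:

```python
def detect_swipe(hand_x, threshold=0.3):
    # Classify a horizontal swipe from successive hand x coordinates
    # (camera space, meters): net displacement beyond the threshold.
    if not hand_x:
        return None
    shift = hand_x[-1] - hand_x[0]
    if shift > threshold:
        return "right"
    if shift < -threshold:
        return "left"
    return None
```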

4. Results and Discussion

The WPF application was developed to obtain noncontact body parameter measurements. This application gathers the necessary body parameters of the user who stands in front of the Kinect sensor device. In this proposed work, the 3D complex parameters, such as the perimeters of the chest, waist, hip, thigh, and knee, were considered for selecting the apparel size category and rigging the 3D model onto the user's body in real time. Hence, the application guides the user to maintain a T-pose facing towards and away from the sensor for 5 seconds each, in order to capture the necessary depth information. Figure 10 shows the developed WPF application

Figure 7: Complex 3D garment measurements: perimeters at chest, waist, hip, thigh, and knee (units in cm).

Table 1: Unisex size category for tops* [18].

        XXS    XS     S      M      L      XL     XXL
Chest   32–34  34–36  36–38  39–41  42–44  45–47  48–50
Waist   23–25  26–28  29–31  32–34  35–39  38–40  41–43
Hip     28–33  31–36  34–39  37–42  40–45  43–48  46–51

*All units in inches.

Figure 8: Aligned skeleton and the cloth model.


user interface. After capturing the necessary 3D information, the necessary body parameters were calculated using equations (1)–(7), and the apparel size category was derived. Finally, the overlaying application is launched with all measured parameters to filter the 3D cloth models based on gender.

The output of the developed system is presented in Figure 11. Here, the user can select and virtually try on the apparel available in the retail shop, filtered according to his or her body parameters. The items on the populated list can be selected using the gesture identification features integrated into the solution.

Table 2 presents a sample of the human body parameter measurements obtained by the developed system and manually for eleven males. Manual measurements were taken by a single person using a measuring tape. The error for each case was calculated as per the following equation:

error = [(manual measurement − application measurement) / (manual measurement)] × 100  (8)

Considering all outliers and errors, the average error of each measurement lies below 10%.
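As a worked example, equation (8) applied to M1's shoulder and chest entries in Table 2 (the table reports error magnitudes, hence the absolute value here):

```python
def relative_error(manual, application):
    # Equation (8), reported as a magnitude in percent.
    return abs(manual - application) / manual * 100

# M1 shoulder: 17.00 in manual vs 16.88 in from the application (~0.71%).
m1_shoulder = relative_error(17.00, 16.88)
```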

Figure 9: Adding material to the cloth-skinned rig.

Figure 10: WPF application: noncontact body parameter measurement. The interface displays the calculated body parameters (height, neck-to-hip distance, inseam distance, arm length, shoulder length, and chest, waist, and hip circumferences, all in inches), the T-pose status ("stay put"), a "VDR" button, and a "Record" button.


If we take 10% as the boundary value, Table 2 shows that user M6 has several extreme values, while the other users have lower error percentages. Several reasons produce a high error percentage in some measurements obtained through the developed system. Users wearing baggy/large tops or trousers when taking the measurements is one of the main reasons for the results deviating from the manual measurements. As an example, Figure 12 shows a body index image of a person wearing a baggy top and a pair of trousers, where the boundary of the user deviates from the actual user's body due to the dress. In addition, the dress may have wrinkles. The errors introduced by these factors during the x and z coordinate measurements will lead to deviations from the manual measurements. The

Figure 11: Virtual dressing room for 3D clothes simulation.

Table 2: Manual measurements and noncontact body parameter measurements (units in inches).

                           Length measurements                  Perimeter measurements
User                       Shoulder  Neck-hip  Arm    Inseam   Chest  Hip    Waist  Thigh  Knee
M1   Manual                17.00     30.50     25.50  29.75    32.25  37.50  31.00  22.00  14.50
     Application value     16.88     28.41     24.97  28.93    35.15  35.48  31.24  19.56  13.93
     Relative error (%)     0.71      6.85      2.08   2.76     8.99   5.39   0.77  11.09   3.93
M2   Manual                16.75     28.25     24.50  29.50    33.50  36.25  32.50  20.00  13.50
     Application value     16.69     28.50     25.02  28.99    34.51  36.55  30.86  21.78  14.59
     Relative error (%)     0.36      0.88      2.12   1.73     3.01   0.83   5.05   8.90   8.07
M3   Manual                17.00     26.50     22.50  31.50    32.00  33.25  27.50  18.00  13.50
     Application value     16.98     28.61     24.65  30.03    33.34  33.52  27.63  20.13  13.28
     Relative error (%)     0.12      7.96      9.56   4.67     4.19   0.81   0.47  11.83   1.63
M4   Manual                17.00     29.25     25.00  30.25    33.50  36.00  30.50  21.00  14.50
     Application value     16.58     29.85     25.18  30.89    31.68  36.19  30.00  19.93  14.02
     Relative error (%)     2.47      2.05      0.72   2.12     5.43   0.53   1.64   5.10   3.31
M5   Manual                16.00     29.25     26.50  30.75    32.50  32.50  27.00  18.25  13.25
     Application value     18.00     29.17     26.68  32.44    35.42  38.45  33.96  20.21  14.09
     Relative error (%)    12.50      0.27      0.68   5.50     8.98  18.31  25.78  10.74   6.34
M6   Manual                17.00     39.75     25.25  31.00    33.50  35.50  30.25  20.50  13.50
     Application value     17.06     35.67     26.21  31.94    32.38  35.48  27.41  21.87  18.10
     Relative error (%)     0.35     10.26      3.80   3.03     3.34   0.06   9.39   6.68  34.07
M7   Manual                19.50     30.50     26.50  29.50    40.75  43.25  40.25  24.00  19.25
     Application value     18.93     30.33     25.50  33.08    42.15  43.85  39.57  24.41  15.61
     Relative error (%)     2.92      0.56      3.77  12.14     3.44   1.39   1.69   1.71  18.91
M8   Manual                18.25     28.00     24.00  30.25    33.75  33.75  29.50  19.50  13.25
     Application value     17.09     29.45     25.40  29.81    33.81  34.94  28.25  19.19  13.25
     Relative error (%)     6.36      5.18      5.83   1.45     0.18   3.53   4.24   1.59   0.00
M9   Manual                17.50     27.25     24.00  28.75    34.50  38.75  30.25  22.00  14.75
     Application value     17.33     30.54     24.06  28.98    35.24  37.00  29.68  22.03  14.49
     Relative error (%)     0.97     12.07      0.25   0.80     2.14   4.52   1.88   0.14   1.76
M10  Manual                16.75     27.50     23.50  28.50    36.00  36.50  34.00  21.25  14.00
     Application value     17.27     27.78     23.66  28.90    34.34  38.84  29.01  19.54  13.66
     Relative error (%)     3.10      1.02      0.68   1.40     4.61   6.41  14.68   8.05   2.43
M11  Manual                17.75     29.75     26.00  30.75    32.00  36.00  30.00  19.50  13.00
     Application value     17.46     29.40     24.94  32.09    34.57  37.68  31.65  20.17  14.29
     Relative error (%)     1.63      1.18      4.08   4.36     8.03   4.67   5.50   3.44   9.92
Average error (%)           2.86      4.39      3.05   3.63     4.76   4.22   6.46   6.30   8.22


developed system considers the vertical blue lines shown in Figure 12 as the boundaries at the chest level, although the actual boundary at the chest level is indicated by the yellow lines. This error occurs due to the bagginess of the user's shirt, resulting in the application measurements deviating from the manual measurements.

Further, if the user maintains an improper T-pose, as shown in Figure 13, the corresponding x coordinates at the chest level may take a wider range because the boundary of the body index image then includes the hands. The vertical blue lines in Figure 13 are the boundary at the chest level. The horizontal blue line gives the x coordinates corresponding to the chest level, which is wider than the actual chest measurement.

In addition to the above practical errors, the Kinect cannot identify black shiny items using its inbuilt IR sensor array. As shown in Figure 14, the user wore a black belt, and that region of the body index image is identified as background. For this reason, some measurements can be incorrect even when the user is wearing the clothes required for the measuring application.

5. Conclusions

We used a single RGB-D Kinect sensor to obtain user body parameter measurements, including 3D measurements such as the perimeters of the chest, waist, hip, thigh, and knee, to develop an augmented reality virtual dressing room. The developed application successfully appends physics animation to the dress according to the physical movements made by the user, providing a realistic fitting experience. The performance evaluation reveals that a single depth sensor is applicable in real-time 3D cloth simulation with less than 10% average measurement error.

Data Availability

The data that support the findings of this study are openly available at https://gitlab.com/sasadara/vdr-documents.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors thank the Department of Physics, Faculty of Applied Sciences, University of Sri Jayewardenepura, Sri Lanka, for providing facilities for this study and for providing funding under university grant ASP/01/RE/SCI/2019/65. The authors would also like to thank Mr. Duleeka Gunatilake, Zone 24x7, Colombo, Sri Lanka, for fruitful discussion.

References

[1] Fitnect Interactive Kit, "3D virtual fitting dressing room/mirror," 2019, https://www.fitnect.hu.

[2] I. Styku, "Virtual fitting room and body scanning," 2019, https://www.styku.com.

Figure 12: The person wearing a too-baggy shirt and trousers.

Figure 13: Incorrect T-pose.

Figure 14: The person wearing a pure black belt.

[3] M. Beck and D. Crie, "I virtually try it … I want it! Virtual fitting room: a tool to increase on-line and off-line exploratory behavior, patronage and purchase intentions," Journal of Retailing and Consumer Services, vol. 40, pp. 279–286, 2018.

[4] S. I. A. P. Diddeniya, W. K. I. L. Wanniarachchi, P. R. S. De Silva, and N. C. Ganegoda, "Efficient office assistant robot system: autonomous navigation and controlling based on ROS," International Journal of Multidisciplinary Studies (IJMS), vol. 6, no. 1, 2019.

[5] S. I. A. P. Diddeniya, A. M. S. B. Adikari, H. N. Gunasinghe, P. R. S. De Silva, N. C. Ganegoda, and W. K. I. L. Wanniarachchi, "Vision based office assistant robot system for indoor office environment," in Proceedings of the 3rd International Conference on Information Technology Research, Moratuwa, Sri Lanka, December 2019.

[6] K. A. S. V. Rathnayake, W. K. I. L. Wanniarachchi, and W. H. K. P. Nanavakkara, "Human computer interaction system for impaired people by using Kinect motion sensor: voice and gesture integrated smart home," in Proceedings of the Second International Conference on Inventive Communication and Computational Technologies, Coimbatore, India, April 2018.

[7] Z. Zhang, M. Zhang, Y. Chang, E. S. Aziz, S. K. Esche, and C. Chassapis, "Collaborative virtual laboratory environments with hardware in the loop," in Cyber-Physical Laboratories in Engineering and Science Education, pp. 363–402, Springer, Cham, Switzerland, 2018.

[8] Y. M. Chang, E. S. S. Aziz, Z. M. Zhang, M. M. Zhang, and S. K. Esche, "Usability evaluation of a virtual educational laboratory platform," The ASEE Computers in Education (CoED) Journal, vol. 7, no. 1, p. 24, 2016.

[9] K. Khoshelham and S. O. Elberink, "Accuracy and resolution of Kinect depth data for indoor mapping applications," Sensors, vol. 12, no. 2, pp. 1437–1454, 2012.

[10] J. Ehara and H. Saito, "Texture overlay for virtual clothing based on PCA of silhouettes," in Proceedings of the 5th International Symposium on Mixed and Augmented Reality, IEEE Computer Society, Washington, DC, USA, pp. 139–142, October 2006.

[11] D. Protopsaltou, C. Luible, M. Arevalo-Poizat, and N. Magnenat-Thalmann, "A body and garment creation method for an internet-based virtual fitting room," in Proceedings of Computer Graphics International (CGI'02), Springer, Berlin, Germany, pp. 105–122, July 2002.

[12] C. Garcia Martin and E. Oruklu, "Human friendly interface design for virtual fitting room applications on Android based mobile devices," Journal of Signal and Information Processing, vol. 3, no. 4, pp. 481–490, 2012.

[13] S. Hauswiesner, M. Straka, and G. Reitmayr, "Virtual try-on through image-based rendering," IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 9, pp. 1552–1565, 2013.

[14] S. Giovanni, Y. C. Choi, and J. Huang, "Virtual try-on using Kinect and HD camera," in Motion in Games, Springer, Heidelberg, Germany, 2012.

[15] I. Pachoulakis and K. Kapetanakis, "Augmented reality platforms for virtual fitting rooms," The International Journal of Multimedia & Its Applications, vol. 4, no. 4, 2012.

[16] Bodymetrics, 2019, http://bodymetrics.com.

[17] Imagine That Technologies, 2011, http://www.imaginethattechnologies.com, August 2019.

[18] U. Gultepe and U. Gudukbay, "Real-time virtual fitting with body measurement and motion smoothing," Computers & Graphics, vol. 43, pp. 31–43, 2014.

[19] M. Kim and C. Kim, "Augmented reality fashion apparel simulation using a magic mirror," International Journal of Smart Home, vol. 9, no. 2, pp. 169–178, 2015.

[20] K. Yousef, B. Mohd, and M. AL-Omari, "Kinect-based virtual try-on system: a case study," in Proceedings of the IEEE Jordan International Joint Conference on Electrical Engineering and Information Technology, pp. 91–96, Amman, Jordan, April 2019.

[21] A. M. S. B. Adikari, N. G. C. Ganegoda, and W. K. I. L. Wanniarachchi, "Non-contact human body parameter measurement based on Kinect sensor," IOSR Journal of Computer Engineering, vol. 19, no. 3, pp. 80–85, 2017.

[22] JointType enumeration, https://docs.microsoft.com/en-us/previous-versions/windows/kinect/dn758662(v=ieb.10), August 2019.

[23] Coordinate mapping, https://docs.microsoft.com/en-us/previous-versions/windows/kinect/dn785530(v=ieb.10), August 2019.

[24] R. Synergy, "Sizing chart," 2019, http://www.rawsynergy.com/sizing-chart.

[25] M. Azure, "Face API: facial recognition software | Microsoft Azure," 2019, https://azure.microsoft.com/en-us/services/cognitive-services/face.

[26] Z. Bhati, A. Waqas, M. Karbasi, and A. W. Mahesar, "A wire parameter and reaction manager based biped character setup and rigging technique in 3Ds Max for animation," International Journal of Computer Graphics & Animation, vol. 5, no. 2, pp. 21–36, 2015.

[27] Unity Technologies, "Unity manual: using humanoid characters," 2019, https://docs.unity3d.com/Manual/UsingHumanoidChars.html.

[28] A. Feng, D. Casas, and A. Shapiro, "Avatar reshaping and automatic rigging using a deformable model," in Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games (MIG '15), pp. 57–64, New York, NY, USA, 2015.

[29] I. Baran and J. Popovic, "Automatic rigging and animation of 3D characters," ACM Transactions on Graphics, vol. 26, no. 3, p. 72, 2007.

[30] S. Adikari, "Batik Sarong 3D model: TurboSquid 1189246," 2019, https://www.turbosquid.com/FullPreview/Index.cfm/ID/1189246.

10 Advances in Human-Computer Interaction

Page 4: ApplicabilityofaSingleDepthSensorinReal-Time3DClothes Simulation:AugmentedReality…downloads.hindawi.com/journals/ahci/2020/1314598.pdf · 2019/10/14  · 3.Implementation edevelopedaugmentedreality-basedvirtualfittingroom

CoordinateMapper API) corresponding y positions in color space were calculated to visualize the chest, waist, hip, thigh, and knee levels, which are shown as horizontal red arrows in the body index image (Figure 5). To calculate the perimeter at any given y coordinate, the x and z coordinates along the defined line must be obtained in meters. The body index image shown in Figure 5 is used to obtain the relevant x pixel coordinates (color space) along the red arrow lines at each level, corresponding to the user's body.

The background of the depth image was removed by considering the body index image (Figure 5) while matching them to the correct coordinates with the help of the CoordinateMapper API. Using these virtual horizontal lines (as marked in Figure 6), which were bound to the user's body, x and z coordinates were obtained for the chest, waist, hip, thigh, and knee (see Figure 6). Accordingly, the relevant (x, y, z) camera space coordinates were obtained in meters. Then equation (7) was used to calculate the perimeter at each level, incorporating the front and rear sides of the user. The accuracy of the calculated perimeter values is further improved by using the average of two nearby horizontal lines at each position mentioned above. Figure 6 shows these lines with curved arrows at each level for the front view of the user's body.

\text{perimeter of a curve} = \sum_{n=1}^{m-1} \sqrt{\left(y_{p_n} - y_{p_{n+1}}\right)^2 + \left(x_{p_n} - x_{p_{n+1}}\right)^2 + \left(z_{p_n} - z_{p_{n+1}}\right)^2} \quad (7)

Figure 1: Augmented reality-based virtual fitting room overview (flowchart: the WPF body parameter measurement application collects depth and user information, gathers the user's body metrics, and saves the user's body parameters; the Unity3D 3D cloth overlaying application then performs gender identification, loads the saved body parameters, filters the available 3D models, collects depth and user information, and simulates real-time 3D overlaying).

Figure 2: The T-pose.

Figure 3: Lengths: (1) arm, (2) shoulder, (3) neck-hip, (4) inseam.

4 Advances in Human-Computer Interaction

where m is the maximum number of pixels for a given horizontal/vertical line in color space obtained from the body index image, p_n is the nth pixel, and p_{n+1} is the (n+1)th pixel. Precise measurements were obtained by taking the median value of each measurement for the front and rear sides. The median is used to remove outliers, which may occur due to unwanted pixels from the Kinect. Finally, the perimeter at each position was obtained by summing the relevant front and rear perimeters. Figure 7 shows the combined front and rear perimeters at the chest, waist, hip, thigh, and knee for a tested body (front view), after carrying out the necessary adjustment for aligning to a single vertical axis.
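As a concrete illustration, equation (7) is a sum of Euclidean distances between consecutive camera-space samples, with the front and rear curves measured separately and summed; a minimal Python sketch (function names are ours, not from the paper):

```python
import math

def curve_perimeter(points):
    # Equation (7): sum of Euclidean distances between consecutive
    # (x, y, z) camera-space samples along one side of the body.
    return sum(math.dist(points[n], points[n + 1])
               for n in range(len(points) - 1))

def level_perimeter(front_points, rear_points):
    # The paper measures the front and rear open curves separately
    # (captured facing towards and away from the sensor) and sums them.
    return curve_perimeter(front_points) + curve_perimeter(rear_points)
```

Each `math.dist` term corresponds to one square-root term of equation (7).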

The values obtained for the body parameters are used for model categorization. The developed WPF application for noncontact human body parameter measurement finally returns the calculated size category for tops. These measured parameters were injected into the Unity3D application to filter the 3D cloth models. The size category for tops was selected as per the size categories in Table 1. At this stage, we considered the largest perimeter value among the three perimeters relevant to the chest, waist, and hip to decide on the size category that best suits the customer.
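The size-selection step can be sketched in Python using the Table 1 ranges. The paper does not specify how boundary overlaps in the chart are resolved, so the rule below (pick the first category whose upper bound fits, then take the loosest of the three per-row categories) is our assumption:

```python
# Unisex size chart for tops from Table 1 (upper bound of each range, inches).
SIZES = ["XXS", "XS", "S", "M", "L", "XL", "XXL"]
UPPER = {
    "chest": [34, 36, 38, 41, 44, 47, 50],
    "waist": [25, 28, 31, 34, 39, 40, 43],
    "hip":   [33, 36, 39, 42, 45, 48, 51],
}

def category_index(measure, value):
    # First size whose upper bound accommodates the measured perimeter.
    for i, hi in enumerate(UPPER[measure]):
        if value <= hi:
            return i
    return len(SIZES) - 1  # larger than the chart: clamp to XXL

def size_category(chest, waist, hip):
    # Follow the paper's rule of letting the largest of the three
    # perimeters drive the decision: take the loosest per-row category.
    i = max(category_index("chest", chest),
            category_index("waist", waist),
            category_index("hip", hip))
    return SIZES[i]
```

For example, a 37 in chest with a 30 in waist and 38 in hip falls in category S on all three rows.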

3.2. Unity3D Application: 3D Cloth Overlaying Application. The recorded body parameters are injected into the 3D cloth overlaying application developed using the Unity3D game engine. Initially, according to the body parameter measurements, the suitable apparel category for the customer was identified as per Table 1. To identify the gender of the user, the Unity3D application uses a cognitive service [25] developed by Microsoft and hosted in Azure Cognitive Services. This machine-learning, vision-based service can return a confidence score for a person in an image using an internal database. Using the service, it can identify the gender along with a few other attributes such as emotion, pose, smile, and facial hair, together with 27 landmarks for each face in the image. In the WPF application, once the Kinect SDK completes user identification, an image frame containing the body is submitted to the cognitive service to identify the
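The paper does not show the service call itself. As an illustration, the Face API detect endpoint returns a JSON array with one object per detected face, from which the gender attribute can be read; the payload below is a hypothetical example, not the authors' data:

```python
import json

def gender_of_first_face(response_body):
    # Parse an Azure Face API /detect response (a JSON array, one
    # object per detected face) and return the gender attribute of
    # the first face, or None if no face was detected.
    faces = json.loads(response_body)
    if not faces:
        return None
    return faces[0]["faceAttributes"]["gender"]

# Hypothetical response for a single detected face.
sample = '[{"faceId": "abc123", "faceAttributes": {"gender": "male", "smile": 0.2}}]'
```

Returning None for an empty array lets the caller re-submit a later frame instead of failing.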

Figure 4: (a) Kinect V2 skeleton joints [22]; (b) the camera space coordinate system [23].

Figure 6: Complex 3D garment measurements: front perimeters at chest, waist, hip, thigh, knee, and wrist.

Figure 5: Background-removed body index image with front horizontal curves in depth space.


gender of the person. Then the 3D cloth models were filtered according to the gender and apparel size category. The system contains an internal 3D garment model database. This database contains all the 3D cloth models with tags that indicate the gender and the apparel size category. Once the gender was recognized by the service, the 3D garment models were filtered based on the gender and the size category, according to the availability of real clothes in the retail shop.
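The tag-based filtering described above amounts to a simple lookup over the model database; a sketch, assuming each model record carries "gender", "size", and "sku" tags (the tag names are our assumption; the paper only states that models are tagged with gender and size):

```python
def filter_models(catalog, gender, size, skus_in_stock):
    # Keep only garment models tagged with the detected gender and the
    # computed size category, restricted to items the retail shop
    # actually stocks.
    return [model for model in catalog
            if model["gender"] == gender
            and model["size"] == size
            and model["sku"] in skus_in_stock]
```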

Here, the 3D design software "3ds Max" was used for rigging and creating/editing the 3D cloth models [26]. Once rigged, the 3D model was imported into the Unity3D project. Then, using the configuration menu for the humanoid model, it was matched with the joints of the human model [27]. The resulting model can be converted into a controllable 3D humanoid model.

To create an animated or controllable model, the selected model needs to be rigged. This helps in animating the cloth model according to the skeleton movements. Several methods have been used in the literature to achieve this, such as automated rigging methods [28, 29]. Instead of using automated methods, we used a manual process to verify and build the skeleton on top of the cloth model. This is mainly due to the requirement of a humanoid avatar that matches the skeleton; here, we use only a cloth model instead of an avatar model. A rigged system can be created in two different ways with 3ds Max: the prebuilt "Biped" system and a customized "Bone IK Chain". We used the "Bone IK Chain" (Figure 8). This requires manual work, such as aligning the cloth model and the skeleton as if the cloth had been dressed on a real human body.

Then the necessary materials can be applied to the model as its material design (Figure 9). Subsequently, the cloth model is bound as the skin of the skeleton, which can be saved with the *.fbx [30] extension before it is imported into Unity3D as a humanoid cloth model. Finally, the rigged 3D cloth was bound to the skeleton provided by the Kinect device using Unity3D, where the Kinect device and the Kinect SDK for Unity3D were used to identify the person and retrieve depth information. The Kinect API was used for further manipulation and real-time overlaying. These clothes are populated on a slide-bar at the side of the screen; using a hand gesture (swiping left to right or vice versa), the user can select a different cloth model. The developed system also contains an auto-exit mechanism that closes the session after 10 seconds if no user is present in front of the Kinect device.
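The auto-exit behavior can be modeled as a per-frame idle check; a minimal sketch (class and method names are ours, not from the paper):

```python
class AutoExit:
    # Close the try-on session when no user has been tracked in front
    # of the sensor for `timeout` seconds (10 s in the paper).
    def __init__(self, timeout=10.0):
        self.timeout = timeout
        self.last_seen = None

    def update(self, now, user_present):
        # Call once per frame with the frame timestamp (seconds) and
        # whether the Kinect currently tracks a body; returns True
        # when the application should exit.
        if user_present or self.last_seen is None:
            self.last_seen = now
            return False
        return (now - self.last_seen) >= self.timeout
```

Resetting `last_seen` on every tracked frame means a user who briefly leaves and returns does not trigger the exit.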

4. Results and Discussion

The WPF application was developed to obtain noncontact body parameter measurements. This application gathers the necessary body parameters of the user who stands in front of the Kinect sensor. In this proposed work, complex 3D parameters such as the perimeters of the chest, waist, hip, thigh, and knee were considered for selecting the apparel size category and rigging the 3D model onto the user's body in real time. Hence, the application guides the user to maintain a T-pose facing towards and then away from the sensor, for 5 seconds each, in order to capture the necessary depth information. Figure 10 shows the developed WPF application

Figure 7: Complex 3D garment measurements: perimeters at chest, waist, hip, thigh, and knee (units in cm).

Table 1: Unisex size category for tops* [18]

         XXS     XS      S       M       L       XL      XXL
Chest    32-34   34-36   36-38   39-41   42-44   45-47   48-50
Waist    23-25   26-28   29-31   32-34   35-39   38-40   41-43
Hip      28-33   31-36   34-39   37-42   40-45   43-48   46-51
*All units in inches.

Figure 8: Aligned skeleton and cloth model.


user interface. After capturing the necessary 3D information, the required body parameters were calculated using equations (1)-(7). Then the apparel size category was derived. Finally, the overlaying application is launched with all measured parameters to filter the 3D cloth models based on gender.

The output of the developed system is presented in Figure 11. Here, the user can select and virtually try on the apparel available in the retail shop, filtered according to his or her body parameters. The items on the populated list can be selected using the gesture identification features integrated into the solution.

Table 2 presents a sample of human body parameter measurements obtained by the developed system and manually, for eleven males. Manual measurements were taken by a single person using a measuring tape. The error for each case was calculated as per the following equation:

\text{error (\%)} = \frac{\text{manual measurement} - \text{application measurement}}{\text{manual measurement}} \times 100 \quad (8)

Considering all outliers and errors, the average error of each measurement lies below 10%.
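As a quick check, equation (8) applied to user M1's shoulder row in Table 2 reproduces the reported 0.71% error (the absolute value reflects that Table 2 reports error magnitudes):

```python
def relative_error(manual, measured):
    # Equation (8): percentage error relative to the manual tape
    # measurement; magnitude only, as reported in Table 2.
    return abs(manual - measured) / manual * 100

# User M1, shoulder: 17 in manual vs. 16.88 in from the application.
print(round(relative_error(17.0, 16.88), 2))  # -> 0.71
```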

Figure 9: Adding material to the cloth-skinned rig.

Figure 10: WPF application for noncontact body parameter measurement (user interface listing the calculated body parameters in inches: height, neck-to-hip distance, inseam distance, arm length, shoulder length, and chest, waist, and hip circumferences, together with a T-pose status indicator and a Record control).


If we take 10% as the boundary value, then according to Table 2, user M6 has several extreme values while the other users show smaller error percentages. Several factors produce a high error percentage in some measurements obtained through the developed system. Users wearing baggy or loose tops and trousers when the measurements are taken is one of the main reasons for the results deviating from the manual measurements. As an example, Figure 12 shows a body index image of a person wearing a baggy top and trousers, where the boundary of the user deviates from the actual user's body due to the dress. In addition, the dress may have wrinkles. The errors introduced by these factors during the x and z coordinate measurements lead to deviations from the manual measurements. The

Figure 11: Virtual dressing room for 3D clothes simulation.

Table 2: Manual measurements and noncontact body parameter measurements (units in inches)

User  Row                 Shoulder  Neck to hip  Arm    Inseam  Chest  Hip    Waist  Thigh  Knee
M1    Manual              17        30.5         25.5   29.75   32.25  37.5   31     22     14.5
      Application value   16.88     28.41        24.97  28.93   35.15  35.48  31.24  19.56  13.93
      Relative error (%)  0.71      6.85         2.08   2.76    8.99   5.39   0.77   11.09  3.93
M2    Manual              16.75     28.25        24.5   29.5    33.5   36.25  32.5   20     13.5
      Application value   16.69     28.5         25.02  28.99   34.51  36.55  30.86  21.78  14.59
      Relative error (%)  0.36      0.88         2.12   1.73    3.01   0.83   5.05   8.9    8.07
M3    Manual              17        26.5         22.5   31.5    32     33.25  27.5   18     13.5
      Application value   16.98     28.61        24.65  30.03   33.34  33.52  27.63  20.13  13.28
      Relative error (%)  0.12      7.96         9.56   4.67    4.19   0.81   0.47   11.83  1.63
M4    Manual              17        29.25        25     30.25   33.5   36     30.5   21     14.5
      Application value   16.58     29.85        25.18  30.89   31.68  36.19  30     19.93  14.02
      Relative error (%)  2.47      2.05         0.72   2.12    5.43   0.53   1.64   5.1    3.31
M5    Manual              16        29.25        26.5   30.75   32.5   32.5   27     18.25  13.25
      Application value   18        29.17        26.68  32.44   35.42  38.45  33.96  20.21  14.09
      Relative error (%)  12.5      0.27         0.68   5.5     8.98   18.31  25.78  10.74  6.34
M6    Manual              17        39.75        25.25  31      33.5   35.5   30.25  20.5   13.5
      Application value   17.06     35.67        26.21  31.94   32.38  35.48  27.41  21.87  18.1
      Relative error (%)  0.35      10.26        3.8    3.03    3.34   0.06   9.39   6.68   34.07
M7    Manual              19.5      30.5         26.5   29.5    40.75  43.25  40.25  24     19.25
      Application value   18.93     30.33        25.5   33.08   42.15  43.85  39.57  24.41  15.61
      Relative error (%)  2.92      0.56         3.77   12.14   3.44   1.39   1.69   1.71   18.91
M8    Manual              18.25     28           24     30.25   33.75  33.75  29.5   19.5   13.25
      Application value   17.09     29.45        25.4   29.81   33.81  34.94  28.25  19.19  13.25
      Relative error (%)  6.36      5.18         5.83   1.45    0.18   3.53   4.24   1.59   0
M9    Manual              17.5      27.25        24     28.75   34.5   38.75  30.25  22     14.75
      Application value   17.33     30.54        24.06  28.98   35.24  37     29.68  22.03  14.49
      Relative error (%)  0.97      12.07        0.25   0.8     2.14   4.52   1.88   0.14   1.76
M10   Manual              16.75     27.5         23.5   28.5    36     36.5   34     21.25  14
      Application value   17.27     27.78        23.66  28.9    34.34  38.84  29.01  19.54  13.66
      Relative error (%)  3.1       1.02         0.68   1.4     4.61   6.41   14.68  8.05   2.43
M11   Manual              17.75     29.75        26     30.75   32     36     30     19.5   13
      Application value   17.46     29.4         24.94  32.09   34.57  37.68  31.65  20.17  14.29
      Relative error (%)  1.63      1.18         4.08   4.36    8.03   4.67   5.5    3.44   9.92
Average error (%)         2.86      4.39         3.05   3.63    4.76   4.22   6.46   6.3    8.22


developed system considers the vertical blue lines shown in Figure 12 as the boundaries at the chest level, although the actual boundary at the chest level is indicated by the yellow lines. This error occurs due to the bagginess of the user's shirt, resulting in the application measurements deviating from the manual measurements.

Further, if the user maintains an improper T-pose, as shown in Figure 13, the corresponding x coordinates at the chest level may span a wider range because the boundary of the body index image includes the hands. The vertical blue lines in Figure 13 mark the boundary at the chest level; the horizontal blue line gives the x coordinates corresponding to the chest level, which is wider than the actual chest measurement.

In addition to the above practical errors, the Kinect cannot identify black shiny items using its built-in IR sensor array. As shown in Figure 14, the user wore a black belt, which in the body index image is identified as background. For this reason, some measurements can be incorrect even when the user is wearing the clothes required by the measuring application.

5. Conclusions

We used a single RGB-D Kinect sensor to obtain user body parameter measurements, including 3D measurements such as the perimeters of the chest, waist, hip, thigh, and knee, to develop an augmented reality virtual dressing room. The developed application successfully appends physics animation to the dress according to the physical movements made by the user, providing a realistic fitting experience. The performance evaluation reveals that a single depth sensor is applicable to real-time 3D cloth simulation with less than 10% average measurement error.

Data Availability

The data that support the findings of this study are openly available at https://gitlab.com/sasadara/vdr-documents.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors thank the Department of Physics, Faculty of Applied Sciences, University of Sri Jayewardenepura, Sri Lanka, for providing facilities for this study and for funding under university grant ASP/01/RE/SCI/2019/65. The authors would also like to thank Mr. Duleeka Gunatilake, Zone24x7, Colombo, Sri Lanka, for fruitful discussions.

References

[1] Fitnect Interactive Kit, "3D virtual fitting dressing room/mirror," 2019, https://www.fitnect.hu.

[2] Styku, "Virtual fitting room and body scanning," 2019, https://www.styku.com.

[3] M. Beck and D. Crié, "I virtually try it … I want it! Virtual fitting room: a tool to increase on-line and off-line exploratory behavior, patronage and purchase intentions,"

Figure 12: The person wearing a too-baggy shirt and trousers.

Figure 13: Incorrect T-pose.

Figure 14: The person wearing a pure black belt.


Journal of Retailing and Consumer Services, vol. 40, pp. 279-286, 2018.

[4] S. I. A. P. Diddeniya, W. K. I. L. Wanniarachchi, P. R. S. De Silva, and N. C. Ganegoda, "Efficient office assistant robot system: autonomous navigation and controlling based on ROS," International Journal of Multidisciplinary Studies (IJMS), vol. 6, no. 1, 2019.

[5] S. I. A. P. Diddeniya, A. M. S. B. Adikari, H. N. Gunasinghe, P. R. S. De Silva, N. C. Ganegoda, and W. K. I. L. Wanniarachchi, "Vision based office assistant robot system for indoor office environment," in Proceedings of the 3rd International Conference on Information Technology Research, Moratuwa, Sri Lanka, December 2019.

[6] K. A. S. V. Rathnayake, W. K. I. L. Wanniarachchi, and W. H. K. P. Nanavakkara, "Human computer interaction system for impaired people by using Kinect motion sensor: voice and gesture integrated smart home," in Proceedings of the Second International Conference on Inventive Communication and Computational Technologies, Coimbatore, India, April 2018.

[7] Z. Zhang, M. Zhang, Y. Chang, E. S. Aziz, S. K. Esche, and C. Chassapis, "Collaborative virtual laboratory environments with hardware in the loop," in Cyber-Physical Laboratories in Engineering and Science Education, pp. 363-402, Springer, Cham, Switzerland, 2018.

[8] Y. M. Chang, E. S. S. Aziz, Z. M. Zhang, M. M. Zhang, and S. K. Esche, "Usability evaluation of a virtual educational laboratory platform," The ASEE Computers in Education (CoED) Journal, vol. 7, no. 1, p. 24, 2016.

[9] K. Khoshelham and S. O. Elberink, "Accuracy and resolution of Kinect depth data for indoor mapping applications," Sensors, vol. 12, no. 2, pp. 1437-1454, 2012.

[10] J. Ehara and H. Saito, "Texture overlay for virtual clothing based on PCA of silhouettes," in Proceedings of the 5th International Symposium on Mixed and Augmented Reality, pp. 139-142, IEEE Computer Society, Washington, DC, USA, October 2006.

[11] D. Protopsaltou, C. Luible, M. Arevalo-Poizat, and N. Magnenat-Thalmann, "A body and garment creation method for an internet-based virtual fitting room," in Proceedings of Computer Graphics International (CGI'02), pp. 105-122, Springer, Berlin, Germany, July 2002.

[12] C. Garcia Martin and E. Oruklu, "Human friendly interface design for virtual fitting room applications on Android based mobile devices," Journal of Signal and Information Processing, vol. 3, no. 4, pp. 481-490, 2012.

[13] S. Hauswiesner, M. Straka, and G. Reitmayr, "Virtual try-on through image-based rendering," IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 9, pp. 1552-1565, 2013.

[14] S. Giovanni, Y. C. Choi, and J. Huang, "Virtual try-on using Kinect and HD camera," in Motion in Games, Springer, Heidelberg, Germany, 2012.

[15] I. Pachoulakis and K. Kapetanakis, "Augmented reality platforms for virtual fitting rooms," The International Journal of Multimedia & Its Applications, vol. 4, no. 4, 2012.

[16] Bodymetrics, 2019, http://bodymetrics.com.

[17] Imagine That Technologies, 2011, http://www.imaginethattechnologies.com, August 2019.

[18] U. Gultepe and U. Gudukbay, "Real-time virtual fitting with body measurement and motion smoothing," Computers & Graphics, vol. 43, pp. 31-43, 2014.

[19] M. Kim and C. Kim, "Augmented reality fashion apparel simulation using a magic mirror," International Journal of Smart Home, vol. 9, no. 2, pp. 169-178, 2015.

[20] K. Yousef, B. Mohd, and M. AL-Omari, "Kinect-based virtual try-on system: a case study," in Proceedings of the IEEE Jordan International Joint Conference on Electrical Engineering and Information Technology, pp. 91-96, Amman, Jordan, April 2019.

[21] A. M. S. B. Adikari, N. G. C. Ganegoda, and W. K. I. L. Wanniarachchi, "Non-contact human body parameter measurement based on Kinect sensor," IOSR Journal of Computer Engineering, vol. 19, no. 3, pp. 80-85, 2017.

[22] JointType enumeration, https://docs.microsoft.com/en-us/previous-versions/windows/kinect/dn758662(v=ieb.10), August 2019.

[23] Coordinate mapping, https://docs.microsoft.com/en-us/previous-versions/windows/kinect/dn785530(v=ieb.10), August 2019.

[24] R. Synergy, "Sizing chart," 2019, http://www.rawsynergy.com/sizing-chart.

[25] Microsoft Azure, "Face API: facial recognition software," 2019, https://azure.microsoft.com/en-us/services/cognitive-services/face.

[26] Z. Bhati, A. Waqas, M. Karbasi, and A. W. Mahesar, "A wire parameter and reaction manager based biped character setup and rigging technique in 3Ds Max for animation," International Journal of Computer Graphics & Animation, vol. 5, no. 2, pp. 21-36, 2015.

[27] Unity Technologies, "Unity manual: using humanoid characters," 2019, https://docs.unity3d.com/Manual/UsingHumanoidChars.html.

[28] A. Feng, D. Casas, and A. Shapiro, "Avatar reshaping and automatic rigging using a deformable model," in Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games (MIG '15), pp. 57-64, New York, NY, USA, 2015.

[29] I. Baran and J. Popović, "Automatic rigging and animation of 3D characters," ACM Transactions on Graphics, vol. 26, no. 3, p. 72, 2007.

[30] S. Adikari, "Batik Sarong 3D model, TurboSquid 1189246," 2019, https://www.turbosquid.com/FullPreview/Index.cfm/ID/1189246.

10 Advances in Human-Computer Interaction

Page 5: ApplicabilityofaSingleDepthSensorinReal-Time3DClothes Simulation:AugmentedReality…downloads.hindawi.com/journals/ahci/2020/1314598.pdf · 2019/10/14  · 3.Implementation edevelopedaugmentedreality-basedvirtualfittingroom

where m is the maximum number of pixels for a givenhorizontalvertical line in color space obtained from thebody index image and pn is the nth pixel and pn+1 is the n+1thpixele precise measurements were obtained by taking themedian value of each measurement for the front and rearsides e median is used to remove the outliers which mayoccur due to the unwanted pixels of the Kinect Finally theperimeter at each position was obtained by summing therelevant front and rear perimeters Figure 7 shows thecombined front and rear perimeters at the chest waist hipthigh and knee for a tested body (front view) after carryingout the necessary adjustment for aligning to a single verticalaxis

e values obtained for body parameters are used formodel categorization e developed WPF application fornoncontact human body parameter measurement finallyreturns the calculated size category for tops ese measuredparameters were injected to the Unity3D application to filterthe 3D cloth models e size category for tops was selectedas per the size categories in Table 1 At this stage weconsidered the largest perimeter value among the threeperimeters relevant to the chest waist and hip to make adecision on the size category that best suits the customer

32 Unity3D Application-3D Cloth Overlaying Applicatione recorded body parameters are injected to the 3D clothoverlaying application developed using the Unity3D gameengine Initially according to the body parameter mea-surements the suitable category of the apparel for thecustomer was identified as per Table 1 To identify the genderof the user the Unity3D application uses cognitive service[25] developed by Microsoft and hosted in Azure cognitiveservices is machine learning vision-based service canreturn a confidence score level for a person in an image usingan internal database Using the service it can identify thegender with a few other parameters such as emotion posesmile and facial hair along with 27 landmarks for each face inthe image In the WPF application once the Kinect SDKcompletes user identification a frame of an image with thebody is submitted to the cognitive service to identify the

(a) (b)

Figure 4 (a) Kinect V2 skeleton joints [22] (b) e camera space coordinate system [23]

Figure 6 Complex 3D garment measurements front perimeters atchest waist hip thigh knee and wrist

Figure 5 Background removed body index image with fronthorizontal curve in depth space

Advances in Human-Computer Interaction 5

gender of the personen the 3D clothmodels were filteredaccording to the gender and apparel size category esystem contains an internal 3D garment model databaseis database contains all the 3D cloth models with tagswhich indicate the gender and the apparel size categoryOnce the gender was recognized by the service 3D garmentmodels were filtered based on gender and the size categoryaccording to the availability of real cloths in the retail shop

Here the cloth model has used 3D designed softwareldquo3D Maxrdquo for rigging and creatingediting 3D cloth models[26] Once it was rigged the 3D model was imported to theUnity3D projecten using the configuration menu for thehumanoid model it was matched with the joints of thehuman model [27] e resulting model can be convertedinto a controllable 3D humanoid model

To create an animated or controllable model the selectedmodel needs to be rigged is helps in animating the clothmodel according to the skeleton movements Severalmethods have been used in the literature to achieve this suchas automated rigging methods [28 29] Instead of usingautomated methods we used a manual process to verify and

build the skeleton on top of the cloth model is is mainlydue to the requirement of a humanoid avatar whichmatchesto the skeleton Here we use only a cloth model instead of anavatar model We can create a rigged system in two differentways with 3D Max prebacked ldquosystem bipedrdquo and cus-tomized ldquobone IK chainrdquo We used the ldquoBone IK Chainrdquo(Figure 8) is needs to be a manual work such as aligningthe cloth model and skeleton such it has been dressed to areal human body

en necessary materials can be applied to the model asthe material design of the model (Figure 9) Subsequentlythe cloth model is bound as the skin of the skeleton that canbe saved as lowastfbx [30] extension before it is imported to theUnity3D as a humanoid cloth model Finally the rigged 3Dcloth was bound to the skeleton provided by the Kinectdevice using Unity3D where the Kinect device and KinectSDK for Unity3D were used to identify the person andretrieve depth information e Kinect API was used forfurther manipulation and real-time overlaying ese clothsare populated on the side of the screen on a slide-bar Usingthe hand gesture (swapping left to right or vice versa) it canselect different cloth model e developed system alsocontains an autoexiting (10 seconds) mechanism if the useris not present in front of the Kinect device

4 Results and Discussion

e WPF application was developed to obtain noncontactbody parameter measurements is application gathers thenecessary body parameters of the user who stand in front ofthe Kinect sensor device In this proposed work the 3Dcomplex parameters such as perimeters of chest waist hipthigh and knee were considered for selecting the apparel sizecategory and rigging the 3Dmodel into the user body in real-time Hence the application guides the user to maintain aT-pose facing towards and away from the sensor for 5seconds each in order to capture the necessary depth in-formation Figure 10 shows the developed WPF application

Figure 7 Complex 3D garment measurements perimeters atchest waist hip thigh and knee (units in cm)

Table 1 Unisex size category for topslowast [18]

XXS XS S M L XL XXLChest 32ndash34 34ndash36 36ndash38 39ndash41 42ndash44 45ndash47 48ndash50Waist 23ndash25 26ndash28 29ndash31 32ndash34 35ndash39 38ndash40 41ndash43Hip 28ndash33 31ndash36 34ndash39 37ndash42 40ndash45 43ndash48 46ndash51lowastAll units in inches

Figure 8 Aligned skeleton and the cloth model

6 Advances in Human-Computer Interaction

user interface After capturing necessary 3D information thenecessary body parameters were calculated using equations(1)ndash(7) en the apparel size category was derived Finallythe overlaying application is launched with all measureparameters to filter the 3D cloth models based on gender

e output of the developed system is presented inFigure 11 Here the user can select and virtually try on theapparels available in the retail shop filtered according to hisor her body parameters e items on the population list canbe selected using the gesture identification features inte-grated into the solution

Table 2 presents the sample of human body parametermeasurements obtained by the developed system andmanually for eleven males Manual measurements weretaken by a single person using a measuring tape e errorfor each case was calculated as per the following equation

error (manualmeasurement minus applicationmeasurement)

(manualmeasurement)times 100

(8)

By considering all outliers and errors the average errorof each measurement will lie below 10

Figure 9 Adding material to the cloth skinned rig

7238

1905

2190

3239

2267

4426

4111

4315

Calculated body parameters

Height (inches)

Neck-to-hip distance (inches)

Inseam distance (inches)

Arm length (inches)

Shoulder length (inches)

Chest circumference (inches)

Waist circumference (inches)

Hip circumference (inches)

T-pose status rarr stay put

VDR

Record

Figure 10 WPF application-noncontact body parameter measurement

Advances in Human-Computer Interaction 7

If we take 10 as the boundary value according toTable 2 we can see the user M6 has several extreme valueswhile other users have lesser error percentages Severalreasons produce a high error percentage in some mea-surements obtained through the developed systeme userswearing baggylarge type topstrousers when taking themeasurements is one of the main reasons for deviating the

results from the manual measurements As an exampleFigure 12 shows a body index image of a person wearing abaggy top and a pair of trousers where the boundary of theuser is deviated from the actual userrsquos boy due to the dress Inaddition the dress may have wrinklese errors introducedby these factors during x and z coordinate measurementswill lead to deviations from the manual measurements e

Figure 11 Virtual dressing room for 3D clothes simulation

Table 2: Manual measurements and noncontact body parameter measurements (units in inches). Shoulder, neck-to-hip, arm, and inseam are length measurements; chest, hip, waist, thigh, and knee are perimeter measurements.

User                        Shoulder  Neck to hip  Arm    Inseam  Chest  Hip    Waist  Thigh  Knee
M1   Manual                 17        30.5         25.5   29.75   32.25  37.5   31     22     14.5
     Application value      16.88     28.41        24.97  28.93   35.15  35.48  31.24  19.56  13.93
     Relative error (%)     0.71      6.85         2.08   2.76    8.99   5.39   0.77   11.09  3.93
M2   Manual                 16.75     28.25        24.5   29.5    33.5   36.25  32.5   20     13.5
     Application value      16.69     28.5         25.02  28.99   34.51  36.55  30.86  21.78  14.59
     Relative error (%)     0.36      0.88         2.12   1.73    3.01   0.83   5.05   8.9    8.07
M3   Manual                 17        26.5         22.5   31.5    32     33.25  27.5   18     13.5
     Application value      16.98     28.61        24.65  30.03   33.34  33.52  27.63  20.13  13.28
     Relative error (%)     0.12      7.96         9.56   4.67    4.19   0.81   0.47   11.83  1.63
M4   Manual                 17        29.25        25     30.25   33.5   36     30.5   21     14.5
     Application value      16.58     29.85        25.18  30.89   31.68  36.19  30     19.93  14.02
     Relative error (%)     2.47      2.05         0.72   2.12    5.43   0.53   1.64   5.1    3.31
M5   Manual                 16        29.25        26.5   30.75   32.5   32.5   27     18.25  13.25
     Application value      18        29.17        26.68  32.44   35.42  38.45  33.96  20.21  14.09
     Relative error (%)     12.5      0.27         0.68   5.5     8.98   18.31  25.78  10.74  6.34
M6   Manual                 17        39.75        25.25  31      33.5   35.5   30.25  20.5   13.5
     Application value      17.06     35.67        26.21  31.94   32.38  35.48  27.41  21.87  18.1
     Relative error (%)     0.35      10.26        3.8    3.03    3.34   0.06   9.39   6.68   34.07
M7   Manual                 19.5      30.5         26.5   29.5    40.75  43.25  40.25  24     19.25
     Application value      18.93     30.33        25.5   33.08   42.15  43.85  39.57  24.41  15.61
     Relative error (%)     2.92      0.56         3.77   12.14   3.44   1.39   1.69   1.71   18.91
M8   Manual                 18.25     28           24     30.25   33.75  33.75  29.5   19.5   13.25
     Application value      17.09     29.45        25.4   29.81   33.81  34.94  28.25  19.19  13.25
     Relative error (%)     6.36      5.18         5.83   1.45    0.18   3.53   4.24   1.59   0
M9   Manual                 17.5      27.25        24     28.75   34.5   38.75  30.25  22     14.75
     Application value      17.33     30.54        24.06  28.98   35.24  37     29.68  22.03  14.49
     Relative error (%)     0.97      12.07        0.25   0.8     2.14   4.52   1.88   0.14   1.76
M10  Manual                 16.75     27.5         23.5   28.5    36     36.5   34     21.25  14
     Application value      17.27     27.78        23.66  28.9    34.34  38.84  29.01  19.54  13.66
     Relative error (%)     3.1       1.02         0.68   1.4     4.61   6.41   14.68  8.05   2.43
M11  Manual                 17.75     29.75        26     30.75   32     36     30     19.5   13
     Application value      17.46     29.4         24.94  32.09   34.57  37.68  31.65  20.17  14.29
     Relative error (%)     1.63      1.18         4.08   4.36    8.03   4.67   5.5    3.44   9.92
Average error (%)           2.86      4.39         3.05   3.63    4.76   4.22   6.46   6.3    8.22

8 Advances in Human-Computer Interaction

The developed system considers the vertical blue lines shown in Figure 12 as the boundaries at the chest level, although the actual boundary at the chest level is indicated by the yellow lines. This error occurs due to the bagginess of the user's shirt, resulting in the application measurements deviating from the manual measurements.

Further, if the user maintains an improper T-pose, as shown in Figure 13, the corresponding x coordinates at the chest level may span a wider range because the boundary of the body index image then includes the hands. The vertical blue lines in Figure 13 mark the boundary at the chest level. The horizontal blue line gives the x coordinates corresponding to the chest level, which is wider than the actual chest measurement.
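The chest-level boundary extraction discussed above can be illustrated with a minimal sketch (hypothetical code, not the authors' implementation): the Kinect body index image is treated as a binary mask, and the chest width is taken between the outermost body pixels on the chest row, which is exactly why baggy clothing or incorrectly positioned hands widen the detected boundary.

```python
# Sketch: estimating the chest-level boundary from a body-index row.
# Body pixels are 1, background pixels are 0 (an assumed encoding;
# the real Kinect BodyIndexFrame stores per-pixel body indices).

def chest_boundary(body_index_row):
    """Return (left, right) x coordinates of the outermost body pixels
    on one image row, or None if the row contains no body pixels."""
    xs = [x for x, v in enumerate(body_index_row) if v == 1]
    if not xs:
        return None
    return min(xs), max(xs)

# A tight-fitting silhouette ...
tight = [0, 0, 1, 1, 1, 1, 0, 0]
# ... versus a baggy shirt that extends the silhouette outward,
# widening the detected boundary as described for Figure 12.
baggy = [0, 1, 1, 1, 1, 1, 1, 0]

print(chest_boundary(tight))  # (2, 5)
print(chest_boundary(baggy))  # (1, 6)
```

The same widening occurs for an improper T-pose: if the arms dip into the chest row, their pixels become part of the mask and push the outermost body pixels further apart.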

In addition to the above practical errors, the Kinect cannot identify black shiny items using its inbuilt IR sensor array. As shown in Figure 14, the user wore a black belt, which in the body index image is identified as background. For this reason, some measurements can be incorrect even when the user is wearing the required clothes for the measuring application.

5. Conclusions

We used a single RGB-D Kinect sensor to obtain user body parameter measurements, including 3D measurements such as the perimeters of the chest, waist, hip, thigh, and knee, to develop an augmented reality virtual dressing room. The developed application successfully appends physics animation to the dress according to the physical movements made by the user, providing a realistic fitting experience. The performance evaluation reveals that a single depth sensor is applicable in real-time 3D cloth simulation with less than 10% average measurement error.

Data Availability

The data that support the findings of this study are openly available at https://gitlab.com/sasadara/vdr-documents.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors thank the Department of Physics, Faculty of Applied Sciences, University of Sri Jayewardenepura, Sri Lanka, for providing facilities for this study and for providing funding under the university grant ASP/01/RE/SCI/2019/65. The authors would also like to thank Mr. Duleeka Gunatilake, Zone 24x7, Colombo, Sri Lanka, for fruitful discussions.

References

[1] Fitnect Interactive Kit, "3D virtual fitting dressing room/mirror," 2019, https://www.fitnect.hu.

[2] Styku, "Virtual fitting room and body scanning," 2019, https://www.styku.com.

[3] M. Beck and D. Crié, "I virtually try it ... I want it! Virtual fitting room: a tool to increase on-line and off-line exploratory behavior, patronage and purchase intentions," Journal of Retailing and Consumer Services, vol. 40, pp. 279–286, 2018.

Figure 12: A person wearing a too-baggy shirt and trousers.

Figure 13: Incorrect T-pose.

Figure 14: A person wearing a pure black belt.

[4] S. I. A. P. Diddeniya, W. K. I. L. Wanniarachchi, P. R. S. De Silva, and N. C. Ganegoda, "Efficient office assistant robot system: autonomous navigation and controlling based on ROS," International Journal of Multidisciplinary Studies (IJMS), vol. 6, no. 1, 2019.

[5] S. I. A. P. Diddeniya, A. M. S. B. Adikari, H. N. Gunasinghe, P. R. S. De Silva, N. C. Ganegoda, and W. K. I. L. Wanniarachchi, "Vision based office assistant robot system for indoor office environment," in Proceedings of the 3rd International Conference on Information Technology Research, Moratuwa, Sri Lanka, December 2019.

[6] K. A. S. V. Rathnayake, W. K. I. L. Wanniarachchi, and W. H. K. P. Nanavakkara, "Human computer interaction system for impaired people by using Kinect motion sensor: voice and gesture integrated smart home," in Proceedings of the Second International Conference on Inventive Communication and Computational Technologies, Coimbatore, India, April 2018.

[7] Z. Zhang, M. Zhang, Y. Chang, E. S. Aziz, S. K. Esche, and C. Chassapis, "Collaborative virtual laboratory environments with hardware in the loop," in Cyber-Physical Laboratories in Engineering and Science Education, pp. 363–402, Springer, Cham, Switzerland, 2018.

[8] Y. M. Chang, E. S. S. Aziz, Z. M. Zhang, M. M. Zhang, and S. K. Esche, "Usability evaluation of a virtual educational laboratory platform," The ASEE Computers in Education (CoED) Journal, vol. 7, no. 1, p. 24, 2016.

[9] K. Khoshelham and S. O. Elberink, "Accuracy and resolution of Kinect depth data for indoor mapping applications," Sensors, vol. 12, no. 2, pp. 1437–1454, 2012.

[10] J. Ehara and H. Saito, "Texture overlay for virtual clothing based on PCA of silhouettes," in Proceedings of the 5th International Symposium on Mixed and Augmented Reality, IEEE Computer Society, Washington, DC, USA, pp. 139–142, October 2006.

[11] D. Protopsaltou, C. Luible, M. Arevalo-Poizat, and N. Magnenat-Thalmann, "A body and garment creation method for an internet-based virtual fitting room," in Proceedings of Computer Graphics International (CGI'02), Springer, Berlin, Germany, pp. 105–122, July 2002.

[12] C. Garcia Martin and E. Oruklu, "Human friendly interface design for virtual fitting room applications on Android based mobile devices," Journal of Signal and Information Processing, vol. 3, no. 4, pp. 481–490, 2012.

[13] S. Hauswiesner, M. Straka, and G. Reitmayr, "Virtual try-on through image-based rendering," IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 9, pp. 1552–1565, 2013.

[14] S. Giovanni, Y. C. Choi, and J. Huang, "Virtual try-on using Kinect and HD camera," in Motion in Games, Springer, Heidelberg, Germany, 2012.

[15] I. Pachoulakis and K. Kapetanakis, "Augmented reality platforms for virtual fitting rooms," The International Journal of Multimedia & Its Applications, vol. 4, no. 4, 2012.

[16] Bodymetrics, 2019, http://bodymetrics.com.

[17] Imagine That Technologies, 2011, http://www.imaginethattechnologies.com, August 2019.

[18] U. Gultepe and U. Gudukbay, "Real-time virtual fitting with body measurement and motion smoothing," Computers & Graphics, vol. 43, pp. 31–43, 2014.

[19] M. Kim and C. Kim, "Augmented reality fashion apparel simulation using a magic mirror," International Journal of Smart Home, vol. 9, no. 2, pp. 169–178, 2015.

[20] K. Yousef, B. Mohd, and M. AL-Omari, "Kinect-based virtual try-on system: a case study," in Proceedings of the IEEE Jordan International Joint Conference on Electrical Engineering and Information Technology, pp. 91–96, Amman, Jordan, April 2019.

[21] A. M. S. B. Adikari, N. G. C. Ganegoda, and W. K. I. L. Wanniarachchi, "Non-contact human body parameter measurement based on Kinect sensor," IOSR Journal of Computer Engineering, vol. 19, no. 3, pp. 80–85, 2017.

[22] JointType Enumeration, https://docs.microsoft.com/en-us/previous-versions/windows/kinect/dn758662(v=ieb.10), August 2019.

[23] Coordinate mapping, https://docs.microsoft.com/en-us/previous-versions/windows/kinect/dn785530(v=ieb.10), August 2019.

[24] R. Synergy, "Sizing chart," 2019, http://www.rawsynergy.com/sizing-chart.

[25] M. Azure, "Face API-facial recognition software | Microsoft Azure," 2019, https://azure.microsoft.com/en-us/services/cognitive-services/face.

[26] Z. Bhati, A. Waqas, M. Karbasi, and A. W. Mahesar, "A wire parameter and reaction manager based biped character setup and rigging technique in 3ds Max for animation," International Journal of Computer Graphics & Animation, vol. 5, no. 2, pp. 21–36, 2015.

[27] Unity Technologies, "Unity manual: using humanoid characters," 2019, https://docs.unity3d.com/Manual/UsingHumanoidChars.html.

[28] A. Feng, D. Casas, and A. Shapiro, "Avatar reshaping and automatic rigging using a deformable model," in Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games (MIG '15), pp. 57–64, New York, NY, USA, 2015.

[29] I. Baran and J. Popović, "Automatic rigging and animation of 3D characters," ACM Transactions on Graphics, vol. 26, no. 3, p. 72, 2007.

[30] S. Adikari, "Batik Sarong 3D model - TurboSquid 1189246," 2019, https://www.turbosquid.com/FullPreview/Index.cfm?ID=1189246.


gender of the person. Then, the 3D cloth models were filtered according to the gender and the apparel size category. The system contains an internal 3D garment model database. This database contains all the 3D cloth models with tags that indicate the gender and the apparel size category. Once the gender was recognized by the service, the 3D garment models were filtered based on the gender and the size category, according to the availability of real clothes in the retail shop.
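As a rough sketch of this filtering step (the tag names and data layout below are assumptions for illustration, not the actual database schema):

```python
# Sketch: filtering a tagged 3D garment catalogue by the recognized
# gender and the derived apparel size category.

def filter_models(models, gender, size):
    """Keep only the models tagged with the given gender and size."""
    return [m for m in models if m["gender"] == gender and m["size"] == size]

# Hypothetical catalogue entries; each model carries its tags.
catalogue = [
    {"name": "batik_sarong", "gender": "male", "size": "M"},
    {"name": "summer_dress", "gender": "female", "size": "M"},
    {"name": "formal_shirt", "gender": "male", "size": "L"},
]

print([m["name"] for m in filter_models(catalogue, "male", "M")])
# ['batik_sarong']
```

Only the filtered subset is then populated on the selection slide bar, so the user is offered clothes that both match their body and exist in the shop.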

Here, the 3D design software "3ds Max" was used for rigging and for creating/editing the 3D cloth models [26]. Once rigged, the 3D model was imported into the Unity3D project. Then, using the configuration menu for the humanoid model, it was matched with the joints of the human model [27]. The resulting model can be converted into a controllable 3D humanoid model.

To create an animated or controllable model, the selected model needs to be rigged. This helps in animating the cloth model according to the skeleton movements. Several methods, such as automated rigging, have been used in the literature to achieve this [28, 29]. Instead of using automated methods, we used a manual process to verify and build the skeleton on top of the cloth model. This is mainly due to the requirement of a humanoid avatar that matches the skeleton; here, we use only a cloth model instead of an avatar model. A rigged system can be created in two different ways in 3ds Max: the prebuilt "Biped" system and a customized "Bone IK Chain." We used the "Bone IK Chain" (Figure 8). This requires manual work, such as aligning the cloth model and the skeleton as if the cloth had been dressed on a real human body.

Then, the necessary materials can be applied to the model as its material design (Figure 9). Subsequently, the cloth model is bound as the skin of the skeleton and saved with the *.fbx [30] extension before being imported into Unity3D as a humanoid cloth model. Finally, the rigged 3D cloth was bound to the skeleton provided by the Kinect device using Unity3D, where the Kinect device and the Kinect SDK for Unity3D were used to identify the person and retrieve depth information. The Kinect API was used for further manipulation and real-time overlaying. These clothes are populated on a slide bar at the side of the screen. Using a hand gesture (swiping left to right or vice versa), the user can select a different cloth model. The developed system also contains an auto-exit (10 seconds) mechanism in case the user is not present in front of the Kinect device.

4. Results and Discussion

The WPF application was developed to obtain noncontact body parameter measurements. This application gathers the necessary body parameters of the user who stands in front of the Kinect sensor device. In this proposed work, the complex 3D parameters, such as the perimeters of the chest, waist, hip, thigh, and knee, were considered for selecting the apparel size category and rigging the 3D model onto the user's body in real time. Hence, the application guides the user to maintain a T-pose facing towards and away from the sensor for 5 seconds each, in order to capture the necessary depth information. Figure 10 shows the developed WPF application

Figure 7: Complex 3D garment measurements: perimeters at the chest, waist, hip, thigh, and knee (units in cm).

Table 1: Unisex size category for tops* [18].

        XXS     XS      S       M       L       XL      XXL
Chest   32–34   34–36   36–38   39–41   42–44   45–47   48–50
Waist   23–25   26–28   29–31   32–34   35–39   38–40   41–43
Hip     28–33   31–36   34–39   37–42   40–45   43–48   46–51

*All units in inches.

Figure 8: Aligned skeleton and the cloth model.


user interface. After capturing the necessary 3D information, the required body parameters were calculated using equations (1)–(7). Then, the apparel size category was derived. Finally, the overlaying application is launched with all the measured parameters to filter the 3D cloth models based on gender.

The output of the developed system is presented in Figure 11. Here, the user can select and virtually try on the apparels available in the retail shop, filtered according to his or her body parameters. The items on the population list can be selected using the gesture identification features integrated into the solution.

Table 2 presents a sample of human body parameter measurements obtained by the developed system and manually for eleven males. The manual measurements were taken by a single person using a measuring tape. The error for each case was calculated as per the following equation:

error (%) = (manual measurement − application measurement) / (manual measurement) × 100.    (8)

Considering all outliers and errors, the average error of each measurement lies below 10%.
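Equation (8) can be checked against the Table 2 rows with a few lines of Python (the sign is discarded here, since Table 2 reports error magnitudes):

```python
# Sketch: relative error of an application measurement against the
# manual tape measurement, as in equation (8).

def relative_error(manual, application):
    """Percentage error magnitude of the application value relative
    to the manual measurement."""
    return abs(manual - application) / manual * 100

# User M1's chest in Table 2: 32.25 in (manual) vs. 35.15 in (application).
print(round(relative_error(32.25, 35.15), 2))  # 8.99
```

The per-column averages in the last row of Table 2 (2.86% to 8.22%) are the means of these magnitudes over the eleven users, all below the 10% boundary.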

Figure 9: Adding material to the cloth skinned rig.

Figure 10: The WPF application for noncontact body parameter measurement. The screen lists the calculated body parameters in inches (height, neck-to-hip distance, inseam distance, arm length, shoulder length, and chest, waist, and hip circumferences), together with the T-pose status indicator ("stay put") and the VDR and Record controls.


[14] S Giovanni Y C Choi and J Huang ldquoVirtual try-on usingkinect and HD camerardquo Motion in Games Springer Hei-delberg Germany 2012

[15] I Pachoulakis and K Kapetanakis ldquoAugmented realityPlatforms for virtual fitting roomsrdquo1e International Journalof Multimedia amp Its Applications vol 4 no 4 2012

[16] Bodymetrics 2019 httpbodymetricscom[17] Imagine that technologies 2011 httpwww

imaginethattechnologiescom August 2019[18] U Gultepe and U Gudukbay ldquoReal-time virtual fitting with

body measurement and motion smoothingrdquo Computers ampGraphics vol 43 pp 31ndash43 2014

[19] M Kim and C Kim ldquoAugmented reality fashion apparelsimulation using a magic mirrorrdquo International Journal ofSmart Home vol 9 no 2 pp 169ndash178 2015

[20] K Yousef B Mohd and M AL-Omari ldquoKinect-based virtualtry-on system a case studyrdquo in Proceedings of the IEEE JordanInternational Joint Conference on Electrical Engineering andInformation Technology pp 91ndash96 Amman Jordan April2019

[21] A M S B Adikari N G C Ganegoda andW K I L Wanniarachchi ldquoNon-contact human body pa-rameter measurement based on Kinect sensorrdquo IOSR Journalof Computer Engineering vol 19 no 3 pp 80ndash85 2017

[22] JointType Enumeration httpsdocsmicrosoftcomen-usprevious-versionswindowskinectdn758662(v3dieb10 August 2019

[23] Coordinate mapping httpsdocsmicrosoftcomen-usprevious-versionswindowskinectdn785530 (vieb10) August 2019

[24] R Synergy ldquoSizing chartrdquo 2019 httpwwwrawsynergycomsizing-chart

[25] M Azure ldquoFace API-facial recognition software | Microsoftazurerdquo 2019 httpsazuremicrosoftcomen-usservicescognitive-servicesface

[26] Z Bhati A Waqas M Karbasi and A W Mahesar ldquoA wireparameter and reaction manager basedBiped character setupand rigging technique in 3Ds Max for animationrdquo Interna-tional Journal of Computer Graphics amp Animation vol 5no 2 pp 21ndash36 2015

[27] Unity Technologies ldquoUnity-manual using humanoidRCharactersrdquo 2019 httpsdocsunity3dcomManualUsingHumanoidCharshtml

[28] A Feng D Casas and A Shapiro ldquoAvatar reshaping andautomatic rigging using a deformable modelrdquo in Proceedingsof the 8th ACM SIGGRAPH Conference on Motion in Games(MIG rsquo15) pp 57ndash64 New York NY USA 2015

[29] I Baran and J Popovic ldquoAutomatic rigging and animation of3D charactersrdquo ACM Transactions on Graphics vol 26 no 3p 72 2007

[30] S Adikari ldquoBatik Sarong 3D model-TurboSquid 1189246rdquo2019 httpswwwturbosquidcomFullPreviewIndexcfmID1189246

10 Advances in Human-Computer Interaction

Page 8: ApplicabilityofaSingleDepthSensorinReal-Time3DClothes Simulation:AugmentedReality…downloads.hindawi.com/journals/ahci/2020/1314598.pdf · 2019/10/14  · 3.Implementation edevelopedaugmentedreality-basedvirtualfittingroom

If we take 10% as the boundary value, then according to Table 2 user M6 has several extreme values, while the other users show lower error percentages. Several factors produce a high error percentage in some measurements obtained through the developed system. Users wearing baggy or loose-fitting tops and trousers when the measurements are taken is one of the main reasons for the deviation of the results from the manual measurements. As an example, Figure 12 shows a body index image of a person wearing a baggy top and a pair of trousers, where the detected boundary deviates from the actual user's body because of the dress. In addition, the dress may have wrinkles. The errors introduced by these factors during the x- and z-coordinate measurements lead to deviations from the manual measurements.
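The perimeters discussed here are derived from the x (width) and z (depth) extents of the body at a given level. As a hedged illustration only, and not necessarily the authors' exact formula, a cross-section can be modeled as an ellipse and its perimeter estimated with Ramanujan's approximation; the width and depth values below are hypothetical:

```python
import math

# Hedged sketch: approximating a body perimeter (e.g., chest) from its x-extent
# (width) and z-extent (depth) by modeling the cross-section as an ellipse and
# applying Ramanujan's perimeter approximation. Illustrative only; the paper's
# exact perimeter formula is not reproduced here.
def ellipse_perimeter(width, depth):
    a, b = width / 2.0, depth / 2.0  # semi-axes from the full extents
    return math.pi * (3 * (a + b) - math.sqrt((3 * a + b) * (a + 3 * b)))

# Hypothetical chest extents: width 12 in, depth 9 in
print(round(ellipse_perimeter(12.0, 9.0), 2))  # 33.16
```

Errors in either extent (baggy clothing inflating the width, or depth noise) propagate directly into the estimated perimeter, which is consistent with the larger errors seen in the perimeter columns of Table 2.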

Figure 11 Virtual dressing room for 3D clothes simulation

Table 2: Manual measurements and noncontact body parameter measurements (units in inches). Shoulder, neck-to-hip, arm, and inseam are length measurements; chest, hip, waist, thigh, and knee are perimeter measurements.

User  Row                 Shoulder  Neck-hip  Arm    Inseam  Chest  Hip    Waist  Thigh  Knee
M1    Manual              17.00     30.50     25.50  29.75   32.25  37.50  31.00  22.00  14.50
      Application         16.88     28.41     24.97  28.93   35.15  35.48  31.24  19.56  13.93
      Relative error (%)   0.71      6.85      2.08   2.76    8.99   5.39   0.77  11.09   3.93
M2    Manual              16.75     28.25     24.50  29.50   33.50  36.25  32.50  20.00  13.50
      Application         16.69     28.50     25.02  28.99   34.51  36.55  30.86  21.78  14.59
      Relative error (%)   0.36      0.88      2.12   1.73    3.01   0.83   5.05   8.90   8.07
M3    Manual              17.00     26.50     22.50  31.50   32.00  33.25  27.50  18.00  13.50
      Application         16.98     28.61     24.65  30.03   33.34  33.52  27.63  20.13  13.28
      Relative error (%)   0.12      7.96      9.56   4.67    4.19   0.81   0.47  11.83   1.63
M4    Manual              17.00     29.25     25.00  30.25   33.50  36.00  30.50  21.00  14.50
      Application         16.58     29.85     25.18  30.89   31.68  36.19  30.00  19.93  14.02
      Relative error (%)   2.47      2.05      0.72   2.12    5.43   0.53   1.64   5.10   3.31
M5    Manual              16.00     29.25     26.50  30.75   32.50  32.50  27.00  18.25  13.25
      Application         18.00     29.17     26.68  32.44   35.42  38.45  33.96  20.21  14.09
      Relative error (%)  12.50      0.27      0.68   5.50    8.98  18.31  25.78  10.74   6.34
M6    Manual              17.00     39.75     25.25  31.00   33.50  35.50  30.25  20.50  13.50
      Application         17.06     35.67     26.21  31.94   32.38  35.48  27.41  21.87  18.10
      Relative error (%)   0.35     10.26      3.80   3.03    3.34   0.06   9.39   6.68  34.07
M7    Manual              19.50     30.50     26.50  29.50   40.75  43.25  40.25  24.00  19.25
      Application         18.93     30.33     25.50  33.08   42.15  43.85  39.57  24.41  15.61
      Relative error (%)   2.92      0.56      3.77  12.14    3.44   1.39   1.69   1.71  18.91
M8    Manual              18.25     28.00     24.00  30.25   33.75  33.75  29.50  19.50  13.25
      Application         17.09     29.45     25.40  29.81   33.81  34.94  28.25  19.19  13.25
      Relative error (%)   6.36      5.18      5.83   1.45    0.18   3.53   4.24   1.59   0.00
M9    Manual              17.50     27.25     24.00  28.75   34.50  38.75  30.25  22.00  14.75
      Application         17.33     30.54     24.06  28.98   35.24  37.00  29.68  22.03  14.49
      Relative error (%)   0.97     12.07      0.25   0.80    2.14   4.52   1.88   0.14   1.76
M10   Manual              16.75     27.50     23.50  28.50   36.00  36.50  34.00  21.25  14.00
      Application         17.27     27.78     23.66  28.90   34.34  38.84  29.01  19.54  13.66
      Relative error (%)   3.10      1.02      0.68   1.40    4.61   6.41  14.68   8.05   2.43
M11   Manual              17.75     29.75     26.00  30.75   32.00  36.00  30.00  19.50  13.00
      Application         17.46     29.40     24.94  32.09   34.57  37.68  31.65  20.17  14.29
      Relative error (%)   1.63      1.18      4.08   4.36    8.03   4.67   5.50   3.44   9.92
Avg.  Relative error (%)   2.86      4.39      3.05   3.63    4.76   4.22   6.46   6.30   8.22
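Each "relative error" row in Table 2 is the percentage deviation of the application's noncontact value from the manual tape measurement. A minimal check in Python, using values taken directly from the table:

```python
# Relative error (%) between a manual measurement and the application's
# noncontact value, as tabulated in Table 2 (all values in inches).
def relative_error_pct(manual, app):
    return abs(app - manual) / manual * 100

# User M1, chest perimeter: manual 32.25 in, application 35.15 in
print(round(relative_error_pct(32.25, 35.15), 2))  # 8.99, as in Table 2

# User M6, knee perimeter: manual 13.50 in, application 18.10 in
print(round(relative_error_pct(13.50, 18.10), 2))  # 34.07, the largest error
```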


The developed system considers the vertical blue lines shown in Figure 12 as the boundaries at the chest level, although the actual boundary at the chest level is indicated by the yellow lines. This error occurs because of the bagginess of the user's shirt, and it causes the application measurements to deviate from the manual measurements.

Further, if the user maintains an improper T-pose, as shown in Figure 13, the x coordinates at the chest level may take a wider range because the boundary of the body index image then includes the hands. The vertical blue lines in Figure 13 mark the detected boundary at the chest level, and the horizontal blue line gives the x-coordinate range corresponding to the chest level, which is wider than the actual chest measurement.
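The chest-level boundary extraction described above works on the body index frame, in which the Kinect V2 SDK marks tracked-body pixels with values 0-5 and background with 255. The following sketch (the row data are hypothetical) illustrates why hands falling on the chest row widen the detected x-range:

```python
# Hedged sketch: extracting the left/right body boundary at chest level from
# one row of a Kinect V2 body index frame (0-5 = tracked body, 255 = background).
# If the hands lie on the chest row (improper T-pose), they belong to the body
# mask, so the detected x-range is wider than the true chest width.

def chest_boundaries(body_index_row):
    """Return (left_x, right_x) of body pixels in one image row, or None."""
    body_xs = [x for x, v in enumerate(body_index_row) if v != 255]
    if not body_xs:
        return None
    return min(body_xs), max(body_xs)

# Hypothetical 512-pixel rows: torso only vs. torso plus hands at chest height
torso_only = [255] * 200 + [0] * 60 + [255] * 252                 # chest at x = 200..259
with_hands = ([255] * 120 + [0] * 10 + [255] * 70 + [0] * 60
              + [255] * 30 + [0] * 10 + [255] * 212)              # hands at the edges

print(chest_boundaries(torso_only))  # (200, 259) -> width 60 px
print(chest_boundaries(with_hands))  # (120, 299) -> hands widen the range
```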

In addition to the above practical errors, the Kinect cannot identify black, shiny items with its inbuilt IR sensor array. As shown in Figure 14, the user wore a black belt, and that region of the body index image is identified as background. For this reason, some measurements can be incorrect even when the user is wearing the required clothes for the measuring application.
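Because an IR-absorbing belt produces a background gap inside the silhouette, one simple safeguard, an assumption of ours rather than a feature described in the paper, is to flag rows whose body pixels split into multiple runs and skip them when measuring:

```python
# Hedged sketch: a black, IR-absorbing belt returns no depth, so its pixels are
# classified as background (255) *inside* the body silhouette. Rows whose body
# pixels form more than one contiguous run can be flagged as unreliable.

def body_runs(row):
    """Count contiguous runs of body pixels (value != 255) in one row."""
    runs, in_run = 0, False
    for v in row:
        if v != 255 and not in_run:
            runs, in_run = runs + 1, True
        elif v == 255:
            in_run = False
    return runs

# Hypothetical waist-level row: the belt splits the body into two runs
waist_row = [255] * 200 + [0] * 20 + [255] * 15 + [0] * 25 + [255] * 252

print(body_runs(waist_row))  # 2 -> gap inside the silhouette, skip this row
```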

5. Conclusions

We used a single RGB-D Kinect sensor to obtain user body parameter measurements, including 3D measurements such as the perimeters of the chest, waist, hip, thigh, and knee, to develop an augmented reality virtual dressing room. The developed application successfully appends physics animation to the dress according to the physical movements made by the user, providing a realistic fitting experience. The performance evaluation reveals that a single depth sensor is applicable to real-time 3D cloth simulation with less than 10% average measurement error.

Data Availability

The data that support the findings of this study are openly available at https://gitlab.com/sasadara/vdr-documents.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors thank the Department of Physics, Faculty of Applied Sciences, University of Sri Jayewardenepura, Sri Lanka, for providing facilities for this study and for funding under university grant ASP/01/RE/SCI/2019/65. The authors would also like to thank Mr. Duleeka Gunatilake, Zone 24x7, Colombo, Sri Lanka, for fruitful discussions.

Figure 12: A person wearing a too-baggy shirt and trousers.

Figure 13: An incorrect T-pose.

Figure 14: A person wearing a pure black belt.

References

[1] Fitnect Interactive Kit, "3D virtual fitting dressing room/mirror," 2019, https://www.fitnect.hu.

[2] Styku, "Virtual fitting room and body scanning," 2019, https://www.styku.com.

[3] M. Beck and D. Crié, "I virtually try it, I want it! Virtual fitting room: a tool to increase on-line and off-line exploratory behavior, patronage and purchase intentions," Journal of Retailing and Consumer Services, vol. 40, pp. 279–286, 2018.

[4] S. I. A. P. Diddeniya, W. K. I. L. Wanniarachchi, P. R. S. De Silva, and N. C. Ganegoda, "Efficient office assistant robot system: autonomous navigation and controlling based on ROS," International Journal of Multidisciplinary Studies (IJMS), vol. 6, no. 1, 2019.

[5] S. I. A. P. Diddeniya, A. M. S. B. Adikari, H. N. Gunasinghe, P. R. S. De Silva, N. C. Ganegoda, and W. K. I. L. Wanniarachchi, "Vision based office assistant robot system for indoor office environment," in Proceedings of the 3rd International Conference on Information Technology Research, Moratuwa, Sri Lanka, December 2019.

[6] K. A. S. V. Rathnayake, W. K. I. L. Wanniarachchi, and W. H. K. P. Nanavakkara, "Human computer interaction system for impaired people by using Kinect motion sensor: voice and gesture integrated smart home," in Proceedings of the Second International Conference on Inventive Communication and Computational Technologies, Coimbatore, India, April 2018.

[7] Z. Zhang, M. Zhang, Y. Chang, E. S. Aziz, S. K. Esche, and C. Chassapis, "Collaborative virtual laboratory environments with hardware in the loop," in Cyber-Physical Laboratories in Engineering and Science Education, pp. 363–402, Springer, Cham, Switzerland, 2018.

[8] Y. M. Chang, E. S. S. Aziz, Z. M. Zhang, M. M. Zhang, and S. K. Esche, "Usability evaluation of a virtual educational laboratory platform," The ASEE Computers in Education (CoED) Journal, vol. 7, no. 1, p. 24, 2016.

[9] K. Khoshelham and S. O. Elberink, "Accuracy and resolution of Kinect depth data for indoor mapping applications," Sensors, vol. 12, no. 2, pp. 1437–1454, 2012.

[10] J. Ehara and H. Saito, "Texture overlay for virtual clothing based on PCA of silhouettes," in Proceedings of the 5th International Symposium on Mixed and Augmented Reality, IEEE Computer Society, Washington, DC, USA, pp. 139–142, October 2006.

[11] D. Protopsaltou, C. Luible, M. Arevalo-Poizat, and N. Magnenat-Thalmann, "A body and garment creation method for an internet-based virtual fitting room," in Proceedings of Computer Graphics International (CGI '02), Springer, Berlin, Germany, pp. 105–122, July 2002.

[12] C. Garcia Martin and E. Oruklu, "Human friendly interface design for virtual fitting room applications on Android based mobile devices," Journal of Signal and Information Processing, vol. 3, no. 4, pp. 481–490, 2012.

[13] S. Hauswiesner, M. Straka, and G. Reitmayr, "Virtual try-on through image-based rendering," IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 9, pp. 1552–1565, 2013.

[14] S. Giovanni, Y. C. Choi, and J. Huang, "Virtual try-on using Kinect and HD camera," in Motion in Games, Springer, Heidelberg, Germany, 2012.

[15] I. Pachoulakis and K. Kapetanakis, "Augmented reality platforms for virtual fitting rooms," The International Journal of Multimedia & Its Applications, vol. 4, no. 4, 2012.

[16] Bodymetrics, 2019, http://bodymetrics.com.

[17] Imagine That Technologies, 2011, http://www.imaginethattechnologies.com, August 2019.

[18] U. Gultepe and U. Gudukbay, "Real-time virtual fitting with body measurement and motion smoothing," Computers & Graphics, vol. 43, pp. 31–43, 2014.

[19] M. Kim and C. Kim, "Augmented reality fashion apparel simulation using a magic mirror," International Journal of Smart Home, vol. 9, no. 2, pp. 169–178, 2015.

[20] K. Yousef, B. Mohd, and M. AL-Omari, "Kinect-based virtual try-on system: a case study," in Proceedings of the IEEE Jordan International Joint Conference on Electrical Engineering and Information Technology, pp. 91–96, Amman, Jordan, April 2019.

[21] A. M. S. B. Adikari, N. G. C. Ganegoda, and W. K. I. L. Wanniarachchi, "Non-contact human body parameter measurement based on Kinect sensor," IOSR Journal of Computer Engineering, vol. 19, no. 3, pp. 80–85, 2017.

[22] JointType enumeration, https://docs.microsoft.com/en-us/previous-versions/windows/kinect/dn758662(v=ieb.10), August 2019.

[23] Coordinate mapping, https://docs.microsoft.com/en-us/previous-versions/windows/kinect/dn785530(v=ieb.10), August 2019.

[24] R. Synergy, "Sizing chart," 2019, http://www.rawsynergy.com/sizing-chart.

[25] Microsoft Azure, "Face API - facial recognition software | Microsoft Azure," 2019, https://azure.microsoft.com/en-us/services/cognitive-services/face/.

[26] Z. Bhati, A. Waqas, M. Karbasi, and A. W. Mahesar, "A wire parameter and reaction manager based Biped character setup and rigging technique in 3Ds Max for animation," International Journal of Computer Graphics & Animation, vol. 5, no. 2, pp. 21–36, 2015.

[27] Unity Technologies, "Unity Manual: Using Humanoid Characters," 2019, https://docs.unity3d.com/Manual/UsingHumanoidChars.html.

[28] A. Feng, D. Casas, and A. Shapiro, "Avatar reshaping and automatic rigging using a deformable model," in Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games (MIG '15), pp. 57–64, New York, NY, USA, 2015.

[29] I. Baran and J. Popović, "Automatic rigging and animation of 3D characters," ACM Transactions on Graphics, vol. 26, no. 3, p. 72, 2007.

[30] S. Adikari, "Batik Sarong 3D model - TurboSquid 1189246," 2019, https://www.turbosquid.com/FullPreview/Index.cfm/ID/1189246.

Advances in Human-Computer Interaction
