Wireless Communication Architectures Based on Data Aggregation for Internal Monitoring of
Large-Scale Wind Turbines
Mohamed A. Ahmed 1 and Young-Chon Kim 2,*
1 Division of Electronics and Information Engineering, Chonbuk National University, Jeonju 561-756, Korea; E-Mail: [email protected]
2 Department of Computer Engineering, Wind Energy Grid-Adaptive Technology Research Center, Smart Grid Research Center, Chonbuk National University, Jeonju 561-756, Korea; E-Mail: [email protected]
* Correspondence: [email protected]
Abstract: Wireless networks are regarded as a promising candidate for condition monitoring
systems of large-scale wind turbines due to the design flexibility, easy deployment, and reduced
installation cost. This paper investigates different data aggregation approaches of wireless-based
architectures for the internal monitoring of a large-scale wind turbine. The main objective is to
construct a wireless internal network inside the wind turbine nacelle, for collecting sensing data
from different parts and transmitting the data to a remote control center through a wireless
external network that serves the turbine towers of the wind farm. The proposed wireless network
architectures consist of wireless sensor nodes, coordinator nodes, and a Front-End device. The
designs of the wireless-based architectures involve choices of physical components, sensor types,
sampling rate and data rate. WiFi is a promising technology that is considered for the wind turbine internal network in this work. Through simulations, the network performance is evaluated in terms of end-to-end delay for the different data aggregation approaches of the wireless-based architectures.
Keywords: Wind turbine; Condition monitoring system; Wireless network; WiFi; OPNET
1. Introduction
The installed capacity of wind energy is growing very fast compared with other sources of
renewable energy [1]. This progress in the wind power industry has enabled wind turbines (WTs)
to become more mature, with higher capacities, larger rotor diameters, and taller towers. However,
recent studies have shown that as WTs become larger, the failure rate becomes higher, and thus
more maintenance is needed [2]. There are different condition monitoring solutions used for monitoring WTs, such as supervisory control and data acquisition (SCADA), condition monitoring systems (CMS), and structural health monitoring (SHM). These solutions have been used
for real-time monitoring and control of WTs in order to increase the availability, decrease downtime
and reduce operation and maintenance costs [3]. The SCADA system is designed to provide data
about the operating condition of WTs. It provides signal values averaged over approximately 5-10 minute intervals from sensors measuring temperature, vibration, power, current, etc. Compared to CMS, SCADA systems deploy a small number of sensor nodes (SNs) with a lower sampling rate. Furthermore, SCADA does not collect all the information necessary for condition monitoring
purposes [1]. The CMS is designed to provide data about the health of the WT with sampling rates
in the kHz range. It consists of SNs and a data acquisition system. The main functions of CMS
include data acquisition, data transmission and data processing [4]. There are different monitoring
techniques used in WTs such as vibration analysis, acoustic analysis, ultrasonic testing, oil analysis,
strain measurement, and thermography [2]. Regardless of the technique used, the CMS capacity depends on the number of SNs, the types of sensors, and the data processing techniques.
The communication network of a wind farm (WF) can be divided into two parts: the WT
internal network and the WF external network. Both wired-based and wireless-based solutions may
be considered for either the internal or external network. For a wired-based solution, the authors of [5, 6] designed a data acquisition system and CMS for a WT internal network. The CMS was installed on an operating WT (Vestas V47) and consists of two main units: the data acquisition unit in the nacelle and the data storage unit in the tower. Cables connect the different sensor types to the data acquisition unit. The critical parameters monitored by
the WT internal network were vibration, rotor speed, current, voltage, temperature, wind speed,
wind direction, humidity, and pressure. In [7], a communication network design for an offshore WF
was given that had an Ethernet-based architecture. The authors described the design requirements
and network topology for a large-scale WF (the Great Gabbard wind project) in the United
Kingdom. In [8], a novel CMS was developed and installed at the Yeungheung WF, South Korea for
three WTs supplied by different manufacturers. The communication network for the CMS also has
an Ethernet-based architecture. The CMS has been installed in a WT nacelle and operated
successfully.
For a wireless-based solution, the authors in [9] presented the design of a wireless-based
SCADA system for a WF. In [10], the authors studied the feasibility of applying broadband wireless technology (WiMAX) for monitoring WFs. The network topology is based on a wireless mesh configuration. The communication network consists of a communication controller at
[11], the authors designed a remote monitoring system for WTs, where the internal network
consisted of a ZigBee-based wireless sensor network and was connected to the control center by a
GPRS module. In [12], the authors implemented a data transmission interface for remote monitoring and control of a small-scale WT based on a ZigBee wireless sensor network. After long-term practical operation, the system proved to be successful and stable. ZigBee supports data rates of up to 250 kbps with low power consumption, and it supports different network topologies (star, tree and mesh) that can provide a coverage range of 10-100 m [13]. In
[14], we recently designed and implemented a condition monitoring and control system for a small-scale WT. It consists of data collection units, a control unit, and a
coordinator. The data collection units are installed to collect sensing data from different sensors
such as temperature, humidity, pressure, wind direction, wind speed, and vibration. ZigBee-based
wireless is used to communicate between a coordinator and data collection units.
One of the recent developments in CMS for WTs is to design a smart, wireless and energy-efficient sensor network where wired-based solutions cannot be installed [1]. Wireless technology could provide a viable solution that minimizes deployment difficulties and cost, and it is more flexible than wired networks. In this work, WiFi is considered a promising candidate for the WT internal network. The deployment of a wireless local area network (WLAN) provides high-speed
point-to-point and point-to-multipoint communication. It also offers easier installation, greater flexibility, and lower cost. The selection of WiFi for a WT communication network is based on Ref. [15], in which the author designed and implemented a lifetime monitoring system for a real WT, where the monitoring modules in the hub and the nacelle were connected through WLAN (IEEE 802.11 b/g). The field tests of the prototype installation in the WT worked successfully despite the harsh environment, and the tests showed that the communication was stable and reliable.
The main contribution of this work is a design for wireless-based architectures for the internal
condition monitoring of a large-scale WT that is based on the International Electrotechnical
Commission (IEC) 61400-25-6 standard. The proposed network model consists of wireless SNs,
coordinator nodes (CNs), and a Front-End (FE) device. Appropriate choices for physical
components, sensor types, sampling rate and data rate have been considered in order to meet the
requirements for aggregating sensing data wirelessly from a WT internal network. We conducted
network modeling and simulation using OPNET Modeler.
2. Communication Network for Wind Turbine
2.1. IEC 61400-25 Standard
A wind farm is composed of three main components: WTs, the electrical system and the
meteorological system. IEC 61400-25 is a standard for the monitoring and control of the WF. It is an
extension of the IEC 61850 standard that is used for communication networks in electric substations. According to IEC 61400-25-6 (LN classes and Data classes for Condition Monitoring), WT
condition monitoring is defined as the process of observing WT components or structures for a
period of time to detect early signs of failure [16]. There are different elements of WT condition monitoring, including vibration, oil debris, temperature, strain gauges and acoustics. Figure 1 shows the scope of the IEC 61400-25-6 standard, which consists of two main parts: local condition monitoring and central condition monitoring. The local condition monitoring part is located in the WT, while the central condition monitoring part is in the control center.
Figure 1. Condition monitoring and control of WTs based on IEC 61400-25-6.
2.2. Conventional Condition Monitoring System of WT
The conventional local CMS inside a WT consists of SNs, a data acquisition unit, and a local computer. Many SNs are placed at different parts to measure critical parameters such as temperature, vibration, wind speed, wind direction, etc. The SNs periodically generate monitoring data that are forwarded from the data acquisition units to the wind turbine controller (WTC) for temporary storage. The CMS is located in the turbine nacelle and connected to the main WTC. The WTC is able to manage a variety of open and proprietary communication protocols such as Field-bus, industrial Ethernet, etc. In addition, the WTC exchanges control information for pitch control to change the blade angle and for the yaw drives, based on the wind direction and speed. Figure 2 shows the conventional wired-based communication network architecture for the CMS inside a WT. The WTC is located at the base of the tower and is connected to an Ethernet switch for WF networking. Furthermore, it exchanges data between the WT and the control center [17].
Figure 2. Conventional wired-based communication network architecture for CMS inside WT.
We consider the IEC 61400-25 standard to define the WT logical nodes (LNs), including WROT, WTRM, WGEN, WCNV, WTRF, WNAC, WYAW, WTOW and WMET, as given in Table 1. The SNs of a
WT could be classified into different categories such as mechanical measurements (rotor speed,
pitch angle, oil level, temperature, displacement, vibration and torque), electrical measurements
(voltage, current, power, power factor, frequency), meteorological data (wind speed, wind direction,
temperature, humidity, pressure) and status information. Table 2 shows the detailed types of data
and attributes of the rotor. In this work, we defined the sensor types and the number of SNs for the WT based on the IEC 61400-25-2 standard.
Table 1. Logical nodes of a wind turbine.
LN Classes Description
WROT Wind turbine rotor information
WTRM Wind turbine transmission information
WGEN Wind turbine generator information
WCNV Wind turbine converter information
WTRF Wind turbine transformer information
WNAC Wind turbine nacelle information
WYAW Wind turbine yawing information
WTOW Wind turbine tower information
WMET Wind power plant meteorological information
Table 2. Types of data and attributes of the rotor (WROT).
Data                 Attribute Name Explanation
Analogue information RotSpd         Rotor speed at rotor side
                     RotPos         Angular rotor position
                     HubTmp         Temperature in the rotor hub
                     PtHyPresBl     Pressure of hydraulic pitch system for blade
                     PtAngValBl     Pitch angle for blade
Status information   RotSt          Status of rotor
                     BlStBl         Status of blade
                     PtCtlSt        Status of pitch control
Control information  BlkRot         Set rotor to blocked position
                     PtEmgChk       Check emergency pitch system
3. Proposed Wireless-Based Architectures for Wind Turbine
3.1. Wireless Network Architecture
The proposed wireless-based architecture for a WF consists of two levels: the WT internal network and the WF external network, as shown in Figure 3. There are three main parts of a WT: the nacelle, the tower, and the foundation. The CMS is located in the WT nacelle for collecting sensing data from different parts. The WT internal network consists of many SNs attached to different WT parts such as the gearbox, generator, transformer, etc. The internal network covers a small area; the main function of this level is local monitoring inside the WT. A wireless FE device is located in the nacelle for collecting and processing the WT monitoring data, and it communicates with the control center through a wireless external network. The WT internal network could be implemented using conventional wired technology (Ethernet) or wireless technology (WiFi, ZigBee, Bluetooth).
Figure 3. Proposed wireless-based communication network architecture for WF.
The main function of the WF external network level is remote monitoring of the WF. It enables the
control center to exchange monitoring data and control information remotely with WTs. The WF
external network requires longer range communication between WTs and the control center,
compared to the distances over which the internal network operates. The control center categorizes the received monitoring data and stores them in servers. In the proposed
wireless-based solution, a base station is located at the control center. The base station enables the
exchange of uplink monitoring data (from WTs to the control center) and downlink control data
(from the control center to WTs). There are two types of data transmission for the WF external network, namely direct transmission (single hop) and multi-hop transmission. In the direct transmission configuration, WTs transmit data directly to the base station at the control center (point-to-multipoint topology). This configuration is suitable for WFs with a small number of WTs; however, with a large number of WTs, the links between the WTs and the base station may become congested. In the multi-hop configuration, WTs transmit data to the base station of the control center over multiple hops. In both uplink and downlink, intermediate nodes called relay nodes operate as aggregators and relays, connecting the WTs and the control center. Multi-hop transmission reduces the number of direct links from the WTs to the base station and provides additional coverage compared to single-hop transmission [10].
3.2. Network Model and Data Aggregation Approach
Figure 4 shows a schematic diagram illustrating a WT as a cyber-physical system. It consists of
two parts: the physical level and the network level. Based on the IEC 61400-25-6 standard, the WT
physical level consists of nine LNs: rotor (WROT), transmission (WTRM), generator (WGEN),
converter (WCNV), transformer (WTRF), nacelle (WNAC), yaw (WYAW), tower (WTOW), and
meteorological (WMET) as shown in equation (1).
WTLN = {WROT, WTRM, WGEN, WCNV, WTRF, WNAC, WYAW, WTOW, WMET} (1)
The network level comprises three different types of nodes: SNs, CNs, and an FE device. The main function of an SN is sensing, while the function of a CN is to aggregate data from wireless sensor nodes over a single hop or multiple hops. The FE device aggregates data from the CNs and communicates with the control center through a wireless external network. Each WT part is represented by an LN, as shown in Figure 4, and each LN involves different types of SNs.
Figure 4. Schematic diagram illustrating WT as a cyber-physical system.
A sensor node SNi can be identified by the physical location (LN), sensor type (ST) and sensor
identifier (SID) using equation (2).
𝑆𝑁𝑖 = {𝐿𝑁, 𝑆𝑇 , 𝑆𝐼𝐷} (2)
There are several types of SNs inside WT such as temperature (Tmp), speed (Spd), pressure
(Pres), torque (Torq), voltage (V), current (A), power factor (PF), etc. The types of sensors ST(SNi)
can be defined as in (3).
ST(SNi) ⊆ {Tmp, Spd, Pres, Torq, V, A, PF, … . } (3)
In order to identify SNi, each sensor is assigned a unique identifier (SID), where T is the total number of sensors, as given in equation (4).
SID(SNi) = i , i ∈ {1, 2, …, T} (4)
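Equations (2)-(4) can be illustrated with a short sketch; the Python names below are hypothetical and only mirror the notation in the text.

```python
from typing import NamedTuple

class SensorNode(NamedTuple):
    """Sensor node identity per equation (2): SNi = {LN, ST, SID}."""
    ln: str   # logical node (physical location), e.g. "WGEN"
    st: str   # sensor type, e.g. "Tmp" for temperature
    sid: int  # unique sensor identifier, 1 <= sid <= T (equation (4))

# Example: temperature sensor number 5, mounted on the generator
sn = SensorNode(ln="WGEN", st="Tmp", sid=5)
print(sn)  # SensorNode(ln='WGEN', st='Tmp', sid=5)
```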
We define the sampling rate and the number of channels in order to calculate the data size generated by each sensor node. The sample size is assumed to be 2 bytes (16 bits), based on [18]. The data rate at the application layer can be calculated according to equation (5).
𝐷𝑎𝑡𝑎 𝑟𝑎𝑡𝑒 = 2 × 𝑁𝑐 × 𝑓𝑠 (5)
where Nc and fs represent the number of channels and sampling frequency, respectively.
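As a quick check of equation (5), the per-sensor rates in Table 4 can be reproduced; the function name and the three-channel assumption for the voltage sensor are ours, not stated in the text.

```python
def app_data_rate(num_channels: int, sampling_hz: int, sample_bytes: int = 2) -> int:
    """Application-layer data rate in bytes/sec, per equation (5): 2 x Nc x fs."""
    return sample_bytes * num_channels * sampling_hz

# Assuming a three-channel voltage sensor sampled at 2048 Hz:
print(app_data_rate(3, 2048))  # 12288 bytes/sec, the voltage entry in Table 4
# A single-channel pressure sensor sampled at 100 Hz:
print(app_data_rate(1, 100))   # 200 bytes/sec, the pressure entry in Table 4
```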
We considered different configurations for aggregating sensing data wirelessly from the WT internal network. The four approaches to data aggregation inside the WT are based on the physical components, the sensor types, the sampling frequency and the data rate. Data aggregation aims to collect and forward sensing data from the different SNs toward the FE device. The physical SNs are installed inside the WT, and we used the concept of virtual nodes (VNs) to organize the physical SNs into multiple groups, as shown in Figure 5.
Figure 5. Mapping of physical sensor devices to the logical network counterpart.
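The mapping of physical SNs into VN groups can be sketched as a generic aggregation by key, covering Cases (1), (3) and (4) below; the record layout and helper name are illustrative assumptions.

```python
from collections import defaultdict

def group_into_vns(sensors, key):
    """Group physical sensor nodes into virtual-node groups by an aggregation key."""
    vns = defaultdict(list)
    for sn in sensors:
        vns[key(sn)].append(sn)
    return dict(vns)

# Hypothetical sensor records: (logical node, sensor type, sampling rate in Hz)
sensors = [
    ("WGEN", "Tmp", 1), ("WGEN", "V", 2048),
    ("WROT", "Spd", 3), ("WNAC", "Tmp", 1),
]

by_component = group_into_vns(sensors, key=lambda s: s[0])  # Case (1): physical components
by_sampling  = group_into_vns(sensors, key=lambda s: s[2])  # Case (3): sampling rate
by_type      = group_into_vns(sensors, key=lambda s: s[1])  # Case (4): sensor type
print(sorted(by_sampling))  # [1, 3, 2048]
```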
3.2.1. Case (1): Data Aggregation Based on Physical Components
There are nine LNs representing the nine parts of the WT based on the IEC 61400-25-6 standard. Each LN consists of one CN and many SNs. There is direct communication between the SNs and the CN for collecting sensing data. In addition, there is direct communication between the nine CNs and the FE unit, as shown in Figure 6.
Figure 6. Data aggregation based on the physical components.
3.2.2. Case (2): Data Aggregation Based on Data Rate
In this case, the WT is represented by three LNs and one virtual node (VN). The three LNs correspond to the sources of heavy data traffic inside the WT, namely WGEN, WCNV, and WTRF. The VN aggregates the six LNs with light data traffic: WROT, WTRM, WNAC, WYAW, WTOW, and WMET.
Figure 7 shows the data aggregation based on the data rate, where there is a direct communication
between the four CNs and the FE unit.
Figure 7. Data aggregation based on the data rate.
3.2.3. Case (3): Data Aggregation Based on Sampling Rate
The measurements are classified based on the sampling rate into eight VNs as follows: 1 Hz (temperature, power factor, humidity, oil level, and status), 3 Hz (speed, pitch angle, wind direction, and wind speed), 5 Hz (power), 10 Hz (displacement and frequency), 50 Hz (torque), 100 Hz (pressure), 200 Hz (vibration) and 2048 Hz (voltage and current). There is direct communication between the eight VNs and the FE unit, as shown in Figure 8.
Figure 8. Data aggregation based on the sampling frequency.
3.2.4. Case (4): Data Aggregation Based on Sensor Type
There are different types and numbers of SNs. The measurements are classified based on sensor
types into 17 VNs as follows: temperature (16 SNs), speed (3 SNs), pressure (7 SNs), wind direction
(3 SNs), wind speed (3 SNs), pitch angle (6 SNs), vibration (2 SNs), voltage (12 SNs), power (2 SNs),
power factor (2 SNs), current (6 SNs), oil level (4 SNs), frequency (1 SN), displacement (2 SNs),
torque (1 SN), humidity (3 SNs), and status information (29 SNs). In this case, there is direct communication between the 17 VNs and the FE unit, as shown in Figure 9. Table 3 compares
different data aggregation approaches for the WT internal network and Table 4 shows the
measurement requirements for different SNs inside the WT.
Figure 9. Data aggregation based on the sensor type.
Table 3. Different configurations of data aggregation for the WT internal network.
Case     Data Aggregation LN VN SN  CN
Case (1) Physical Level   9  -  102 9
Case (2) Data Rate        3  1  102 4
Case (3) Sampling Rate    -  8  102 8
Case (4) Sensor Type      -  17 102 17
Table 4. Measurement requirements for sensor data in the WT.
Measurement Data Transmission (bytes/sec) Number of SNs Total Traffic (bytes/sec)
Temperature 2 16 32
Speed 6 3 18
Pressure 200 7 1,400
Pitch Angle 6 6 36
Vibration 1200 2 2,400
Voltage 12288 12 147,456
Current 12288 6 73,728
Power 10 2 20
Power Factor 2 2 4
Humidity 2 3 6
Wind Direction 6 3 18
Wind Speed 6 3 18
Displacement 40 2 80
Oil Level 2 4 8
Frequency 20 1 20
Torque 300 1 300
Status 2 29 58
Total 102 225,602
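The totals in Table 4 can be verified directly from the per-measurement rows; the dictionary below simply restates the table.

```python
# (bytes/sec per SN, number of SNs) for each measurement in Table 4
measurements = {
    "Temperature": (2, 16), "Speed": (6, 3), "Pressure": (200, 7),
    "Pitch Angle": (6, 6), "Vibration": (1200, 2), "Voltage": (12288, 12),
    "Current": (12288, 6), "Power": (10, 2), "Power Factor": (2, 2),
    "Humidity": (2, 3), "Wind Direction": (6, 3), "Wind Speed": (6, 3),
    "Displacement": (40, 2), "Oil Level": (2, 4), "Frequency": (20, 1),
    "Torque": (300, 1), "Status": (2, 29),
}

total_sns = sum(n for _, n in measurements.values())
total_traffic = sum(rate * n for rate, n in measurements.values())
print(total_sns, total_traffic)  # 102 225602
```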
From a practical point of view, data aggregation could be implemented with different wired or wireless technologies. Wired connections can be used to collect measurements from the SNs and transmit the data to the FE unit; likewise, wireless technology can collect measurements from the sensor nodes through a wireless access point (AP). Taking a wired-based solution as an example, the authors in [5] designed a data acquisition system and CMS for a WT internal network in which wired connections were used between the different sensor types and the data acquisition unit. The parameters were divided into two groups based on the sampling rate: low speed (50 Hz) and high speed (20 kHz). The same concept is used in Case (3) of our work for data aggregation based on the sampling rate.
4. Network Modeling and Simulation
4.1. Assumptions
A set of real WT dimensions was used to build our network model. The dimensions correspond to an offshore WT prototype (HQ 5500) installed on Jeju Island, South Korea, in February 2014. The WT capacity is 5.5 MW, which is the largest turbine size installed in South Korea. The hub height is 100 m and the rotor diameter is 140 m. The network configuration in OPNET considers a field of 200 m × 200 m, with a tower height of 100 m and nacelle dimensions of 12 m × 4 m [19]. The simulation assumptions are given in Table 5.
Table 5. Simulation assumptions.
Parameter Value
Nacelle Area 12 m × 4 m
Tower Height 100 m
Technology WiFi
WiFi data rate 54, 24, 11 Mbps
Simulation time 10 min
The IEC technical committee (TC 88) has published the IEC 61400-25 standard, which comprises six parts. Part 4 (IEC 61400-25-4) presents the conceptual architecture of multiple mappings for the communication profile, as shown in Figure 10. Both the physical and link layers are outside the scope of the standard. The transmission control protocol/internet protocol (TCP/IP) suite is the basic protocol stack supported by all mapping profiles [20]. Different communication network technologies (wired/wireless) could be considered. The focus of this research work is on IEEE 802.11 (WiFi) using OPNET simulation.
Figure 10. Illustration of protocol stack (a) Communication profiles based on IEC 61400-25-4 (b)
Protocol stack of FTP.
A WT is an autonomous system that can continue to operate even if the communication link between the WT and the control center is unavailable. As explained in Section 2, each WT is equipped with a WTC that handles and stores data from the SNs. Data packets are received at the WTC at different time intervals, such as 100 ms, 1 s and 10 s. The WTC processes the sensing data for internal control purposes and provides the data to the SCADA system. The measurements are cached and transmitted in bulk at different frequencies. The update rate is at least once per second, and the maximum transfer rate depends on the channel capacity and network technology [21, 22].
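The cache-and-bulk-transfer behaviour described above can be sketched as follows; the class and method names are hypothetical and are not taken from the standard or from the OPNET model.

```python
import time

class MeasurementCache:
    """Buffer sensor samples at the WTC and flush them in bulk at a fixed
    update interval (at least once per second, per the text above)."""

    def __init__(self, flush_interval_s: float = 1.0):
        self.flush_interval_s = flush_interval_s
        self.buffer = []
        self.last_flush = time.monotonic()

    def on_sample(self, sample):
        """Cache one sample; return a batch when the flush interval has elapsed."""
        self.buffer.append(sample)
        if time.monotonic() - self.last_flush >= self.flush_interval_s:
            return self.flush()
        return None  # still caching

    def flush(self):
        """Hand the cached batch to the transmitter and reset the buffer."""
        batch, self.buffer = self.buffer, []
        self.last_flush = time.monotonic()
        return batch
```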
The IEC 61400-25 standard does not provide any specific communication timing requirements for the WT communication network. In this work, we considered the communication timing requirements for electric substation automation based on the IEEE 1646 standard, as shown in Table 6 [23]. The focus of this work is the WT internal network (between the SNs and the WTC). The WF external network (the connection between the WT and the control center) is outside the scope of this work.
Table 6. Communication timing requirements for different applications based on IEEE 1646
standard.
Information type Internal External
Monitoring and control 16 ms 1 s
Protection 4 ms 8–12 ms
Operation and maintenance 1 ms 10 s
Figure 11. OPNET model of a sensor node based on Wi-Fi.
Figure 11 shows the internal structure of a wireless SN (wlan_wkstn_adv) used for the WiFi-based architectures. The application layer is concerned with SN information such as the message type, message size and data interval. The file transfer protocol (FTP) is selected to transfer the data between the SNs and the FE unit.
4.2. WiFi-based Internal Network for WT
Figure 12 shows the WiFi-based architecture of the WT internal network using 102 SNs. Four
different scenarios are configured for data aggregation. In OPNET, the network model is
implemented using wireless workstations (wlan_wkstn_adv) and access points
(wlan_ethernet_slip4_adv). Figure 13 shows the network configuration of the WiFi-based architecture for Case (1). The communication network is configured in infrastructure mode and consists of nine basic service sets (BSSs). The FE unit is connected to the nine access points (APs) using wired links (Fast Ethernet). The following metrics have been considered for the performance evaluation:
Server FTP traffic received (bytes/sec): the average bytes per second forwarded to all FTP applications by the transport layers in the server node.
End-to-end (ETE) delay (sec): the amount of time for data to be delivered from the source (SNs) to the destination (FE device) along the communication path.
Wireless LAN delay (sec): the end-to-end delay of all packets received by the wireless LAN MACs of all WLAN nodes in the network and forwarded to the higher layer. This delay includes the medium access delay at the source MAC, the reception of all fragments individually, and the transfer of frames via the AP, if access point functionality is enabled.
Wireless LAN data dropped (bits/sec): the total size of the higher-layer data packets (in bits/sec) dropped by all WLAN MACs in the network.
Figure 12. WiFi-based OPNET model of WT internal network (No aggregation - 102 sensor nodes).
Figure 13. WiFi-based OPNET model of WT internal network for Case 1: 9LNs.
4.2.1. Traffic Received Results
The data traffic for the WiFi-based network architecture for Case 1 and Case 2 is given as an example in Table 7 and Table 8, respectively. Each SN is assigned a BSS identifier to transmit its traffic profile to the allocated AP. Both the SN and the AP are configured with the same BSS-ID to ensure that the sensor data from different logical/virtual nodes do not interfere with each other. The simulation results showed that the amount of traffic received at the FE unit agrees with our calculations for all cases. The traffic received from different sensor nodes, such as current, displacement, frequency and humidity, is shown in Figure 14a. The initial peak in the curves is due to network initialization (lasting approximately 12 seconds). When the network reaches steady state, the received traffic is stable. The total traffic is approximately 225,602 bytes/sec, as shown in Figure 14b.
Table 7. Data traffic for WiFi-based architecture (Case 1).
WT Internal Network Number of SNs BSS-ID Total Traffic (bytes/sec)
Case (1): 9 LNs
WROT 16 1 642
WTRM 3 2 2,828
WGEN 7 3 73,764
WCNV 6 4 74,060
WYAW 2 5 220
WTOW 12 6 8
WNAC 6 7 112
WTRF 2 8 73,740
WMET 2 9 228
Total 102 225,602
Figure 14. Amount of received traffic for the WiFi-based architecture. (a) Individual SNs in Case 1. (b) Total traffic received in Case 1.
Table 8. Data traffic for WiFi-based architecture (Case 2).
WT Internal Network Number of SNs BSS-ID Total Traffic (bytes/sec)
Case (2): 3 LNs, 1 VN
WGEN 14 3 73,764
WCNV 14 4 74,060
WTRF 12 8 73,740
6-LNs-Data 62 1 4,038
Total 102 225,602
4.2.2. Delay Results
The performance of the WT internal network (without data aggregation) is evaluated in terms of ETE delay for direct data transmission from the SNs to the FE device. The SNs are modeled as wireless workstations and the FE device is modeled as a wireless server. Figure 15 compares the wireless ETE delay of 102 SNs using data rates of 54, 24 and 11 Mbps. As expected, the higher the data rate, the lower the ETE delay. At 11 Mbps, the maximum ETE delay is approximately 96.151 ms, while at 54 Mbps it is approximately 40.763 ms. At 24 Mbps, the maximum and average ETE delays are 49.409 ms and 46.621 ms, respectively. Table 9 lists the maximum and average ETE delay for the different data rates.
Figure 15. ETE delay of WiFi-based architecture with 102 SNs (no aggregation).
Table 9. ETE delay for different WiFi-based architectures (No aggregation).
WiFi-based Internal Network Maximum Average
11 Mbps 96.151 ms 92.425 ms
24 Mbps 49.409 ms 46.621 ms
54 Mbps 40.763 ms 38.590 ms
We configured four scenarios for the different data aggregation approaches using a data rate of 54 Mbps. The WT internal network comprises nine APs in Case (1), four APs in Case (2), eight APs in Case (3), and 17 APs in Case (4). Figure 16 shows the ETE delay for the different WiFi-based architectures. Case (1) achieves the lowest ETE delay compared with the other scenarios; its maximum and average ETE delays are 14.834 ms and 13.694 ms, respectively. Case (3) achieves the highest ETE delay, with maximum and average values of 28.762 ms and 26.854 ms, respectively. Figure 17 shows the data dropped (in bits/sec) for the WLAN versus the simulation time. The direct data transmission scenario shows a higher drop rate due to buffer overflow compared to the aggregation scenarios. If a large number of SNs is connected directly to an AP or the FE device, congestion, collisions and buffer overflow occur and the delay increases. Table 10 lists the maximum and average ETE delay for the different network configurations. Considering the communication timing requirements for monitoring and control given in Table 6, the network model of Case (1) satisfies the timing requirements with an ETE delay of less than 16 ms. No data packets were dropped due to buffer overflow in Cases 1 and 4.
The results in Figure 17 show a high data drop rate due to buffer overflow for direct data transmission (the no-aggregation scenario) compared to the aggregation scenarios. When a buffer overflows, we expect the latency to increase. In Case (1) and Case (4), no data packets were dropped due to buffer overflow, which can be confirmed by comparing the simulation results of Figure 17 with the ETE delays in Figure 16. Case (2) and Case (3) have more data dropped due to buffer overflow, which results in a higher ETE delay.
Figure 16. ETE delay of WiFi-based architecture for Cases 1, 2, 3 and 4.
Figure 17. Data dropped due to buffer overflow of WiFi-based architecture for Cases 1, 2, 3 and 4.
Table 10. ETE delay for different WiFi-based architectures (54 Mbps).
WiFi-based Internal Network Maximum Average
Case (1) 9 LNs 14.834 ms 13.694 ms
Case (2) 4 VNs 26.986 ms 24.735 ms
Case (3) 8 VNs 28.762 ms 26.854 ms
Case (4) 17 VNs 19.558 ms 18.231 ms
5. Conclusions
In this paper, we investigated the design, simulation and evaluation of different data aggregation approaches for wireless-based architectures for a WT internal network. The proposed wireless network consists of SNs, CNs and an FE device. WiFi is the promising technology considered in this work. Without aggregation, the maximum ETE delay was approximately 96.151 ms at a data rate of 11 Mbps and approximately 40.763 ms at 54 Mbps. Four approaches were considered for data aggregation inside the WT, based on the physical components, sensor types, sampling rate and data rate. Case (1) achieves the lowest ETE delay compared with the other data aggregation approaches, with maximum and average ETE delays of 14.834 ms and 13.694 ms, respectively. A significant reduction in ETE delay was achieved using the data aggregation approaches compared with direct wireless data transmission.
Acknowledgments: This work was supported by the Brain Korea 21 PLUS and the National Research
Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (2010-0028509).
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Takoutsing, P.; Wamkeue, R.; Ouhrouche, M.; Slaoui-Hasnaoui, F.; Tameghe, T.; Ekemb, G. Wind Turbine
Condition Monitoring: State-of-the-Art Review, New Trends, and Future Challenges. Energies 2014, 7,
2595–2630.
2. García Márquez, F. P.; Tobias, A. M.; Pinar Pérez, J. M.; Papaelias, M. Condition monitoring of wind
turbines: Techniques and methods. Renew. Energy 2012, 46, 169–178.
3. Singh, B. K.; Coulter, J.; Sayani, M. A. G.; Sami, S. M.; Khalid, M.; Tepe, K. E. Survey on communication
architectures for wind energy integration with the smart grid. Int. J. Environ. Stud. 2013, 70, 765–776.
4. Smarsly, K.; Law, K.H.; Hartmann, D. A cyber infrastructure for integrated monitoring and
life-cycle management of wind turbines. In 20th International Workshop on Intelligent Computing in
Engineering, Vienna, Austria, 2013; pp. 1–10.
5. Ferguson, D.; Catterson, V.M.; Booth, C.; Cruden, A. Designing wind turbine condition monitoring
systems suitable for harsh environments. In 2nd IET Renewable Power Generation Conference (RPG 2013);
Institution of Engineering and Technology, 2013; pp. 1–4.
6. Swiszcz, G.; Cruden, A.; Booth, C.; Leithead, W. A data acquisition platform for the development of a
wind turbine condition monitoring system. In 2008 International Conference on Condition Monitoring and
Diagnosis; IEEE, 2008; pp. 1358–1361.
7. Goraj, M.; Epassa, Y.; Midence, R.; Meadows, D. Designing and deploying Ethernet networks for offshore
wind power applications - a case study. In 10th IET International Conference on Developments in Power System
Protection (DPSP 2010). Managing the Change; IET, 2010; p. 84.
8. Park, J. Y.; Kim, B. J.; Lee, J. K. Development of condition monitoring system with control functions for
wind turbines. Proc. World Acad. Sci. Eng. Technol. 2011, 81, 286–291.
9. Meng, Y.; Gong, W. Design of SCADA System Based on Wireless Communication for Offshore Wind Farm.
In; Qian, Z.; Cao, L.; Su, W.; Wang, T.; Yang, H., Eds.; Springer Berlin Heidelberg: Berlin, Heidelberg, 2012;
pp. 347–352.
10. Zheng, G.; Xu, H.; Wang, X.; Zou, J. Applications of WiMAX-based wireless mesh network in monitoring
wind farms. Int. J. Netw. Virtual Organ. 2010, 7, 535–548.
11. Song, Y.; Wang, B.; Li, B.; Zeng, Y.; Wang, L. Remotely monitoring offshore wind turbines via ZigBee
networks embedded with an advanced routing strategy. J. Renew. Sustain. Energy 2013, 5, 013110.
12. Hsu, C.-L.; Wu, W.-B. The practical design of constructing data transition interface with ZigBee WSN and
RS-485 wired interface: example with small-scaled wind-power electricity generator system. J. Softw. 2008,
3, 49–56.
13. Parikh, P.P.; Kanabar, M.G.; Sidhu, T.S. Opportunities and challenges of wireless communication
technologies for smart grid applications. In 10th IEEE Power and Energy Society General Meeting,
Minneapolis, USA, 2010; pp. 1–7.
14. Kang, K.-Y.; Ahmed, M.A.; Kim, Y.-C. Implementation of condition monitoring and control system for
small-scale wind turbines. In 40th Annual Conference of the IEEE Industrial Electronics Society (IECON
2014), Dallas, USA, 2014; pp. 2122–2127.
15. Nilsson, L. Lifetime monitoring of wind turbines. Master Thesis, Dept. of Automatic Control, Lund
University, Nov. 2005.
16. IEC 61400-25-6, International Standard, Wind Turbines—Part 25-6: Communications for Monitoring and
Control of Wind Power Plants—LN classes and data classes for condition monitoring, 2010.
17. Pettener, A. L. SCADA and communication network for large scale offshore wind power systems. IET
Conference on Renewable Power Generation, RPG, Edinburgh, pp. 1–6, 2011.
18. Wijetunge, S.; Gunawardana, U.; Liyanapathirana, R. Wireless sensor networks for structure health
monitoring: considerations for communication protocol design. In 17th International Conference on
Telecommunications (ICT 2010), Doha, Qatar, 2010; pp. 694–699.
19. HQ 5500 5.5 MW offshore wind turbine, New Horizons Spring Magazine, pp. 24–27, 2014.
20. IEC 61400-25-4, International Standard, Wind Turbines—Part 25-4: Communications for Monitoring and
Control of Wind Power Plants—Mapping to Communication Profile, 2008.
21. Johnsson, A.; Svensson, J. Wind Power Communication–Design and Implementation of Test Environment
for IEC61850/UCA2. Elforsk Rapport 02:16. Elforsk AB: Stockholm, Sweden, 2002.
22. ENERCON SCADA System, Product Description, pp. 1–25, 2006. Available online:
http://planning.allerdale.gov.uk/portal/servlets/AttachmentShowServlet?ImageName=170578.
23. Khan, R.H.; Khan, J.Y. A comprehensive review of the application characteristics and traffic requirements
of a smart grid communications network. Comput. Netw. 2013, 57, 825–845.