Please use this identifier to cite or link to this item:
https://idr.l4.nitk.ac.in/jspui/handle/123456789/11098
Title: | E-VAR: Enhanced void avoidance routing algorithm for underwater acoustic sensor networks |
Authors: | Nazareth, P.; Chandavarkar, B.R. |
Issue Date: | 2019 |
Citation: | IET Wireless Sensor Systems, 2019, Vol. 9, Issue 6, pp. 389-398 |
Abstract: | Underwater acoustic sensor networks (UASNs) have gained attention among researchers due to their wide range of aquatic applications. At the same time, UASNs face many research challenges arising from their inherent characteristics, such as high propagation delay, limited bandwidth, high bit-error rate, limited energy, and the communication void during routing. These limitations severely affect the performance of delay-sensitive and reliability-critical UASN applications. The primary objective of this study is to address the communication void during routing. Various methods, such as backward forwarding, passive participation, flooding, heuristics, and transmission power adjustment, have been proposed to handle the communication void. The major drawbacks of these methods are the inclusion of void nodes in routing paths, routing loops, data failing to reach the sink, and a larger number of duplicate packet transmissions. This study proposes a void avoidance routing algorithm, referred to as enhanced void avoidance routing (E-VAR), based on the idea of void awareness among the nodes. E-VAR prevents void nodes from participating in routing, thereby achieving better performance than the state of the art. Through MATLAB simulations, E-VAR is compared with interference-aware routing and state-of-the-art backward forwarding in terms of the number of nodes reachable to the sink, the number of nodes unreachable due to looping, average hop count, and distance. © The Institution of Engineering and Technology 2019. |
URI: | http://idr.nitk.ac.in/jspui/handle/123456789/11098 |
Appears in Collections: | 1. Journal Articles |
Files in This Item:
There are no files associated with this item.
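The central idea in the abstract is that nodes become aware of which neighbours are void, i.e. have no neighbour of their own closer to the sink, and exclude such neighbours when choosing a next hop, so void nodes never become part of a route. The following is a minimal Python sketch of that void-aware greedy selection, not the authors' MATLAB implementation; the `Node` class, `is_void`, `select_next_hop`, and the example topology are all illustrative assumptions.

```python
import math

# Minimal sketch of void-aware greedy forwarding in the spirit of E-VAR.
# Assumption: each node learns its neighbours' positions (e.g. via
# beaconing) and can therefore tell which neighbours are void.

class Node:
    def __init__(self, node_id, pos):
        self.id = node_id
        self.pos = pos            # (x, y) here; the paper's setting is 3-D
        self.neighbours = []      # nodes within acoustic range

def dist(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def is_void(node, sink_pos):
    # A node is void when none of its neighbours is closer to the sink
    # than the node itself, so greedy forwarding would stall there.
    d = dist(node.pos, sink_pos)
    return all(dist(n.pos, sink_pos) >= d for n in node.neighbours)

def select_next_hop(node, sink_pos):
    # Void awareness: consider only neighbours that make progress toward
    # the sink AND are not void themselves, so a void node is never
    # selected as a forwarder.
    d = dist(node.pos, sink_pos)
    candidates = [n for n in node.neighbours
                  if dist(n.pos, sink_pos) < d and not is_void(n, sink_pos)]
    if not candidates:
        return None               # no safe greedy forwarder from this node
    return min(candidates, key=lambda n: dist(n.pos, sink_pos))

# Hypothetical topology: v sits nearer the sink than a but is void, so
# plain greedy forwarding from s would pick v and stall; the void-aware
# rule steers the packet to a instead.
sink = (10.0, 0.0)
s = Node("s", (0.0, 0.0))
a = Node("a", (4.0, 0.0))
v = Node("v", (7.0, 3.0))
b = Node("b", (8.0, 0.0))
s.neighbours = [a, v]
a.neighbours = [s, v, b]
v.neighbours = [s, a]             # neither s nor a is closer to the sink
b.neighbours = [a]

print(select_next_hop(s, sink).id)   # -> "a"
```

In a real protocol the void status would be advertised in periodic beacons rather than recomputed from global knowledge as in this toy example.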