A Review on Fog Computing, Resource Allocation Strategies and its Applications

S. Harika, Dr. B. Chaitanya Krishna

Abstract:

Fog computing is a paradigm that serves user requests at the network edge. As the name suggests, the fog computing platform lies between the cloud servers and the users. In a fog-enabled environment, the devices at the fog layer are typically networking equipment such as routers, gateways, bridges, and hubs. Researchers envision these devices performing computational and networking operations simultaneously. Although they are resource-constrained compared to cloud servers, their geographic spread and decentralized nature help them deliver reliable services over a wide area. Fog computing also allows manufacturers and service providers to offer their services at affordable rates. Another advantage is the physical location of the devices, which are much closer to the users than cloud servers; this placement reduces operational latency significantly. In particular, fog computing refers to a distributed computing infrastructure confined to a limited geographic area, within which certain Internet of Things (IoT) applications and services run directly at the network edge on smart devices, called fog nodes, that have computing, storage, and network connectivity. The goal is to improve efficiency and reduce the amount of data that must be sent to the cloud for large-scale processing, analysis, and storage. This paper proposes an efficient resource allocation strategy and also describes the applications of fog computing.
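The full paper details the proposed allocation strategy; as a rough, hypothetical illustration of the kind of latency-driven placement decision the abstract alludes to, the Python sketch below compares a nearby fog node against a distant cloud data center using a simple response-time estimate. All class names, parameters, and figures are assumptions made for this sketch, not the method from the paper.

```python
# Hypothetical sketch of latency-aware fog/cloud task placement.
# All classes, parameters, and numbers are illustrative assumptions,
# not the allocation strategy proposed in the paper.
from dataclasses import dataclass


@dataclass
class Node:
    name: str
    cpu_capacity: float        # available compute (arbitrary work units per second)
    network_latency_ms: float  # round-trip latency from the user to this node


@dataclass
class Task:
    cpu_demand: float          # work units required by the task
    input_size_kb: float       # data that must be shipped to the chosen node


def estimated_response_ms(task: Task, node: Node, bandwidth_kbps: float = 1000.0) -> float:
    """Rough response time: network latency + data transfer + execution time."""
    transfer_ms = task.input_size_kb / bandwidth_kbps * 1000.0
    execution_ms = task.cpu_demand / node.cpu_capacity * 1000.0
    return node.network_latency_ms + transfer_ms + execution_ms


def place_task(task: Task, fog_nodes: list[Node], cloud: Node) -> Node:
    """Choose the candidate node with the lowest estimated response time."""
    candidates = [n for n in fog_nodes if n.cpu_capacity >= task.cpu_demand] + [cloud]
    return min(candidates, key=lambda n: estimated_response_ms(task, n))


if __name__ == "__main__":
    fog = [Node("fog-gateway", cpu_capacity=300.0, network_latency_ms=5.0)]
    cloud = Node("cloud-dc", cpu_capacity=1000.0, network_latency_ms=80.0)
    task = Task(cpu_demand=20.0, input_size_kb=200.0)
    print("Task placed on:", place_task(task, fog, cloud).name)
```

With these illustrative numbers the fog gateway is chosen despite its smaller capacity, because the roughly 75 ms saved on the network round trip outweighs its slower execution, which mirrors the latency argument made in the abstract.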

Keywords:

Fog Computing; Resource Allocation Strategies; Applications

Paper Details
Month: 5 (May)
Year: 2020
Volume: 24
Issue: 8
Pages: 9862-9871
