With the development of computer technologies, industrial automation has significantly reduced costs. Cloud computing in industrial networks provides storage and computation services by managing a pool of high-cost resources. However, when applications demand low end-to-end delay and fast response, cloud computing suffers from the long delay between cloud data centers and end users while processing large amounts of data.
To bring applications and services closer to end users, academics and industry experts now advocate moving from large, centralized cloud computing infrastructures toward computing nodes located at the edge of the network. Fog computing extends the services and resources of the cloud closer to users, which facilitates leveraging the services and resources available in edge networks.
With features such as low latency, location awareness, and the capacity to serve a large number of wirelessly connected nodes, fog computing supports heterogeneity and real-time applications, making it an attractive solution for delay- and resource-constrained industrial networks and intelligent systems. However, alongside these benefits, fog computing raises several research challenges.
For instance, how should the fog layer handle different protocols and data formats from highly dissimilar data sources? When designing computation offloading strategies, how should one determine which data should be processed in the cloud and which in the fog layer? Moreover, privacy and security mechanisms within fog servers are also key design considerations. Developing sustainable infrastructures and protocols remains a major research challenge for fog computing.
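As a simple illustration of the offloading question above, the following sketch shows a naive decision rule that keeps delay-sensitive, lightweight tasks in the fog layer and pushes heavier, delay-tolerant tasks to the cloud. The task attributes, delay values, and thresholds are illustrative assumptions, not a method proposed here or tied to any specific fog platform.

```python
from dataclasses import dataclass

# Hypothetical task description; field names and thresholds are
# illustrative assumptions only.
@dataclass
class Task:
    payload_mb: float    # size of the data to process
    max_delay_ms: float  # end-to-end delay the application tolerates
    cpu_cycles: float    # estimated computation demand

FOG_DELAY_MS = 10.0        # assumed round-trip delay to a nearby fog node
CLOUD_DELAY_MS = 100.0     # assumed round-trip delay to the cloud data center
FOG_CAPACITY_CYCLES = 1e9  # assumed per-task compute budget at the fog node

def offload_target(task: Task) -> str:
    """Naive offloading rule: a task stays in the fog if the cloud's extra
    delay would violate its deadline and the fog node can afford its
    computation; otherwise it is sent to the cloud."""
    if task.max_delay_ms < CLOUD_DELAY_MS and task.cpu_cycles <= FOG_CAPACITY_CYCLES:
        return "fog"
    return "cloud"

# Example: a sensor-analytics task with a 50 ms deadline stays in the fog.
print(offload_target(Task(payload_mb=2.0, max_delay_ms=50.0, cpu_cycles=5e8)))
```

Real offloading strategies would additionally weigh network bandwidth, energy consumption, and fog node load, which is precisely what makes the design question nontrivial.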